Sun. Dec 22nd, 2024


The Centre has notified a fact-checking unit to flag ‘misinformation’ about the government, with powers to issue takedown directives. The Supreme Court has stayed the notification until the unit’s constitutionality is decided. Mint explains why such a unit could prove deeply contentious.

What did the notification imply?

The fact-checking unit (FCU) was notified under the Centre’s Press Information Bureau (PIB), giving the government body the power to flag any information about the government and its affiliates that it believes to be fake or misleading. The FCU’s powers were established in the 2023 amendment to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, which said that if a central fact-checker flags any information as fake, social media platforms must remove it within 36 hours. This raised concerns about free speech and regulatory overreach.

What happened in court?

A number of press-freedom advocates and activists challenged the amended rules in the Bombay High Court in 2023. On 13 March, the high court declined to stay the FCU’s notification. On 21 March, however, the Supreme Court stayed the notification until the high court delivers its final verdict in the case. The Editors Guild of India, in a statement, said the rule gave the government “absolute power to determine what is fake or not, in respect of its own work, and order take down (of content).” No comparable government-run FCU exists in the democratic world; Europe’s Fact-Checking Standards Network, for instance, is a group of independent third-party verifiers.

Which fact-checking measures exist as of now?

Most fact-checkers currently operate privately or with specific social media platforms. X uses crowd-sourced flagging of fake content under ‘Community Notes’. Meta, which owns Facebook, Instagram and WhatsApp, has third-party fact-checking moderators. Popular publications also operate independent fact-checking bodies.

Can a govt fact-checker be objective?

Experts say governments could be naturally inclined to use such a body to take down content that does not suit their agenda. Lawyers add that the legal backing of the IT Rules could pressure social media platforms into complying with takedown orders, which in turn could clash with the platforms’ own content policies and with the fundamental idea of free speech. A Centre-affiliated FCU may also lack independent checks and balances, leading to little transparency about how or why a news item is flagged as inaccurate in the first place.

Does AI have a role to play in all this?

Yes. AI-generated content and deepfake images and videos threaten to make it far harder to tell what is real and what is not. This could create significant problems during elections, when nearly half of the world’s population, including in India and the US, is expected to vote for their next governments. Some experts believe allowing the government to operate such an FCU in an election year could be detrimental to how foreign investors view freedom of expression in India, a major developing economy.
