Delhi HC asks Centre to file report on issue of deepfakes


New Delhi: The Delhi High Court on Thursday directed the Centre to file a status report on the measures the government has taken to counter the growing menace of deepfake technology.


Chief Justice Manmohan and Justice Tushar Rao Gedela further asked for the report to highlight measures taken at the government level and whether there would be a high-powered committee to suggest solutions.

The court, while hearing two petitions against the non-regulation of deepfake technology, called it a “very serious issue” that needed to be dealt with by the authorities on a “priority” basis.

“What are you doing? Every day deepfakes are on the increase… I am glad that industry people have started taking some initiatives and are spreading awareness about it among consumers,” said the bench.

Deepfake technology allows the creation of realistic videos, audio recordings and images that can be manipulated to mislead viewers by superimposing the likeness of one person onto another and altering their words and actions, thereby presenting a false narrative or spreading misinformation.

The court observed that artificial intelligence could not be prohibited outright, as people needed it.

“We have to remove the negative part of the technology and keep the positive part,” it added.

The high court further underlined the rising number of hoax bomb threats against flights, asking whether the government had set up an expert committee and, if so, who its members were.

“We want clean answers. It has to be a serious committee,” the bench said.

The additional solicitor general, representing the Centre, said the Union Ministry of Electronics and Information Technology was looking into the issue.

The counsel appearing for one of the petitioners submitted that several countries had enacted legislation on the issue and that India was far behind.

Most deepfakes, he said, were related to women, including nudity, and the authorities were unable to resolve the issue.

The government, it was submitted, had issued only general advisories to intermediaries, which had been of no use.

The court granted three weeks to the Centre to file the status report and posted the hearing on November 21.

The high court had previously noted that the antidote to AI was technology.

In its response over websites granting access to deepfakes and blocking such platforms, the MeitY had said it was not empowered to monitor any online content on the internet on a suo motu basis.

“Any content/URL/websites on the internet can only be blocked as per the established legal procedure,” it had said.

Journalist Rajat Sharma filed one of the pleas, seeking directions to block public access to applications and software that enable the creation of such content. He said deepfake technology posed a significant threat to various aspects of society, including through disinformation campaigns, and undermined the integrity of public discourse and the democratic process.

The other petition has been filed by advocate Chaitanya Rohilla, against deepfakes and the unregulated use of artificial intelligence.

The ministry said in its reply that it had taken various steps to address the proliferation of harmful applications and illegal content.

To ensure an open, safe, trusted and accountable digital ecosystem, the Digital Personal Data Protection Act, 2023 has been notified, it said.

Sharma’s PIL said the Centre had stated its intent to formulate regulations dealing with deepfakes and synthetic content in November 2023, but these were yet to see the light of day.

The PIL, therefore, sought a direction to the Centre to identify and block public access to the applications, software, platforms and websites enabling the creation of deepfakes.

The plea also sought a direction to the government requiring all social media intermediaries to remove deepfakes immediately upon receiving a complaint.

This article was generated from an automated news agency feed without modifications to text.
