New bipartisan legislation would require the identification and labeling of artificial intelligence-generated videos and audio on online platforms.
A bill introduced in the House on Thursday would require the identification and labeling of online images, videos, and audio generated with artificial intelligence, the latest attempt to rein in a rapidly advancing technology that can deceive and mislead when misused.
Artificial intelligence has advanced to the point where it can produce convincing deepfakes that are nearly indistinguishable from reality. These deepfakes have been used to replicate the voice of President Joe Biden, impersonate famous individuals, and mimic world leaders. This has raised concerns about the potential for increased misinformation, sexual exploitation, consumer scams, and a loss of trust in society.
The legislation would require AI developers to use digital watermarks or metadata to identify content generated with their products, much as photo metadata records a picture’s location, time, and camera settings. Online platforms such as TikTok, YouTube, and Facebook would then use those labels to inform their users. The Federal Trade Commission, drawing on guidance from the National Institute of Standards and Technology, an agency within the U.S. Department of Commerce, would work out the final details of the rules.
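The bill does not spell out a technical format for such labels. As a rough illustration only, the minimal sketch below uses Python and the Pillow imaging library, with hypothetical field names (`ai_generated`, `generator`) and file names, to show how provenance information could be attached to a generated image as embedded metadata:

```python
# A minimal sketch, not the bill's actual specification: embed provenance
# metadata in a PNG using Pillow. Field names and file names are hypothetical.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

img = Image.open("generated.png")  # hypothetical AI-generated image

meta = PngInfo()
meta.add_text("ai_generated", "true")            # hypothetical disclosure flag
meta.add_text("generator", "example-model-v1")   # hypothetical tool identifier

# Save a copy that carries the metadata. A platform could later read these
# text chunks back (Image.open("generated_labeled.png").text) to decide
# whether to show a label to users.
img.save("generated_labeled.png", pnginfo=meta)
```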
Violators of the proposed rules would be subject to civil lawsuits.
Rep. Anna Eshoo, a Democrat who represents part of California’s Silicon Valley, said there have already been numerous cases of voice cloning and video deepfakes, and that Americans deserve to be able to tell real content from fake. Eshoo, who is sponsoring the bill with Republican Rep. Neal Dunn of Florida, called the problem obvious and said it should be tackled as soon as possible.
If approved, the legislation would complement the voluntary commitments made by technology companies and an executive order on artificial intelligence signed by President Biden last fall. That order directed the National Institute of Standards and Technology (NIST) and other government agencies to set standards for AI products and required AI developers to submit information about their products’ potential risks.
Eshoo’s bill is one of a handful of proposals aimed at the potential dangers of artificial intelligence, a concern shared by lawmakers in both parties. Some favor rules that would protect individuals while still allowing AI to advance in industries such as healthcare and education.
Lawmakers will now take up the bill, but they are unlikely to pass substantial AI regulation before the 2024 election.
Announcing the legislation, Dunn said that while the surge of innovation in artificial intelligence is exciting, it carries the potential for substantial harm if misused. Requiring the identification of deepfakes, he said, is a simple safeguard that would protect consumers, children, and national security.
Several groups pushing for stronger AI regulation praised the bill introduced Thursday as a step in the right direction, as did some AI developers, including Margaret Mitchell, lead AI ethics scientist at Hugging Face, whose team created a ChatGPT competitor called Bloom. Mitchell said the bill’s emphasis on watermarking AI content would give the public a say in how generated content affects society.
She said it is becoming increasingly unclear which content is created by AI systems, and increasingly difficult to trace where AI-generated content comes from.
Source: wral.com