Michigan is set to become part of a state-led initiative to regulate political ads involving artificial intelligence, while federal legislation on the matter remains pending.

Michigan will join a state-level movement to address misleading practices involving artificial intelligence and manipulated media, even as Congress and the Federal Election Commission weigh broader regulations ahead of the 2024 elections.

Legislation expected to be signed by Democratic Governor Gretchen Whitmer will require state and federal campaigns in Michigan to clearly disclose when their political ads are produced with artificial intelligence. It will also prohibit the use of AI-generated deepfakes within 90 days of an election unless they carry a label identifying them as manipulated.

Deepfakes are fabricated media that falsely portray an individual as engaging in or stating something they did not actually do or say. These are produced through the use of generative artificial intelligence, a form of AI that can rapidly generate realistic images, videos, or audio recordings.

There is growing worry that generative artificial intelligence could potentially be utilized in the 2024 presidential election to deceive voters, imitate candidates, and disrupt the electoral process at an unprecedented rate.

Candidates and committees in the race already are experimenting with the rapidly advancing technology, which in recent years has become cheaper, faster and easier for the public to use.

In April, the Republican National Committee released an ad created entirely with AI, depicting a possible future for the United States under a second term for President Joe Biden. The ad featured realistic-looking images of boarded-up storefronts, a military presence in the streets, and surging immigration sowing fear. Fine print disclosed that the ad was generated using AI.

In July, Never Back Down, a political action committee in support of Republican Florida Governor Ron DeSantis, utilized an artificial intelligence voice cloning program to mimic the voice of former President Donald Trump. This created the illusion that Trump himself narrated a post on social media, even though he had never actually spoken the words.

According to experts, these are only early glimpses of what could happen if political campaigns or outside actors decide to deploy AI deepfakes for more malicious purposes.

To date, several states, such as California, Minnesota, Texas, and Washington, have enacted laws that control the use of deepfakes in political ads. The nonprofit organization Public Citizen reports that comparable laws have also been proposed in Illinois, New Jersey, and New York.

According to the state House Fiscal Agency, Michigan's legislation would require any person, group, or organization distributing an advertisement for a candidate to state clearly whether it was created with generative AI. The disclosure must appear in the same font size as the majority of the text in printed ads, and in televised ads it must be visible for "at least four seconds in letters that are as large as the majority of any text."

Within 90 days of an election, any use of a deepfake would require an additional disclaimer informing the audience that the material has been manipulated to depict speech or conduct that did not actually occur. For video, the disclaimer must be clearly visible and remain on screen for the entire duration.

Campaigns that violate the proposed laws could be charged with a misdemeanor punishable by up to 93 days in jail, a $1,000 fine, or both. The attorney general or an affected candidate could also petition the appropriate circuit court for relief against the deceptive media.

Members of Congress from both parties have emphasized the significance of creating laws regarding deepfakes in political campaigns and have convened to deliberate on the matter. However, no legislation has been passed by Congress at this time.

A new Senate bill, co-sponsored by Democratic Senator Amy Klobuchar of Minnesota and Republican Senator Josh Hawley of Missouri, would ban "materially deceptive" deepfakes relating to federal candidates, with exemptions for parody and satire.

In early November, Jocelyn Benson, the Secretary of State for Michigan, traveled to Washington, D.C. to engage in a nonpartisan conversation about AI and elections. She urged senators to approve the federal Deceptive AI Act proposed by Klobuchar and Hawley. Additionally, Benson encouraged senators to advocate for comparable laws in their respective states.

In an interview, Benson said federal law is limited in its ability to regulate AI at the state and local levels, and that states need federal funding to tackle the challenges AI presents.

Benson suggested the federal government fund an AI specialist in every state to counter deepfakes and teach voters how to identify and respond to them, saying such support would greatly ease the burden because states cannot feasibly handle the problem on their own.

In August, the Federal Election Commission initiated a process that could lead to the regulation of AI-generated deepfakes in political advertisements, using its current rules on “fraudulent misrepresentation.” While the commission did allow for public feedback on the petition, presented by Public Citizen, it has not yet made a decision.

Social media companies have also announced rules intended to curb the spread of harmful deepfakes. Meta, the parent company of Facebook and Instagram, said earlier this month that political advertisements on its platforms must disclose whether they were created using artificial intelligence. In September, Google introduced a similar policy requiring labels on AI-generated political ads that run on YouTube and other Google platforms.


The article has been revised to exclude Kentucky from the list of states where similar laws have been proposed. The legislation in Kentucky is currently in the form of a bill request and has not yet been formally introduced.


Swenson reported from New York. Associated Press writer Christina A. Cassidy contributed from Washington.


The Associated Press receives support from several private foundations to enhance its explanatory coverage of elections and democracy. The AP is solely responsible for all content.

Source: wral.com