The increased use of generative AI has prompted concern across the United States, leading states to hurriedly craft laws that either bar the use of these tools outright or require disclosures in political advertisements. Because these laws are so new, some practitioners may be unaware they even exist. To keep your clients from running afoul of AI statutes, here’s what you need to know about the use of AI in political advertising in 2024:
Utah’s statute defines generative AI as “technology that is capable of creating content such as text, audio, image, or video based on patterns learned from large volumes of data rather than being explicitly programmed with rules.” In other words, generative AI allows users to create copies and impersonations of faces and voices that can be nearly impossible to distinguish from their real-life counterparts.
When generative AI is used to create audio or visual content, many states use the term “synthetic media” to describe the generated content. Other states use the term “deep fake” when referring to a video made using generative AI.
States with statutes currently addressing the use of generative AI in elections include Michigan, Utah, Wisconsin, Texas, Idaho, New York, Arizona, Oregon, and New Mexico. Other states, including New Hampshire and Massachusetts, are working on enacting similar statutes. Florida’s legislation addressing generative AI is effective as of July 1, 2024.
A common approach to addressing the use of generative AI in political advertisements is the inclusion of disclosures. The majority of states that have enacted statutes addressing the use of generative AI require only an audio or text disclosure stating that the advertisement includes content created using generative AI. In Utah, for example, the required disclosure depends on the type of content generated. If the advertisement or communication includes only synthetic visual media, a disclosure stating, “This video content generated by AI” is required. For advertisements or communications including only synthetic audio media, the statement “This audio content generated by AI” is required, and so on.
Florida has taken a similar approach to the use of generative AI in elections. The Florida statute requires disclosures on political advertisements that depict a real person performing an action that did not actually occur, created with the intent to injure a candidate or deceive regarding a ballot issue. The political advertisement must state, “Created in whole or in part with the use of generative artificial intelligence (AI).” Any person who pays for, sponsors, or approves a political advertisement that requires a disclosure but fails to include one commits a first-degree misdemeanor.
While the disclosure approach appears to be the most prevalent, some states have taken a different route. Texas, for example, imposes criminal penalties when a person publishes a “deep fake” video within 30 days of an election “with intent to injure a candidate or influence the result of an election.” The constitutionality of these statutes, however, remains an open question. One Texas court has already held the statute facially unconstitutional because it was not narrowly tailored to a compelling state interest.
These statutes also raise questions that courts have yet to answer. The Arizona statute, for example, prohibits the impersonation of a candidate or other person appearing on a ballot. It does not, however, prohibit other impersonations, such as an impersonation of a news anchor discussing a candidate or representative. Further, because most of these statutes require only a disclosure, additional questions arise:
Will disclosures effectively prevent the potential harm of these political advertisements? What if the recipient of a robocall hangs up before the disclosure plays? What if viewers simply do not see the disclosure?
The Florida primary election will be held on Aug. 20, and the general election on Nov. 5. The effectiveness of these statutes will become clearer in the weeks and months leading up to these elections.
Ken Tinkler is a government law specialist and shareholder with Carlton Fields in Tampa, Florida, who focuses on resolving disputes with and among government agencies involving land use regulation, environmental permitting, ethics regulation, and election law.
Caleb Spano is a 2024 summer associate with Carlton Fields in Tampa, Florida, and a 2025 J.D. candidate at the Stetson University College of Law, where he is the Local Government Editor for the Stetson Law Review.