The U.S. Senate has proposed the DEFIANCE Act in response to the growing proliferation of AI-generated, non-consensual explicit images, exemplified by the recent deepfake incident involving Taylor Swift. The bill aims to give victims a federal legal remedy against those who produce and distribute such content.
The U.S. Senate is currently considering the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, commonly known as the DEFIANCE Act. This bipartisan bill was introduced in response to growing concerns about non-consensual, sexually explicit ‘deepfake’ images and videos, particularly those created using artificial intelligence (AI). The bill’s introduction was largely driven by recent incidents involving AI-generated explicit images of singer Taylor Swift, which spread rapidly across social media platforms.
The DEFIANCE Act aims to provide federal civil relief to victims who are identifiable in such ‘digital forgeries.’ The term is defined in the bill as a visual depiction created or altered to falsely appear authentic through software, machine learning, AI, or other computer-generated means. The bill would let victims take legal action against those who produce, distribute, or receive non-consensual AI-generated explicit content depicting them. It would also establish a 10-year statute of limitations, running from the time a person depicted in such content becomes aware of the image or turns 18.
The need for such laws was underscored by a 2019 study that found that 96% of deepfake videos online were non-consensual pornography, overwhelmingly used to exploit and harass women, especially public figures such as politicians and celebrities. The widespread distribution of these deepfakes can have serious consequences for victims, including job loss, depression, and anxiety.
Currently, no federal law in the United States specifically addresses non-consensual deepfake pornography modeled after real people. However, some states, such as Texas and California, have enacted their own laws: Texas makes creating such content a criminal offense punishable by jail time, while California allows victims to sue for damages.
The bill’s introduction comes as online sexual exploitation, especially of minors, is receiving heightened attention. The Senate Judiciary Committee has been examining the role of social media platforms in the spread of such content, and the need for legislative action, in a hearing titled ‘Big Tech and the Online Child Sexual Exploitation Crisis.’
This legislative initiative highlights growing concerns about the misuse of AI technology in the creation of deepfake content and the need for a legal framework to protect individuals from such exploitation and harassment.