England and Wales are poised to criminalize the creation of sexually explicit deepfake content, amid mounting concern that artificial intelligence is being used to exploit and harass women.
The UK Ministry of Justice recently announced a draft law that would make it illegal to create an explicit deepfake image or video of another adult without their consent, an offense punishable by a criminal record and an unlimited fine.
Deepfakes are manipulated images or videos, often generated with AI, that can falsely depict a person doing or saying things they never did.
The proposed law in England and Wales would cover both pornographic images and nude deepfakes, whether or not the subject is depicted engaging in sexual activity.
The devolved governments of Scotland and Northern Ireland have yet to confirm whether they plan to introduce similar rules.
The new offense is expected to be added to the Criminal Justice Bill currently before Parliament, following last year's amendment to the Online Safety Act, which criminalized the sharing of deepfake sexual content in England and Wales.
Deepfake abuse has frequently involved superimposing women's faces onto explicit images without their consent; high-profile figures including Taylor Swift have been targeted in this way.
In the United States, lawmakers have introduced a draft civil law that would allow victims of sexually explicit deepfakes to sue those who create and share such content without consent.
In the European Union, a proposed directive would criminalize the production of sexually explicit deepfakes; if adopted, it would require corresponding national laws across the bloc's 27 member states.
UK Minister for Victims and Safeguarding, Laura Farris, emphasized that deepfakes are a means through which certain individuals seek to degrade and dehumanize others, particularly women. She stated that the new offense sends a clear message that creating such material is not only immoral but also misogynistic and criminal.
Meta, the parent company of Facebook and Instagram, has pledged to reassess its approach to handling deepfake pornography following the circulation of explicit AI-generated images of female public figures on its platforms.
Helle Thorning-Schmidt, co-chair of the Meta Oversight Board, said deepfake pornography is increasingly used as a tool of gender-based harassment, deployed to target, silence, and intimidate women both online and offline.