Deepfakes are about to get a lot worse
Search results for deepfake porn.
The dangers of accessible AI grow beyond ChatGPT’s ability to do homework. There are real, harmful privacy and consent issues arising. (Tomoki Chien | Daily Trojan)

As AI proliferates among the general public, its use for deepfake pornography is rising at a horrifying rate. Ethical concerns about academic writing, article writing and more are commonplace in the AI discussion, but there are particularly pressing ethical issues where images are concerned. AI has the power to create images of people out of thin air using preexisting images and videos, colloquially referred to as deepfakes.

Deepfakes and AI-generated images are not always picture perfect. AI often adds a finger or two too many, and a speaker’s audio and video may not line up, making the visuals seem robotic. Nonetheless, AI audio can often be quite realistic and is even used for lighthearted purposes, as seen recently in the TikTok trend where users make videos of President Joe Biden and former Presidents Barack Obama and Donald Trump playing video games together. These are generally discernible to the public, but in the realm of pornography, AI raises serious issues of consent.

Deepfakes and deepfake pornography have been an issue for a while, with their inception in 2017, when a Reddit user doctored pornographic content of celebrities including Gal Gadot, Taylor Swift and Scarlett Johansson. In a 2019 report by cybersecurity company Deeptrace, researchers found that 96% of deepfake content is pornographic. However, accessible AI gives it new power.

Deepfake porn apps saw an apparent rise in usage at the beginning of the year following the exposé of a Twitch streamer named Brandon Ewing, who accidentally revealed he had made AI porn of his female friends, many of them fellow Twitch streamers. While major companies such as Reddit, Pornhub and Discord have banned deepfake porn and the apps that produce it, many smaller websites still run rampant.

Only California and Virginia have laws explicitly protecting women who find deepfake pornography of themselves, leaving most women legally unable to demand the removal of the content or, for that matter, press charges. Celebrities and public figures are far from the only people affected. The SXSW documentary “Another Body” shows that it can happen to anyone, even an ordinary college student. Finding deepfake pornography of oneself is increasingly becoming a reality.

Autonomy over one’s own images, or the lack thereof, is far from a novel battle in the age of social media. However, in a world where sex is, unfortunately, too often used as a tool of power, the ability to manipulate images into pornographic ones has the potential to destroy the lives, careers and mental wellbeing of women everywhere.

Revenge porn has long wreaked havoc. While the Child Pornography Prevention Act of 1996 restricted the use of computer-generated images of minors engaging in sexually explicit conduct, there were no laws barring the nonconsensual distribution of lewd images of those who are of legal age.

In recent years, federal legislation has been passed to criminalize revenge porn. Now, though, the ability to create AI deepfake porn poses a whole new ethical dilemma and a whole new legal fight. Just as we begin to nail the coffin shut on revenge porn, AI porn opens yet another uphill battle women must fight to retain autonomy over our images, or at least what little of it is now left.