Explicit deepfakes of Taylor Swift are currently circulating on the internet, rightfully sparking outrage among her fans. The photos were taken down on X, formerly known as Twitter, but before that, they were viewed upward of 45 million times. Luckily, the situation garnered the attention of Congress, and steps are being taken to criminalize the use of AI to generate nonconsensual sexual images.
The images are violating, insulting and deeply disturbing, and it is good that people are angry and calling for justice for Swift. Hopefully, a law will pass. However, the making of those images is rooted in something deeper that no law can prevent: the troubling lengths a misogynistic society will go to in order to degrade and humiliate women.
Sexually violating images such as the ones currently circulating are not actually about sex; they are about power, and AI is just the newest weapon misogynists are wielding to exert it. It comes as no surprise that the latest victim is Swift, who is undeniably the most successful woman in the world right now, after being named Time's Person of the Year in 2023 and headlining the highest-grossing concert tour of all time.
Swift’s power cannot be touched, so what other option do misogynists have but to digitally manipulate her likeness into nonconsensual porn? To remind her that even though she is on top of the world, she is still a woman, and her body still does not belong to her? To remind her that no matter how successful she is, she can always be brought down?
It is also important to note that Swift is not the first woman to experience sexual violence at the hands of misogynists using AI, though she is the most well-known. The issue has been ongoing for years. In 2021, a 14-year-old girl took her own life after boys in her class shared deepfakes of her in a group chat. Just last November, boys at a New Jersey high school spread AI-generated images of their female classmates.
Deepfakes are victimizing women universally, and while they are a threat to all women, it is important to recognize the greater threat they pose to women who are not celebrities: women who don't have a 100-person team and the entire world to defend them.
A study found that 96% of deepfakes are nonconsensual sexual images and videos, and among the top four deepfake websites, 100% of the targets are women and girls. These numbers are extremely concerning, and lawmakers should have started caring long before the most famous woman in the world was affected.
Ultimately, passing laws to control the usage of AI and prevent nonconsensual sexual deepfakes from being made is a good idea; it can’t hurt and it is long overdue. That being said, laws will never be able to prevent the desire to make those images in the first place, and misogynists will no doubt find a new weapon of choice against women once the option of AI is gone.
Brianna Tassiello is a junior studying journalism at Ohio University. Please note that the opinions expressed in this article do not represent those of The Post. Want to talk to Brianna? Email her at bt977520@ohio.edu.