by Katelyn Melville

Artificial intelligence (AI) is becoming too advanced, but not for the reason you might think. As the use of AI to complete and assist with everyday tasks has risen sharply, so has a widespread public fear that AI will replace humans, revolt against us, and so on. While those worries are understandable and valid, many people overlook a far more immediate consequence of AI's normalization: deepfake AI pornography.

According to techtarget.com, "Deepfake AI is a type of artificial intelligence used to create convincing images, audio and video hoaxes." The technology takes an image of someone's face or body and superimposes it onto a video, often to make it appear that the person did something they never did. In this case, deepfake AI is being used to take images of people's faces, mostly women's, and place them onto pornographic videos, making it seem as though they were the ones performing those acts.

Recently, these videos have become increasingly common and widespread, affecting countless people, including global superstar Taylor Swift. In late January of this year, deepfake videos originating on 4chan, a social media site known for its offensive material, were mass-posted to X (formerly known as Twitter), portraying her in a multitude of explicit contexts.

Still, because of her popularity, her fans advocated for her relentlessly amid the social media mockery and disrespect, creating the hashtag #protecttaylorswift and flooding X and other platforms with positive comments and support.

But what happens when a woman who is less popular than Taylor Swift gets deepfaked? Or a teenager, perhaps? What are they meant to do if someone decides to create these images of them and distribute them?

Per a 2023 study by Home Security Heroes, deepfake pornography makes up 98% of all deepfake videos online, and 99% of the individuals targeted are women. These videos are alarmingly fast and cheap to produce: a 60-second pornographic deepfake can be created in under 25 minutes, at no cost, from a single clear image of a face.

Having pornographic material of you spread can be incredibly emotionally damaging, as well as embarrassing and shameful. No one wants their reputation overshadowed by fabricated explicit content made with the sole intention of humiliating them.

These are some of the most terrifying possibilities to a young woman like me, who is squarely in the target demographic, especially because it is, for the most part, legal. As WBUR reports, "No federal law criminalizes the creation or sharing of non-consensual deepfake porn in the United States. A lawsuit is also unlikely to stand up in civil court."

Many argue that because the images are fake, they are lighthearted and won't be taken seriously, but that is untrue. In 2020, Elle Magazine published a profile of Australian woman Noelle Martin, who was "emailed graphic videos of herself performing sex acts" that weren't real. The detail in the videos was astonishing; she "watched as [her] eyes connected with the camera, as [her] own mouth moved. It was convincing, even to [her]." She knew it was fake, but with a video that looked so real, who would believe otherwise?

She tried tirelessly for years to get the videos removed, but her efforts were met only with demeaning comments. She was told, "If you don't want it, don't post any images of yourself." Martin has spent her adult life helplessly watching these falsified images be used against her.

As she tried to get the videos removed, she was blackmailed and exploited, only for the videos to reappear as soon as they were taken down.

Now imagine this same traumatic experience happening to someone under the age of 18, who is in an even more powerless position because of their age. Imagine walking the halls of your high school knowing that every person in the building has access to explicit material of you that isn't even real. It would be undeniably mentally taxing and demeaning, and it can lead to bullying, particularly cyberbullying. The National Institutes of Health reports that cyberbullying is closely linked with suicide in adolescents.

As of 2020, Australia has fully criminalized the distribution of nonconsensual deepfake pornography, and according to Elle Magazine, the country has "a government agency dedicated to helping victims remove material and fining social media companies who fail to remove internet images from their platform within 48 hours." Similar legislation needs to be enacted in the United States to prevent this from affecting even more women.

Without legislation, this will keep happening and keep haunting women everywhere, especially because the internet never forgets. Some states, such as Texas, Hawaii, Virginia, and Georgia, have begun criminalizing the distribution of nonconsensual deepfake pornography, but until it is addressed federally, the material will continue to spread and harm women. Sensity reports that the number of deepfakes online is growing exponentially, and there is no sign that it will slow.

Passing federal legislation would show the world that our government truly cares about protecting women and their reputations, and that it is willing to fight for justice for everyone affected.

Let this be a reminder to students and the public in general that not every image on social media is real, and that regardless of its authenticity, such material should never be distributed. The consequences are irreversible.
