Deepfake Danger

Photo from: Fili Santillán

By: Freda Donnelly @shesbasedbabe

We live in a deeply pornified society that is harming both genders, and it’s only getting worse. Studies have found a correlation between prolonged exposure to explicit sexual content and adverse psychological outcomes: regular consumers may grapple with heightened body dissatisfaction, diminished self-esteem, and increased anxiety. The choreographed intimacy depicted in pornography also distorts expectations within genuine relationships. In simple terms, pornography rots your brain and soul. And that’s just when one is a viewer.

Now imagine that you’re the subject of such materials when you’ve never consented to taking part in them. Perhaps I can give you a glimpse of what that’s like.
I had a stalker in middle school and high school. He’d show up everywhere I was except for church. To take just one of dozens of examples, he turned up at every single doctor’s appointment I was dismissed from school to attend. That, in and of itself, was highly disconcerting. But he didn’t stop there. We didn’t have AI—at least at the level it exists now—back in the day, so he had to settle for drawing nude pictures of me instead and passing them around to his friends. That’s how these things shook out back in the ’00s. This new generation has gone high-tech when it comes to lewd behavior against women, now deemed AI sexual violence, and it’s causing headaches and, worse, costing lives.

The Cost is Too High

This isn’t a matter of protected speech. As Michael Knowles put it, “It is hurting people because people are more than their bodies. And even if it weren’t, it’s an offense against art, nature, and God. It’s just intrinsically wrong and we should ban it and seriously prosecute people.” These outrages are happening to both famous and unknown women alike, and it’s devastating. Unfortunately, we don’t yet have the legal infrastructure in place to crack down on it.

A particularly heartbreaking example of this deficiency comes out of North London, where a lovely young girl named Mia Janin took her life after a group of boys reportedly bullied her and created fake pornographic images of her and other girls. The Jewish Free School (JFS) in North London, which Mia attended, claims to have known nothing about the 60 boys from JFS and potentially surrounding schools operating a group chat in which they shared the social media accounts of various girls, made fun of them, and put their faces on porn stars’ bodies. One student claims that the group would also pressure girls to send nudes and share them with the rest of the boys. These boys would further degrade the girls in class and on buses, where they would kick footballs at them while filming and calling them names. One of the names that they gave Janin’s group in the months leading up to her death was the “suicide squad.” This behavior is cruel and disgusting. The school’s claim that it was ignorant of such widespread harassment raises serious concerns about the institution’s oversight, accountability, and commitment to ensuring the safety and well-being of its students.

Since June 2023, England and Wales have deemed the sharing of deepfake porn illegal as part of the government’s efforts to combat individuals who disseminate intimate images online without the consent of those depicted, targeting abusers, predators, and former partners with malicious intent. Unfortunately, in the United States, several states have yet to revise their anti-revenge porn laws to encompass the use of technology in the creation and distribution of manipulated images.

When AI sexual violence occurs against girls and women, we need to remember first and foremost that these are our fellow humans. Each girl targeted is someone’s daughter, sister, niece, granddaughter, mother, or wife. At any time, she could be us.

The floodgates were opened this week on X and other social media sites when #TaylorSwift trended. Reality Defender, an advocacy group that detects deepfakes, found at least a couple dozen unique AI-generated images, according to the Associated Press, all showing variations of a bloodied or painted Taylor Swift that objectified her and, in some cases, inflicted violent harm on her deepfake persona. Regardless of one’s perceived stature, this behavior is beyond the pale. Our response to those pictures should be completely independent of whatever perceived protection, luxury, or privilege Taylor Swift has. The images violate her fundamental rights to privacy, respect, and autonomy over her own image.

This is by no means a benign issue. Some on X have expressed a bittersweet relief that this is happening to Time’s 2023 Person of the Year, since her enormous platform could help drive needed legal change. However—while I’m an overall believer in silver linings myself—I don’t think that foisting this issue upon the shoulders of Miss Taylor Alison Swift is just. She shouldn’t have to become an advocate over every horrendous thing that happens to her. Swift never chose this fight. While she’s known for being a fierce girl’s girl, she shouldn’t have to carry this burden, much like she shouldn’t have to carry army-grade bandages for knife or gunshot wounds. Nevertheless, rumor has it that she plans to sue. While the specific legal grounds are unknown, this case could be critical in setting legal precedent.

Hackers, malevolent individuals, and basement-dwelling ne’er-do-wells have been violating the images of famous and unknown women alike for at least as long as I’ve been alive, and likely in forms that date back centuries. In the largely pornified society we live in today, women have been told things such as, “Don’t take sensual photos of yourself if you don’t wish for them to fall into the wrong hands.” This type of finger-wagging and so-called “victim blaming” was heavily present in 2014 when “Celebgate” occurred. During this time, a number of female celebrities, including actresses and musicians, had their private and intimate images leaked after hackers accessed their personal iCloud accounts. The hacked images were disseminated on various online platforms, including websites and forums. Affected celebrities included well-known names such as Jennifer Lawrence, Kate Upton, Kirsten Dunst, and others. The breach brought attention to issues of online privacy, cybersecurity, and the vulnerability of personal data stored in the cloud.

Several investigations were launched to identify the hackers responsible, and some individuals were eventually prosecuted for their involvement. The unauthorized sharing of these private images raised concerns about consent, privacy, and the ethics of accessing and distributing such personal content without permission. The incident prompted increased awareness about the potential risks of storing sensitive data online and led to discussions on improving cybersecurity measures for individuals and technology platforms alike. But the technological manipulation of ordinary images makes clear that accountability does not rest solely on the shoulders of women simply trying to go about their daily lives.

Taking Back Our Identity

Sadly, complete prevention is not possible at this stage. There is a long list of actions that we as women can take to protect ourselves, but at times it feels like even the average girlie needs a team of protectors to stay safe from this form of wrongdoing. Here are a few easy and helpful strategies you can implement today to help protect yourself:

1) If you come across deepfakes or manipulated content featuring your identity, report it to the platform hosting the content and relevant authorities. Many platforms have policies against the sharing of non-consensual explicit content.

2) If something horrendous and malicious does occur, depending on the jurisdiction in which you reside, there may be legal options available. Consult with legal professionals to understand your rights and potential courses of action if you are a victim of deepfake abuse.

3) Regularly review and adjust privacy settings on social media platforms. Limit who can view and download your photos to trusted individuals. Further, cultivating a wholesome, uplifting online presence can tip people off that any nasty content attributed to you is more than likely false and didn’t originate from you.

4) It can also be helpful to keep yourself updated on advancements in deepfake detection tools. As technology evolves, thankfully, so do methods for identifying manipulated content. These actions can help mitigate malevolent usage of our identities.

In conclusion, the prevalence of deepfakes poses a serious threat to individuals’ privacy and mental well-being, as highlighted by incidents such as “Celebgate” and the recent victimization of Taylor Swift. The disturbing case of the 14-year-old girl in England, who suffered immensely and ultimately lost her life due to the actions of others creating and sharing explicit deepfakes, underscores the urgency of addressing this issue. As technology advances, the potential for malicious use of deepfake content grows, requiring a collective effort to strengthen legal frameworks, enhance online security, and raise awareness about the profound consequences of these manipulations. It is imperative that society acknowledges the real and lasting impact on the lives of those victimized by deepfake technology, reinforcing the need for stringent measures to protect individuals from the harmful repercussions of this rapidly evolving digital threat.

Freda Donnelly is the host of the Finding the Faith podcast on Rumble and a freelance researcher, writer, and content creator.