Addison Rae & The Danger Of Deepfakes

Addison Rae has found herself at the center of controversy over the past week. 

ICYMI, an x-rated video appearing to show Addison engaging in a sexual act has gone viral. First shared on Twitter in October 2020, the video has since been debunked as a fake. 

A mystery Twitter user, @Idkijustworkh11, posted the explicit video. However, the account has since been suspended for violating Twitter’s media policy. 

@Idkijustworkh11 allegedly posted just three times before the account was taken down, all of them fake x-rated videos of Addison. 

@defnoodlesspicy on TikTok: “Addison Rae’s leaked video is actually a deep fake” #greenscreen #addisonrae #defnoodles

♬ Do It To It – ACRAZE

As fans flooded Twitter to defend the TikTok dancer, two central questions remained: how did a fake explicit video of Addison go viral? And more importantly, why did so many Twitter users believe it was real? The simple answer is deepfake technology.  

Deepfakes are a form of ‘Synthetic Media’

Put simply, deepfakes rely on artificial intelligence (AI) to replace the “likeness of one person with another in video and other digital media.”  

In other words, deepfake technology is Photoshop’s sinister sister.  

Deepfakes rose to viral fame on the r/deepfakes subreddit, where users began sharing edited pornographic videos featuring celebrities in 2017. Although Reddit has since banned r/deepfakes and related subreddits, this has not stopped other online communities from creating and sharing “fake celebrity porn.” 

According to a study by AI firm Deeptrace Technologies, approximately 15,000 deepfake videos had gone live by September 2019. Perhaps more concerning was the fact that “96% (of these videos) were pornographic and 99% of those mapped faces from female celebrities on to porn stars.” 

Deeptrace also found that among the top four websites devoted to deepfake pornography, the first platform established had garnered almost 135 million video views between 2018 and 2019.  

Given that both viewership and creation of this content have continued to skyrocket, there is clearly an alarming demand for non-consensual pornography. 


The Dangers of Deepfakes 

Creating a realistic deepfake requires detailed facial data. With this in mind, it is no surprise that most deepfakes use celebrity faces. However, just because celebrities and social media influencers are public figures doesn’t mean that this technology isn’t dangerous for us average social media users.  

For many of us, deepfakes are blurring the line between truth and fiction. With the majority of Internet users unaware of deepfake technology, these clips often deceive viewers into believing the footage is real. Over the past few years, we have seen deepfake videos of President Barack Obama and Mark Zuckerberg go viral. With these AI-generated figures often voicing controversial and offensive opinions, such videos destabilise an already volatile media and political landscape.

While deepfakes are dangerous for political stability, they also pose a threat to women and their safety. As the Internet’s latest victim, Addison’s viral deepfake is the perfect example of how this format of pornography is almost always non-consensual. As author and professor Danielle Citron told Deeptrace, “Deepfake sex videos say to individuals that their bodies are not their own.” By weaponising deepfakes to degrade and silence women, these AI-generated videos uphold patriarchal ideas of women as sexual objects. 

With deepfakes continuing to harm female celebrities and regular women alike, the digital space is quickly becoming unsafe for many women. This type of content can ruin careers and reputations and compromise mental health.

Ultimately, as deepfakes become more realistic and accessible, internet users continue to act out their sexual fantasies in technological form. Not only does this compromise internet users’ sense of reality, but it also poses a direct threat to the wellbeing and safety of women.
