#1 - When deepfake videos are just jokes, they aren’t harmful
“Imagine that you found a video of you on the Internet where you were stamping on a kitten to death and everyone thought you were a complete asshole and people hated you and your boss was considering firing you and your family don't want anything to do with you and your friends are starting to leave and you don't know if your life is ever going to be the same again and you don't feel safe walking down the street because now loads of people hate you because of something you never did. That's how it feels to be suddenly put in sex work when you never consented”
Deepfakes are not jokes. They have serious personal and professional consequences for those who are targeted.
#2 - Deepfake videos are only harmful when someone believes they’re real
“My 60 year old dad if he were to see that video, I would never be able to convince him I didn't do it”
The distress of seeing a deepfake (and knowing others have seen it) is very real, as are the social harms of deepfake abuse. Victims also report that even when the content is clearly fake, the experience of being featured in explicit content without their consent is humiliating.
#3 - It is okay to make deepfakes of celebrities because they are public figures
“As soon as you see them, your mind shifts like from saying ‘Oh, that's not my body, why should I care?’ to – ‘Was I just? ...Did I do that? Do people have this of me?’”
Celebrities or people with a public profile do not deserve to be targeted by deepfakes. Non-consensual sexual abuse imagery should not be part of anyone’s job. Celebrities are real people and they suffer the same consequences as non-celebrities, including doubting their own recollections and worrying about other people’s perceptions of them.
#4 - Some people are more deserving of being targeted by deepfakes than others
“The minute that I seem sexually available in any way, shape or form, I'm fair game to humiliate, physically assault, rape and attack... So, there's this idea that the minute you are consensually sexual, everyone can help themselves to you and I've experienced this and I've seen it, and I've seen it happen to other people”
The idea that those who present a sexualised image or who produce sexualised content are more deserving of being targeted by deepfakes is grounded in sexual entitlement and “slut shaming” (stigmatising people, especially women, who violate expectations of appearance and behaviour in regard to sexuality). Deepfake abuse strips victims of their agency and consent, and no one deserves that.
#5 - It’s okay to make deepfakes when there are no legal consequences
“In 2018, I was inebriated at a party and I was used for a man's sexual gratification without my consent. Today, I have been used by hundreds of men for sexual gratification without my consent. The world calls my 2018 experience rape. The world is debating over the validity of my experience today. This situation makes me feel disgusting, vulnerable, nauseous, and violated – and all of these feelings are far too familiar to me”
Legal consequences aside, the harm and distress that abusive imagery causes victims can be profound and long-lasting. Many victims describe it as a form of sexual violence and equate their experiences with those of real-life sexual assault.
#6 - Deepfakes aren’t harmful if they’re not shared online
“But if you look at a whole bunch of these comments, you'll see that so many people out there don't understand what the problem with this is… ‘Is there really any difference between this and some random viewer sexualizing them in my mind?...You're not a victim. What's the problem? Why do you care? This just comes with the job. Like this is normal. This is just a sexual fantasy. It's not that deep.’... And there's a distinct lack of empathy and a lot of people who are genuinely confused as to what's the problem with it. And I'd say consent. Consent is the main problem. Like when someone doesn't choose to sexually engage with you but you sexually engage with them anyway, that is a problem.”
Making deepfakes for your own private use is still a violation of consent, and individuals have been prosecuted, for example, for creating explicit deepfake images of children for their own personal use.