By: Emily Rawle (she/her)
Disclaimer: This blog will not share the names of victims but will use their statements to emphasise the impact of their experiences.
Trigger Warning: Sexual abuse / harassment, revenge pornography
There is nothing new about the idea of manipulating photos and videos to create illusions and alternate realities. Technology like Photoshop has only grown more capable and more accessible. However, there is concern that things have already gone too far when it comes to the facial manipulation of videos. So, let’s talk about Deepfake.
Deepfake is our sci-fi dystopian nightmare come true. It traces back to the Video Rewrite program, created in 1997 by Christoph Bregler, Michele Covell, and Malcolm Slaney, a system that altered existing video footage to show someone mouthing words they never spoke in the original. However, the technology we see on social media today builds on Ian Goodfellow’s 2014 invention of Generative Adversarial Networks (GANs), which lets users take a video and place someone else’s face on it, a step above the older Video Rewrite system (Q5id, 2022).
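To make the “adversarial” part of that name concrete, below is a minimal sketch of the idea behind a GAN: two small networks trained against each other, a generator that produces samples and a discriminator that judges whether they look real. This toy trains on random 2-D points rather than faces, and everything in it is illustrative, not code from any actual deepfake tool.

```python
# Toy illustration of adversarial training (the "GAN" idea):
# a generator learns to produce samples the discriminator
# cannot tell apart from "real" data. Uses random 2-D points,
# not images -- it only demonstrates the training dynamic.
import torch
import torch.nn as nn

torch.manual_seed(0)

generator = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
discriminator = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

real_data = torch.randn(256, 2) * 0.5 + 2.0  # stand-in "real" distribution

for step in range(500):
    # Train the discriminator to separate real samples from fakes.
    noise = torch.randn(64, 8)
    fake = generator(noise).detach()
    real = real_data[torch.randint(0, 256, (64,))]
    d_loss = (loss_fn(discriminator(real), torch.ones(64, 1))
              + loss_fn(discriminator(fake), torch.zeros(64, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Train the generator to fool the discriminator.
    noise = torch.randn(64, 8)
    g_loss = loss_fn(discriminator(generator(noise)), torch.ones(64, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```

Real deepfake systems are vastly larger and often combine this adversarial training with other components, but the push-and-pull between forger and detector is the core idea.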
On social media apps, Deepfake technology has been used to create memes. One popular genre makes it look like Joe Biden is on a Discord call with Donald Trump and Barack Obama, doing everything from discussing their favourite Pokémon to making fun of each other and American politics. There are even wilder examples featuring just about anyone. With only small voice and image samples needed to create anything, the only boundary seems to be imagination.
Concerns about Deepfake are not new. Politicians and businesses have long voiced concerns about this technology and how easily it allows misinformation to spread. When used maliciously, it can antagonise groups, blackmail people, commit fraud, and potentially cause political disasters. Fears of Deepfake content are common on the internet, and some predict it may soon be indistinguishable from authentic footage.
People have already been caught using deepfake porn for self-gratification, despite the harm it does to victims. As his wife cries in the background, a famous streamer calls himself an idiot while explaining why he had deepfake pornography of his female peers open in his browser tabs. His wife sniffles into a tissue behind him as he claims he made a mistake and didn’t mean to hurt anyone by going onto that website, which he says he saw advertised on Pornhub. Since the apology went viral, the streamer has gone offline, and the website in question has allegedly been deleted (Vice, 2023).
Public response was swift, and many people empathised with the women affected, but there were also those who felt the man had done nothing wrong. Of course, there will always be people who won’t understand the issue; some are probably frothing over pornography of their celebrity crush. They can’t empathise because they have no fear of someone stealing their face and pasting it onto pornography without their consent. Excusing deepfake porn by blaming victims for having an online presence perpetuates the harassment of women who dared to build a large following.
There have already been many instances of male fans disrespecting female streamers and sexualising them unprompted, excusing it with the fact that the women are in predominantly male communities. Many female streamers, including victims of this incident, have already spoken publicly about spending thousands of dollars to have deepfake porn of themselves removed from the web, and so many empathised with the dread the victims must have felt on realising a friend consumed the very pornography they had paid so much to remove.
After the revelation, one of the violated women released a statement saying, “I have created zero sexual content in my three years on twitch. Despite this, my face was stolen so men could make me into a sexual object to use for themselves… this situation makes me feel disgusting, vulnerable, nauseous and violated.”
Deepfake pornography is non-consensual sexualisation on a whole new level: it is not quiet or subtle, it is blatantly dehumanising. It does not need to be carefully analysed like other forms of abuse; it is the outright disregard of consent for the sexual gratification of someone else. It is virtual rape, and it has already been used against people for revenge or humiliation.
A famous case of deepfake pornography being weaponised occurred in 2018, when a female Indian journalist who reported on the horrific assault of an eight-year-old girl suddenly found her online profiles swarmed with doctored pictures and videos of her in pornography. She faced constant harassment, with the images even being shared within political parties. She concludes that deepfake is “a very, very dangerous tool and I don’t know where we’re heading with it.” (Dazed, 2022)
Writing for the Huffington Post, she described how humiliating and dehumanising the experience was; it sent her to the hospital with anxiety and heart palpitations. She even reported that when she went to the police about the harassment, six officers watched the video while smirking at her. Action was only taken when the United Nations stepped in, imploring the Indian government to act, and even after the harassment slowed, the impact has stayed with her (Huffington Post, 2018).
A 2019 study found that 96% of deepfake content was pornographic in nature, and that most subjects were women, ranging from famous Western actresses to K-Pop idols. Some depicted political figures from ultra-conservative countries in gay porn, made to create unrest within those countries. Horrifyingly, the study found an “ecosystem” of websites and forums where people shared, discussed, and even collaborated on creating deepfake porn, with users requesting pornographic deepfakes of women they know, such as ex-girlfriends (Wired, 2019). Predictably, this is a gendered issue. We have already seen how far people are willing to go to sexualise those with a large following.
The potential for this technology to be abused is a no-brainer; sadly, it is also a no-brainer that this abuse is skewed towards women. Sex is a weapon used to shame and devalue women, and deepfake porn is making it easier to turn a woman who is not outwardly sexual into a sexual object.
Imagine someone in your life messages you saying they’ve found a pornographic video of you online… but you’ve never even taken a nude, so how has someone found a video of you having sex? Deepfake technology has made it possible for anyone to take a video of your face and impose it onto a pre-existing video, and to an untrained eye, it can easily pass as you.
How can we protect ourselves?
It is never the victim’s fault for being targeted, and it should not be a potential victim’s responsibility to minimise their profile, but there are things we can do in the meantime while our laws and cybersecurity catch up. It can be difficult to predict whether you may become a target of deepfake extortion or harassment, but public profiles are easier sources of collectable material. Standard cybersecurity measures can also minimise the risk of your images and videos being stolen and distributed by an anonymous third party; best practice is a zero-trust policy. Verify anything you see, double-check the source of information and messages, and run image searches if a suspicious profile tries to contact you. Watermarking or digitally fingerprinting your images will make it harder for someone to create content of you (Security Intelligence, 2021).
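To make the watermarking advice concrete, here is a minimal sketch using the Pillow imaging library to tile semi-transparent text across a photo. The filename and handle are placeholders, and this is one simple approach, not the specific method the Security Intelligence article prescribes.

```python
# A simple visible watermark: tile semi-transparent text across
# an image so it is hard to crop or clone out. Filenames and the
# handle below are placeholders.
from PIL import Image, ImageDraw

img = Image.open("profile_photo.jpg").convert("RGBA")
overlay = Image.new("RGBA", img.size, (0, 0, 0, 0))
draw = ImageDraw.Draw(overlay)

text = "@my_handle"
for x in range(0, img.width, 200):
    for y in range(0, img.height, 100):
        draw.text((x, y), text, fill=(255, 255, 255, 90))  # low alpha = subtle

watermarked = Image.alpha_composite(img, overlay).convert("RGB")
watermarked.save("profile_photo_watermarked.jpg")
```

A tiled mark raises the effort needed to harvest clean source images far more than a single corner logo, which is trivial to crop away.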
It's sad that we’re at a point where precautions need to be taken so our faces aren’t stolen and pasted onto pornography without our consent, but it should not be surprising. For centuries, the sexuality of women has been shamed, while simultaneously, misogynists seek to sexualise women in any way that they can, whether it’s intended to gratify themselves, or tear her down.
The law needs to catch up with this issue before more women face the horror. Currently, the only country moving to strictly outlaw the sharing of deepfake pornography is the UK: in November 2022, legislators announced that deepfakes would be covered in the upcoming “Online Safety Bill”, which plans to update the rules for harmful content online. Unfortunately, there is not much legal coverage of deepfake technology globally. In the US, only three states (Virginia, Texas, and California) reference deepfakes at all (DPR, 2022). In Australia, no existing legislation addresses deepfakes directly, and there are already fears that the law is too far behind (Financial Review, 2019).
Hopefully, as the conversation around deepfakes and their potential for harm grows louder, lawmakers will pay attention and act to make this material illegal to create and distribute. There are already enough stories of people who have been targeted and sexualised against their will by those wanting to consume doctored pornography of them.
When it comes to the creation of deepfake pornography, there is no such thing as consent. This technology has always been incredibly dangerous and has the potential to destroy lives, whether it is used as revenge against an ex through pornography or to create a political scandal through misinformation.
Sources:
Ban, T. (2022). A Quick History of Deepfakes: How It All Began. [online] Q5id Proven Identity Management. Available at: https://q5id.com/blog/a-quick-history-of-deepfakes-how-it-all-began
Davis Political Review. (2022). Deepfakes and American Law. [online] Available at: https://www.davispoliticalreview.com/article/deepfakes-and-american-law
Mukherjee, S. (2023). Atrioc caught paying for deepfakes of fellow streamers. [online] TalkEsport. Available at: https://www.talkesport.com/news/atrioc-caught-paying-for-deepfakes-of-fellow-streamers/
Cole, S. (2023). Deepfake Porn Creator Deletes Internet Presence After Tearful ‘Atrioc’ Apology. [online] Vice. Available at: https://www.vice.com/en/article/jgp7ky/atrioc-deepfake-porn-apology.
Vincent, J. (2022). UK plans to make the sharing of non-consensual deepfake porn illegal. [online] The Verge, 25 Nov. Available at: https://www.theverge.com/2022/11/25/23477548/uk-deepfake-porn-illegal-offence-online-safety-bill-proposal
White, J. (2022). Inside the disturbing rise of ‘deepfake’ porn. [online] Dazed. Available at: https://www.dazeddigital.com/science-tech/article/55926/1/inside-the-disturbing-rise-of-deepfake-porn.
Ayyub, R. (2018). I Was The Victim Of A Deepfake Porn Plot Intended To Silence Me. [online] HuffPost UK. Available at: https://www.huffingtonpost.co.uk/entry/deepfake-porn_uk_5bf2c126e4b0f32bd58ba316.
Simonite, T. (2019). Most Deepfakes Are Porn, and They’re Multiplying Fast. [online] Wired. Available at: https://www.wired.com/story/most-deepfakes-porn-multiplying-fast/.
Poremba, S. (2021). How to Protect Against Deepfake Attacks and Extortion. [online] Security Intelligence. Available at: https://securityintelligence.com/articles/how-protect-against-deepfake-attacks-extortion/.
Davidson, J. (2019). Australian law behind the times on deepfake videos. [online] Australian Financial Review. Available at: https://www.afr.com/technology/australian-law-behind-the-times-on-deepfake-videos-20190703-p523pi.
Media Sources:
US Presidents Play Smash or Pass: US Presidents Make a Pokémon Smash or Pass Tier List
Maya Response: https://twitter.com/mayahiga/status/1620586546083803136?s=20&t=WfAYVdTqUO4L1SAQUZifQA
Atrioc apology: https://twitter.com/Dexerto/status/1620091517259087874?s=20&t=4plN4PH5Pp0N3bPePw-Ecw
The worst take ever: https://twitter.com/RatiosCrazy/status/1620758134955573248?s=20&t=WfAYVdTqUO4L1SAQUZifQA