Fighting for Control: How Sex Workers Survive Deepfake Proliferation
The uncontrolled acceleration of artificial intelligence (AI) and its uses has induced anxiety for many in a multitude of ways: from automating jobs and eliminating entire career paths to being maliciously used to co-opt our identities and ideas. The threat, and the already realized consequences, that these technologies present are felt deeply and widely. This is especially true for sex workers online.

As headlines about deepfake technology plaster the news, pornography is often at the centre of the debate. Deepfakes – which broadly refers to content that seems genuine but has been manipulated or completely generated by AI – are widely weaponized in revenge porn and commercialized by the porn industry. But what about the sex workers whose entire livelihood depends on pornographic content creation? At the nexus of being victimized by the unconsented use of their likenesses in deepfakes and facing income precarity due to deepfakes' increased use, online sex workers feel the brunt of the skyrocketing use of AI technologies.

The Rise of Deepfake Porn

Deepfakes are clearly dangerous in their ability to artificially depict real people and events, allowing for misuse in blackmail, disinformation schemes, and, notably, non-consensual pornography. Pornography in general has been greatly shaped by AI in the past few years. Content is now customizable, usually depicting women who conform precisely to the user's preferences in areas like appearance, personality, and sexual prowess. Despite being artificially generated, this content relies on human-made material for training, and can further be used to fully recreate someone else's likeness. While the explosion in their use is undeniably recent, the leveraging of AI and machine learning to generate non-consensual porn has been an ongoing discussion within feminist and sex-worker circles for over a decade.

The Harms of Image-Based Sexual Violence 

The use of people's faces or bodies, both as a source for generation and in the final product of the deepfake, constitutes a form of image-based sexual abuse (IBSA): the non-consensual distribution of pornographic content. In the 2010s, most victims of deepfake IBSA were celebrities, with Taylor Swift, Gal Gadot, and Scarlett Johansson all being shown in poorly-generated pornographic contexts. While this was still a serious violation of the aforementioned women, the technology was significantly less sophisticated and required far more equipment to use, preventing it from fully diffusing into common use and victimization. Deepfake IBSA can target anyone – although it certainly predominantly affects women – and can have harrowing emotional repercussions on those affected by it. Online sex workers, however, are unique in that these repercussions directly extend into their livelihood, adversely affecting their income as well as their safety.

Centering Sex Workers

‘Sex workers’ is a broad category that refers to “adults who receive money or goods in exchange for consensual sexual services or erotic performances, either regularly or occasionally.” This of course extends into the digital space. Most notably, a category of sex workers has emerged from platforms such as OnlyFans, Fansly, and Fanvue, where content is held behind a paywall and creators are able to keep the majority of their earnings. While these platforms are certainly not perfect and face criticism from creators, they undoubtedly present advantages in comparison to the notoriously exploitative porn industry and the risks associated with in-person sex work. However, having their entire livelihood be contingent on the production of digital content renders these digital sex workers particularly vulnerable to deepfakes. Sex workers have rung the alarm on deepfakes and the fact that 98% of deepfake content is pornographic, and is therefore based on the theft of sex workers’ digital bodies. This erases them as workers, as copyright owners, and as people.

The Fight for Control

Sex workers—digital or otherwise—have long fought for control over their bodies, labour, and working conditions. Significant gains have been made: many decriminalization efforts have been successful and plenty of workers have unionized. However, the control that advocates have lobbied for may be eroded by deepfakes. Online sex workers can have their content co-opted in many ways, all of which simultaneously amount to IBSA as well as theft of income and intellectual property. Many sex workers express deep concern for their future vis-à-vis deepfakes, yet are left unsupported by current online regulations that instead further victimize them. Sex workers, despite being at risk in such an intersectional fashion, are largely unconsulted by policymakers in discussions of AI regulation, undermining the control and autonomy that they seek.

Legal Considerations

Deepfakes have been described as “misogyny in action.” They represent the loss of control over one’s own images; the imagery that deepfakes steal and use, after all, belongs to someone. With the added component of this imagery being a source of income, important questions are raised regarding sex workers’ paths to recourse in the context of both copyright infringement and employment law. Victims of IBSA are using copyright law to reclaim ownership over their identities, relying on current legislation such as the USA’s Digital Millennium Copyright Act, which facilitates the removal of copyrighted material. Platforms have also vowed to do their part in combatting deepfake IBSA through new reporting procedures, policies, and terms of service; they can also protect their creators’ content through digital rights management. However, that is easier said than done.

Access to legal recourse is a primary concern, as sex workers’ control over their labour is limited and their existence is scrutinized. An important intervention should be made here: not all digital sex workers limit themselves solely to online content creation, and many partake in other forms of sex work; further, online sex workers perceive high levels of societal stigma around their career. This can result in both criminalization and stigmatization that obfuscate the path to justice through a legal system often criticized as already hostile to sex workers. That being said, some sex workers are using employment law as a means to gain control of their intellectual property.

The Sex Worker as a Worker

Sex workers usually qualify as independent contractors – a classification that, while deceptive in some cases, is rather apt for the average OnlyFans or OnlyFans-adjacent creator. While brothel workers and strippers alike have sought to unionize as employees to protect against deepfakes and the broader exploitation of their likeness, digital sex workers may not have the same recourse. Advocates for classifying all sex workers as employees, however, argue that such classification can result in better protection under the law – especially as deepfakes further pervade the industry.

A Use for Deepfake? 

In centring sex workers’ agency in this conversation, it is important to mention that not all online sex workers consider deepfakes a threat. Instead, some view them as an opportunity. Consensual deepfakes, and AI in general, can hypothetically help sex workers expand their work through the creation of ‘clones’ – although these clones can easily be reproduced and co-opted without their creators’ knowledge. And such clones are surging in popularity, outcompeting the real thing and further undermining the income of millions of sex workers.

A Necessary Perspective 

If one thing is obvious, it is that sex workers, specifically those who produce digital content, are at risk in more ways than one when it comes to deepfakes’ tightening grip on the sex industry. Beyond the traumatic reality of IBSA, sex workers are forced to reckon with AI’s capacity to eliminate the control that they have fought so hard for. When policymakers and parliamentarians convene to discuss deepfakes, it is crucial that they consult the people who feel their expansion the most.

Edited by Syona Vashisth.

By Lauren Avis