Taylor Swift deepfake pornography controversy

In late January 2024, sexually explicit AI-generated deepfake images of American musician Taylor Swift proliferated on the social media platforms 4chan and X (formerly Twitter). The images led Microsoft to enhance Microsoft Designer's text-to-image model to prevent future abuse.[1] Several artificial images of Swift of a sexual or violent nature spread quickly,[2] with one post reported to have been seen over 47 million times before its eventual removal.[3] These images prompted responses from anti-sexual assault advocacy groups, US politicians, Swift's fans, and Microsoft CEO Satya Nadella, among others, and it has been suggested that Swift's influence could result in new legislation regarding the creation of deepfake pornography.

Background

American musician Taylor Swift has been reported by journalists to have been the target of misogyny and slut-shaming throughout her career.[4][5] American technology corporation Microsoft offers AI image creators called Microsoft Designer and Bing Image Creator, which employ censorship safeguards to prevent users from generating unsafe or objectionable content. Members of a Telegram group discussed ways to circumvent these censors to create pornographic images of celebrities.[6] Graphika, a disinformation research firm, traced the creation of the images back to a 4chan community.[7][8]

Cultural significance

Deepfake pornography has remained highly controversial and has affected figures ranging from other celebrities to ordinary people, most of whom are women.[22] Journalists have opined that the involvement of a prominent public figure such as Swift in the dissemination of AI-generated pornography could bring public awareness and political reform to the issue.[23]