Reading Time: 2 minutes
Taylor Swift’s AI-generated Nudes Circulate On Social Media
A post shared on X containing Taylor Swift’s AI-generated nudes garnered over 45 million views, 24,000 reposts and hundreds of thousands of bookmarks before the X user’s account was suspended. A popular media outlet reports that the explicit images originated in a Telegram group dedicated to creating abusive images of women with AI generators. The nudes were also shared on a satirical website known as Celeb Jihad.
Taylor Swift’s influence, and the possibility that she will take legal action, may boost campaigns for stricter policing of the creation and dissemination of AI-generated content. A source close to Taylor Swift told the Daily Mail:
“Whether or not legal action will be taken is being decided but there is one clear thing: these fake AI-generated images are abusive, offensive, exploitative… Legislation needs to be passed to prevent this and laws must be enacted.”
X asserted that it has a zero-tolerance policy toward such content. Yet within minutes of the AI nudes appearing online, multiple accounts distributing the explicit images could be found, and it reportedly took seventeen hours for the fake images circulating on X to be taken down.
Difference Between Deepfake AI Images and Photoshopped Images
Mohit Rajhans, media and tech consultant at Think Start, says that misinformation and harassment are likely to spread faster if this technology is not regulated. “What’s happened is…because of the use of Taylor Swift’s image to do everything from sell products that she’s not affiliated with to doctor her (image) into various sexual acts, more people have become aware of how rampant this technology is. There’s a lot of potential harassment and misinformation that gets spread if this technology is not regulated.”
Deepfakes are AI-generated images that depict real people in fabricated situations. Unlike photoshopped images, deepfakes are generally far more realistic. According to a 2023 report, 98% of all deepfakes online were pornographic, and 99% of those targeted were women; South Korean singers and actresses made up 53% of the individuals targeted in deepfake pornography. The report asserts that with existing technology, users can create a 60-second deepfake pornographic video in under half an hour, for free.