CELEBRITY
X suspends account that posted Taylor Swift AI porn – only for other accounts to share the same graphic images, which are now circulating on Facebook and Instagram
The images are also circulating on Facebook and Instagram
Reddit has taken action and deleted a channel called ‘TaylorSwiftLewd’
They show the singer posing provocatively in Kansas City Chiefs gear
X has suspended an account that posted AI pornography of Taylor Swift – but several others have already popped up with the same graphic images.
The extremely graphic AI-generated images, which showed the singer posing provocatively in Kansas City Chiefs gear, prompted outrage from her fans on Thursday, with many demanding legal action be taken.
The backlash led to the suspension of one X account sharing the images, but not before they started being shared by other accounts.
Moreover, the images are also circulating on Facebook and Instagram.
The new images show Swift in various sexualized poses. It’s not clear where the images originated. On Thursday morning, ‘Taylor Swift AI’ was a trending topic on X, formerly known as Twitter.
Reddit, meanwhile, appears to have taken action against the fake images, deleting posts featuring them and banning a channel named ‘TaylorSwiftLewd.’
The AI images do not seem to be circulating on TikTok.
DailyMail.com has seen the images in question but will not be publishing them.
They are the latest example of the dangerous rise in popularity of deepfake porn websites, where celebrities and others are finding their likeness plastered across explicit videos and photos without giving permission.
Nonconsensual deepfake pornography is illegal in Texas, Minnesota, New York, Virginia, Hawaii and Georgia. In Illinois and California, victims can sue the creators of the pornography in court for defamation.
‘I’m gonna need the entirety of the adult Swiftie community to log into Twitter, search the term ‘Taylor Swift AI,’ click the media tab, and report every single AI generated pornographic photo of Taylor that they can see because I’m f***ing done with this BS. Get it together Elon,’ one enraged Swift fan wrote.
‘Man, this is so inappropriate,’ another wrote. While another said: ‘Whoever is making those Taylor Swift AI pictures is going to hell.’
‘Whoever is making this garbage needs to be arrested. What I saw is just absolutely repulsive, and this kind of s**t should be illegal… we NEED to protect women from stuff like this,’ another person added.