X has shut down searches for Taylor Swift as it works to remove all AI-generated nude images of her from the platform.
Searches for “Taylor Swift” or “Taylor Swift AI” currently come back with a message reading “Something went wrong. Try reloading.” Fans can still search for “Taylor Swift pictures,” but the results only include innocent photos of the singer.
The shutdown comes after deepfakes of Swift hit X on Jan. 24. The explicit AI-generated photos first appeared on Celeb Jihad and quickly spread to other social media sites. Taylor Swift is not the first celebrity to fall victim to this kind of fakery, which has drawn concern from both SAG-AFTRA and the White House.
“We are alarmed by the reports of the…circulation of images that you just laid out – of false images to be more exact, and it is alarming,” White House Press Secretary Karine Jean-Pierre said during a press conference.
She continued, “While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation, and non-consensual, intimate imagery of real people.”
SAG-AFTRA Urges Lawmakers to Keep Taylor Swift and Others From Falling Victim to More Deepfakes
According to Deadline, the deepfakes of Taylor Swift were viewed over 27 million times on X and drew more than 260,000 likes. The platform shut down the account that posted them within 19 hours.
Once the news hit, Swift’s fans began flooding social media with positive images and messages tagged “Taylor Swift AI” in an attempt to push down the nude images. Fans also sent the hashtag “Protect Taylor Swift” trending.
“The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal,” SAG-AFTRA wrote in a statement. “As a society, we have it in our power to control these technologies, but we must act now before it is too late.”
“SAG-AFTRA continues to support legislation by Congressman Joe Morelle, the Preventing Deepfakes of Intimate Images Act, to make sure we stop exploitation of this nature from happening again. We support Taylor, and women everywhere who are the victims of this kind of theft of their privacy and right to autonomy.”