Fake explicit Taylor Swift images: White House is ‘alarmed’

Graphic AI-generated images of Taylor Swift appeared online this week.

Millions of people saw sexually explicit, AI-generated fake images of Taylor Swift on social media this week, an episode that drove home for many the urgency of regulating malicious uses of AI technology.

On Friday, the White House press secretary told ABC News that what happened to Swift online “alarms” them and that Congress “should get involved.”

Taylor Swift performs onstage during “Taylor Swift | The Eras Tour” at Allianz Parque on Nov. 24, 2023 in São Paulo. Buda Mendes/tas23/Getty Images

White House Press Secretary Karine Jean-Pierre called reports of the images’ circulation “alarming,” describing the pictures in question as “false images.”

However, she said, “while social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation, and non-consensual, intimate imagery of real people.”

The White House is not alone in its concern. Fans were shocked to learn that no federal law in the U.S. prevents or deters someone from creating and sharing non-consensual deepfake images.

Just last week, Rep. Joe Morelle renewed his push for a bill that would make the nonconsensual sharing of digitally altered explicit images a federal crime, punishable by jail time and fines.

A spokesperson for Morelle told ABC News, “We’re certainly hopeful that the news about Taylor Swift will help build momentum and support for our bill, which as you know would deal with her exact situation with both criminal and civil penalties.”

Morelle, a Democrat from New York, authored the “Preventing Deepfakes of Intimate Images Act,” which has drawn support from both Democrats and Republicans and has been referred to the House Committee on the Judiciary.

Deepfake pornography is often described as a form of image-based sexual abuse, a term that also covers the creation and sharing of real intimate images without consent.

Creating AI-generated content once required real technical expertise; with the rapid growth of AI technology, it now takes little more than downloading an app or clicking a few buttons.

Experts say an entire industry has now grown up around creating and disseminating digitally fabricated content depicting sexual abuse, and some of the websites that post these fakes have thousands of paying members.

Last year, a town in Spain made headlines around the world after several teenage schoolgirls reported receiving fabricated nude images of themselves created with an easily accessible AI-powered “undressing” app, sparking a broader conversation about the harm these tools can cause.

The sexually explicit Swift images were likely generated using an AI-powered text-to-image tool. Some of the images were shared on X, the social media platform formerly known as Twitter.

Before the account was shut down on Thursday, a post with screenshots of the fake pictures was seen more than 45 million times.

X’s safety team said early Friday morning that it was “actively removing all identified images” and “taking appropriate actions against the accounts responsible for posting them.”

“We have a zero-tolerance policy for non-consensual nudity (NCN) images,” the statement said, adding that the team is monitoring for any further violations so that offending content is removed immediately, and that it is committed to maintaining a safe and respectful environment for all users.

Stefan Turkheimer, vice president of public policy at RAINN, a nonprofit organization that fights sexual violence, said that every day “more than 100,000 images and videos like this are spread across the web,” becoming a virus of their own. He said the organization is angry on behalf of Taylor Swift, and on behalf of the millions of people who cannot take back control of their images.