The White House, Congress, and SAG-AFTRA are all mobilizing against those Taylor Swift porn deepfakes

By William Hughes
Taylor Swift (Photo: Axelle/Bauer-Griffin/FilmMagic via Getty Images)

As regular, mortal humans, we don’t really have a good way to come to terms with just how famous Taylor Swift has become these days. She blew past the level of “household God” a couple of record-setting tours and smashed Spotify metrics ago, and has now become a sort of mobile, album-dispensing avatar of celebrity itself. If something happens to Taylor Swift, it matters, at least in part because it happened to Taylor Swift.

Which is why it’s not wholly surprising to learn that the recent efforts by online subterraneans to flood the internet with AI-generated deepfake pictures of the musical superstar, shown in suggestive or pornographic situations, have drawn the ire and attention not only of the Taylor Swift Fan Swarm—that borderline omnipotent force of great and terrible focus—but also of SAG-AFTRA, the U.S. Congress, and the White House itself. All three have made public comments this week expressing a desire to see action taken against the propagation of the images, up to and including making their spread a federal crime.

White House Press Secretary Karine Jean-Pierre, for instance, said the Biden administration was “alarmed” by the spread of the images, which flooded the internet earlier this week—and which were then countered by highly mobilized efforts to fill “Taylor Swift AI” and other such search terms with banal and benign images. “We are alarmed by the reports of the…circulation of images that you just laid out,” Jean-Pierre said. “Of false images to be more exact, and it is alarming.”

Meanwhile, ABC News reports that New York Rep. Joe Morelle made a push just last week to have “nonconsensual sharing of digitally-altered explicit images” named a federal crime; a representative for Morelle said this week, of the Swift images, that, “We’re certainly hopeful the Taylor Swift news will help spark momentum and grow support for our bill, which as you know, would address her exact situation with both criminal and civil penalties.”

And SAG-AFTRA (which went on strike last summer in part because of serious worries about the use of AI) came out swinging for member Swift, saying in a statement that, “The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal. As a society, we have it in our power to control these technologies, but we must act now before it is too late.”

Of course, none of this is happening just because it happened to the planet’s most famous Kansas City Chiefs fan: AI image generation has gotten both a lot more powerful, and a lot more accessible, over the last two years, and the issues surrounding deepfakes, and especially sexually explicit deepfakes, are only going to get more prominent and pressing in the near future. As she often does, Swift is largely serving here as an incredibly useful lightning rod; things matter because they happen to her, but that also means they can be addressed more directly. (See, for instance, the response from Elon Musk’s Twitter, usually molasses-slow to come down on anything you could remotely spin as “free speech”; the service has reportedly made moves to make it harder to search for the images, which are in explicit violation of its terms of service.)
