Millions came across fake sexually explicit AI-generated images of Taylor Swift on social media this week, underscoring for many the need to regulate potential nefarious uses of AI technology.
The White House Press Secretary told ABC News Friday they are "alarmed" by what happened to Swift online and that Congress "should take legislative action."
"We are alarmed by the reports of the…circulation of images that you just laid out - of false images to be more exact, and it is alarming," White House Press Secretary Karine Jean-Pierre told ABC News White House Correspondent Karen L. Travers.
"While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation, and non-consensual, intimate imagery of real people," she added.
Jean-Pierre highlighted some of the actions the administration has taken recently on these issues, including launching a task force to address online harassment and abuse, and the Department of Justice launching the first national 24/7 helpline for survivors of image-based sexual abuse.
And the White House is not alone: outraged fans were surprised to find out that there is no federal law in the U.S. that would prevent or deter someone from creating and sharing non-consensual deepfake images.
But just last week, Rep. Joe Morelle renewed a push to pass a bill that would make nonconsensual sharing of digitally-altered explicit images a federal crime, with jail time and fines.
"We're certainly hopeful the Taylor Swift news will help build momentum and grow support for our bill, which as you know, would address her exact situation with both criminal and civil penalties," a spokesperson for Morelle told ABC News.
A Democrat from New York, the congressman authored the bipartisan "Preventing Deepfakes of Intimate Images Act," which is currently referred to the House Committee on the Judiciary.
Deepfake pornography is often described as image-based sexual abuse -- a term that also includes the creation and sharing of non-fabricated intimate images.
A few years ago, a user needed a certain level of technical skill to create AI-generated content, but with rapid advances in AI technology, it's now a matter of downloading an app or clicking a few buttons.
Now experts say there's an entire cottage industry that thrives on creating and sharing digitally rendered content that appears to feature sexual abuse. Some of the websites hosting these fakes have thousands of paying members.
Last year, a town in Spain made international headlines when a number of young schoolgirls said they received fabricated nude images of themselves that were rendered using an easily accessible "undressing app" powered by artificial intelligence, raising a larger discussion about the harm these tools can cause.
The sexually explicit Swift images were likely fabricated using an artificial intelligence text-to-image tool. Some of the images were shared on the social media platform X, formerly known as Twitter.
One post sharing screenshots of the fabricated images was reportedly viewed over 45 million times before the account was suspended on Thursday.
Early Friday morning, X's safety team said it was "actively removing all identified images" and "taking appropriate actions against the accounts responsible for posting them."
"Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content," read the statement. "We're closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed. We're committed to maintaining a safe and respectful environment for all users."
Stefan Turkheimer, Vice President of Public Policy at RAINN, a nonprofit anti-sexual assault organization, said that on a daily basis "more than 100,000 images and videos like this are spread across the web, a virus in their own right. We are angry on behalf of Taylor Swift, and angrier still for the millions of people who do not have the resources to reclaim autonomy over their images."