Fake pornographic images of Taylor Swift created using artificial intelligence are circulating on social media, leaving her devoted fanbase questioning why there isn't more control over the nonconsensual production of X-rated images.
The images in question — referred to as “deepfakes” — depict Swift in various sexualized positions at a Kansas City Chiefs game, an acknowledgment of her highly-publicized relationship with the team’s tight end, Travis Kelce.
Who created the images or first shared them to X was not immediately clear, though as of Thursday morning "Taylor Swift AI" was a trending topic on the platform, with over 58,000 posts about the subject.
Swifties united and attempted to bury the images by posting an abundance of positive content about the 34-year-old songstress.
“How is this not considered sexual assault??” one X user inquired. “We are talking about the body/face of a woman being used for something she probably would never allow/feel comfortable how are there no regulations laws preventing this.”
“When i saw the taylor swift AI pictures, i couldn’t believe my eyes. Those AI pictures are disgusting,” another said.
Other outraged Swift fans labeled whoever created the images "disgusting" and said occurrences like these "ruin the [AI] technology."
“Whosoever released them deserves punishment,” yet another chimed in.
Swift’s publicist, Tree Paine, did not immediately respond to The Post’s request for comment.
In October, President Joe Biden signed an executive order to further regulate AI that, among other things, aims to prevent "generative AI from producing child sexual abuse material or producing non-consensual intimate imagery of real individuals" and imposes further oversight of the tech's use in developing biological materials.
The order also demands that the federal government issue guidance "to watermark or otherwise label output from generative AI."
Nonconsensual deepfake pornography has also been made illegal in Texas, Minnesota, New York, Hawaii and Georgia, though such laws haven't stopped AI-generated nude images from circulating at high schools in New Jersey and Florida, where explicit deepfakes of female students were shared by male classmates.
Last week, Reps. Joseph Morelle (D-NY) and Tom Kean (R-NJ) reintroduced a bill that would make the nonconsensual sharing of digitally altered pornographic images a federal crime, punishable by jail time, a fine or both.
The "Preventing Deepfakes of Intimate Images Act" was referred to the House Committee on the Judiciary, but the committee has yet to act on the bill.
Aside from criminalizing the sharing of digitally altered intimate images, Morelle and Kean's proposed legislation would also allow victims to sue offenders in civil court.
In an example of how convincing this technology can be, several Swift fans were reportedly scammed out of hundreds of dollars earlier this month after tricksters ran advertisements using AI-generated video of the Grammy winner peddling Le Creuset cookware, in an attempt to steal fans' money and data.
The ads — which appeared across social media platforms — show Swift standing next to a Le Creuset Dutch oven, which, according to the company's official website, runs anywhere from $180 to $750 depending on size and style.
Last year, other deepfake images of Pope Francis in a Balenciaga puffer jacket and Donald Trump resisting arrest also took the internet by storm.