The lesson we must learn from Taylor Swift deepfakes: An opinion

Editor’s Note: Laurie Segall is a longtime technology journalist and the founder of Mostly Human, an entertainment company that produces documentaries, films, and digital content focused on the intersection of technology and humanity. She is the author of “Special Characters: My Adventures with Tech’s Titans and Misfits.” She was previously CNN’s senior technology correspondent. The opinions expressed in this commentary are her own. Read more opinion on CNN.


Sexually explicit AI-generated photos of pop superstar Taylor Swift have inundated the internet, and we don’t need to calm down.


Swift may be one of the most renowned women in the world, but she epitomizes every woman and every girl when it comes to what’s at stake in the future of artificial intelligence and consent.

I’ve been extensively covering the impact of technology for nearly 15 years, and I believe sexually explicit deepfakes are one of the most substantial threats we face with advances in AI. With the proliferation of AI-generated tools and Silicon Valley’s inclination to race to innovate, we are entering a phase of tech that feels familiar — only now, the stakes are even higher.

We are in an era where it’s not just our data that’s up for grabs, it’s our most intimate qualities: Our voices, our faces, our bodies can all now be mimicked by AI. Put simply: Our humanity is a click away from being used against us.

And if it can happen to Swift, it can happen to you. The biggest mistake we can make is believing that this type of harm is reserved for public figures. We are now seeing a democratization of image-generating apps enabling this type of behavior. Did your crush reject you? There’s an app for that. Now, you can digitally undress her or create your own explicit deepfake starring her.

The problem will only get worse as we transition into augmented and virtual worlds. Imagine an immersive environment where a scorned ex invites others to collectively view a sexually explicit deepfake video of the girl who rejected him. Earlier this month, it was reported that British police are investigating the case of a 16-year-old who was allegedly raped in a virtual world by multiple attackers.

I recently spoke to George Washington University professor Dr. Mary Anne Franks, who focuses on civil rights, tech, and free speech. She had a chilling warning: These types of apps and AI tools could lead to a new generation of young men with a “my wish is AI’s command” mentality. If we’re not cautious, not only will we create a new generation of victims, but also a new generation of abusers.

“We’ve just made all these tools — confused, resentful, angry young men are just using [them] instead of trying to sort through what it means to deal in a healthy way with rejection,” Franks said.

Using advances in technology to humiliate women is nothing new. In 2015, I developed a series at CNN called “Revenge Porn: The Cyberwar Against Women.” At the time, non-consensual pornography — in which a scorned ex or bad actor published nude photos of women on websites devoted to shaming them — was rampant. Like today, the laws had yet to catch up, and tech companies weren’t yet making changes to protect victims.

… (Continued)
