Taylor Swift AI Deepfakes Spark Call for Tougher Policies


In January, Taylor Swift became the latest victim of AI-generated fake nude images. Experts say we need new laws to stop this kind of AI abuse.

Taylor Swift had a great end to 2023. Her Eras Tour broke records, her concert film was a hit, and Time named her Person of the Year. But in late January, things took a dark turn when fake nude photos of her, made with AI, spread online.

Swift's fans quickly reported the images, and X (formerly Twitter) temporarily blocked searches for her name. This isn't new—women and girls worldwide have faced similar abuse. But Swift's fame brought more attention to the issue, and more people are now calling for new laws.

"We're behind on this, but we can still try to fix the mess that's happening," says Mary Anne Franks, a law professor at George Washington University. She warns that this problem won't stop with celebrities or teenage girls. "It's going to affect politicians, world leaders, and elections."

Swift, who's now a billionaire, might be able to fight back with lawsuits, Franks says. (Swift's team didn't comment on whether she'll sue or support efforts to stop deepfakes.) But Franks thinks we really need new laws banning this kind of content. "If we'd passed laws years ago when experts warned about this technology, we might not be in this situation," she says.

One proposed law that could help is the Preventing Deepfakes of Intimate Images Act, introduced last May. It would ban sharing fake nude photos without consent. Another Senate proposal would let victims sue the people who make and share these fakes.

Experts have been asking for these laws for years. Some states have their own rules, but there's no strong federal law yet. "There's not much federal law about adult deepfake porn," says Amir Ghavi, an AI lawyer at Fried Frank. "There are some related laws, but no specific federal deepfake law."

But even a new law might not solve everything, Ghavi explains. It's often hard to identify whom to charge with the crime. "It's unlikely these people will identify themselves," he says, noting that we can't always tell what software made an image. And even if we could, a law called Section 230 might shield websites from responsibility for what users post. (It's not yet clear whether that law applies to AI-generated content.)

Some human rights groups, like the ACLU, worry that laws that are too broad could cause problems for journalists reporting on deepfakes or political satirists using them.

The best solution might be policies that make AI companies more responsible, says Michael Karanicolas from UCLA's Institute for Technology, Law and Policy. But he adds, "Companies usually only respond to strict regulations." Some platforms have tried to stop AI-made political misinformation, so it's possible, Karanicolas says—but tech-savvy users can often find ways around safeguards.

Digital watermarks, which show when content is AI-made, are one idea the Biden administration and some in Congress support. Soon, Facebook, Instagram, and Threads will start labeling AI-made images. Even if watermarks can't stop people from making deepfakes, they could help social media platforms take them down or slow their spread.

One former policymaker, who advises the White House and Congress on AI rules but wanted to stay anonymous, says moderating content at this scale is possible. They point to how social media companies limit copyrighted material. "We have both legal and technical ways to slow this down," they say. They think Swift—who's as famous as some presidents—could get regular people to care about this issue.

For now, though, the legal situation is unclear, leaving some victims feeling helpless. Caryn Marjorie, a social media influencer and Swift fan who made her own AI chatbot last year, went through an experience much like Swift's. About a month ago, her fans told her about fake nude photos of her online.

The fakes upset Marjorie and made it hard for her to sleep. She reported the account posting the images many times, but it stayed up. "I didn't get treated like Taylor Swift," Marjorie says. "It makes me wonder: Do you have to be as famous as Taylor Swift to get these fake AI nude photos taken down?"