Taylor Swift’s Nude Deepfakes Prompt US Officials to Roll Out Stricter AI Rules

Taylor Swift has become the latest victim of deepfakes after sexually explicit AI-generated images of her made the rounds on social media. The incident prompted US lawmakers to push for stricter rules on the use of artificial intelligence (AI).

Taylor Swift Becomes Deepfake Victim

Researchers have criticized US officials for failing to implement stronger AI regulations before pop sensation Taylor Swift fell prey to deepfakes. Before being taken down, fabricated images depicting the Grammy-winning singer in sexually explicit scenarios, some showing her in a stadium wearing Kansas City Chiefs apparel, were viewed 47 million times online.

According to a professor at George Washington University Law School, Swift and others would not have suffered this kind of abuse if appropriate legislation had been "passed years ago."

"We are too little, too late at this point," Mary Anne Franks said. "It's not just going to be the 14-year-old girl or Taylor Swift. It's going to be politicians. It's going to be world leaders. It's going to be elections."

At a New Jersey high school, a group of teenage girls was targeted with deepfake images when male classmates began circulating fabricated pictures of them in their underwear in group chats.

According to reports, on Oct. 20, one of the boys mentioned the images to a classmate in the group chat, who then brought them to the attention of school officials. One mother said her 14-year-old daughter started crying and reportedly saw other girls at Westfield High School crying in the hallways.

However, lawmakers didn't press for action until deepfake images of Taylor Swift became widely circulated.

X, formerly Twitter, removed Swift's graphic deepfake photos for violating its policies. However, it was too late: the images had already been reposted 24,000 times.

As jokes about how Swift's photos went viral spread online, a 404 Media report indicated the images may have originated in a Telegram group.

In addition to monitoring the situation and deleting the photographs, X stated that its teams were taking "appropriate action" against the accounts that posted the deepfakes.

Stricter AI Rules After Taylor Swift's Deepfakes

Shortly after Swift fell victim to the technology, US senators introduced the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (DEFIANCE Act). Meanwhile, the Preventing Deepfakes of Intimate Images Act, introduced by lawmakers the previous year to forbid the nonconsensual sharing of deepfake pornography, has yet to pass.

Representative Joseph Morelle (D-New York), who introduced the Preventing Deepfakes of Intimate Images Act, called on other lawmakers to step up and take urgent action against the rise of deepfake images and videos.

Such images and videos "can cause irrevocable emotional, financial, and reputational harm," Morelle said, adding that the abuse "disproportionately affects the female population."

Senate Majority Whip Dick Durbin (D-Illinois) echoed the same sentiment. According to Durbin, victims could lose their jobs and suffer depression or anxiety.

"Although the imagery may be fake, the harm to the victims from the distribution of sexually explicit 'deepfakes' is very real," he explained.

"By introducing this legislation, we're giving power back to the victims, cracking down on the distribution of 'deepfake' images, and holding those responsible for the images accountable."

Franks, who is also president of the Cyber Civil Rights Initiative, reiterated that the situation might have been avoided had the law been approved years ago, when advocates warned that this kind of abuse was a likely consequence of the technology. Lawmakers, she said, are acting too little, too late.

"We can still try to mitigate the disaster that's emerging," Franks said.

According to a 2023 study, the production of manipulated images has increased by 550 percent over the previous five years, with 95,820 deepfake videos uploaded to the internet in the last year alone.

Seventy-five percent of respondents to a Dailymail.com/TIPP poll agreed that sharing deepfake pornographic images online should result in criminal penalties.

Deepfake technology uses artificial intelligence to manipulate a person's face or body, and no federal regulations currently exist to prevent individuals from producing or sharing such images.

Check out more news and information on Technology in Science Times.