Taylor Swift "Furious" Over Deepfake Images, May Take Legal Action: Report


Fake, sexually explicit images of American megastar Taylor Swift, likely generated by artificial intelligence, spread rapidly across social media platforms this week, disturbing her fans. The deepfakes have reignited calls from lawmakers to protect women and crack down on the platforms and technology that spread such images.

One image of the singer was seen 47 million times on X, formerly known as Twitter, before it was removed on Thursday. According to US media reports cited by news agency AFP, the post was live on the platform for around 17 hours.

Taylor Swift is reportedly "furious" that AI-generated nude images of her were circulating online and is weighing possible legal action against the site responsible for generating the photos, according to a report in The New York Post.

"Whether or not legal action will be taken is being decided but there is one thing that is clear: these fake AI-generated images are abusive, offensive, exploitative and done without Taylor's consent and/or knowledge," a source close to the 34-year-old pop star said.

"The door needs to be shut on this. Legislation needs to be passed to prevent this and laws must be enacted," a source close to the star added.

How Has X Responded?

In a statement, X said that "posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content."

The Elon Musk-owned platform said that it was "actively removing all identified images and taking appropriate actions against the accounts responsible for posting them."

It was also "closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed."

The images, however, continued to be available and shared on Telegram.

US law currently affords tech platforms very broad protection from liability for content posted on their sites, and content moderation is largely voluntary or implicitly imposed by advertisers or app stores.

Many well-publicized cases of deepfake audio and video have targeted politicians or celebrities, with women by far the biggest targets.

According to research cited by Wired magazine, 113,000 deepfake videos were uploaded to the most popular porn websites in the first nine months of 2023.
