X blocks searches for “Taylor Swift” after explicit deepfakes go viral
X has disabled Taylor Swift-related searches on its platform in an attempt to curb the spread of fake pornographic images of the singer that began circulating on social media last week.
Since last Sunday, searches for “Taylor Swift” on X have returned the error message, “Oops, something went wrong.” X blocked the search term after pledging to remove the deepfake AI-generated images from the platform and take “appropriate actions” against accounts that shared them.
“Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content,” X said Friday in a post on its official Safety account.
Still, some of the fake images of the pop star continue to circulate on the social network, with some bad actors bypassing the search block by manipulating search terms, such as adding words between the entertainer’s first and last name, CBS MoneyWatch observed in a test of X’s search function.
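X has not disclosed how the block is implemented, but the bypass CBS MoneyWatch observed is consistent with a simple exact-phrase filter. The sketch below is purely illustrative, not X’s actual code: it assumes a hypothetical `is_blocked` check against a blocklist of exact phrases, and shows why inserting a word between the first and last name defeats that kind of filter.

```python
# Illustrative only: a hypothetical exact-phrase blocklist, NOT X's actual implementation.
BLOCKED_PHRASES = {"taylor swift"}

def is_blocked(query: str) -> bool:
    """Naive filter: block a query only if it contains a blocked phrase verbatim."""
    # Lowercase and collapse runs of whitespace so casing/spacing don't matter.
    normalized = " ".join(query.lower().split())
    return any(phrase in normalized for phrase in BLOCKED_PHRASES)

print(is_blocked("Taylor Swift"))     # True  -> search returns the error page
print(is_blocked("Taylor AI Swift"))  # False -> extra word slips past the exact match
```

Under that assumption, any query that breaks up the exact phrase sails through, which matches the workaround users found.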
Reached for comment by CBS MoneyWatch, X sent only an automated reply: “Busy now, please check back later.”
The deepfake images amassed 27 million views and roughly 260,000 likes in 19 hours last week, NBC News reported. They also spread to other social networks, including Reddit and Facebook.
The images’ massive reach lays bare an increasingly pressing issue facing tech companies: how to remove deepfakes, or “synthetic media” images, from their platforms. More than 95,000 deepfake videos were disseminated online in 2023, a 550% increase over the number circulating in 2019, according to the latest report from cybersecurity firm Home Security Heroes.