Users can't search for Taylor Swift on X anymore, thanks to a flood of deepfake content.

Users can't search for Taylor Swift on X anymore. (Photo by Patrick Smith/GETTY IMAGES NORTH AMERICA/Getty Images via AFP).

X blocks searches for explicit Taylor Swift deepfakes. But what about everyone else?

  • Deepfake content is affecting everyone.
  • X has blocked the search results for Taylor Swift.
  • As explicit deepfake images of Taylor Swift started spreading on social media platforms, her fans were quick to come to the singer’s support.

Deepfake content is becoming a huge problem, and the number of victims whose likenesses are misused in deepfake content keeps growing.

According to a report by Sumsub, deepfake content saw a tenfold increase globally across all industries from 2022 to 2023. North America had the biggest surge at 1,740%, followed by the Asia Pacific region at 1,530%. Europe recorded a 780% jump, while MEA and Latin America witnessed surges of 450% and 410% respectively.

Why the surge in deepfake content?

What’s causing this surge? For a start, creating deepfake content is cheap. While some deepfake generator apps require a small subscription fee, others offer the capability for free.

While there was little harm in this initially, many soon began using deepfake generators for the wrong reasons. Apart from politicians, celebrities were constantly targeted with deepfake content, with many having to go on record on social media to explain that their images had been used without their consent.

Most of this content is used to falsely endorse products, swaying fans of the star towards using them. Other deepfakes have been used to spread misinformation among the public; politicians especially have been targeted, with such content spreading fake news, misinformation and disinformation.

The biggest problem with deepfake content, though, is the inevitability of its use in pornography. Today, a user need only take a picture of anyone, upload it to a deepfake app, and have it produce explicit images within minutes.

X has blocked the search results for Taylor Swift.

Deepfake content and Taylor Swift

The rise of deepfake content is also one of the main reasons why governments are calling for more regulation of AI. Currently, there is no law that stops anyone from generating deepfake content, be it explicit or otherwise. While some copyright laws can be used to fine those who use such content for dishonest marketing purposes, the impact is still minimal and the damage is already done.

While there have been hundreds of celebrities who’ve had their images compromised by deepfake apps, one in particular has been getting so much attention that – in an almost unprecedented example of it doing the decent thing – social media platform X is actually blocking searches for such images.

When explicit deepfake images of Taylor Swift started spreading on social media platforms, her fans were quick to come to the singer’s support, with many reporting the images and flooding social media with different pics to block the spread of the pornographic content. But as usual, the explicit images still managed to find their way to many on the platform.

As such, X has blocked users from searching for Taylor Swift. According to a report by Reuters, searches for Swift’s name on the social media platform yielded the error message, “Something went wrong. Try reloading.”

“This is a temporary action and done with an abundance of caution as we prioritize safety on this issue,” Joe Benarroch, head of business operations at X, said. But it may be a little too late as a New York Times report stated that one explicit deepfake image of Swift shared on X was viewed 47 million times before the account was suspended.

White House Press Secretary Karine Jean-Pierre called the fake images “alarming,” and said social media companies have a responsibility to prevent the spread of such misinformation. Jean-Pierre said at a news briefing that lax enforcement against false images, possibly created by AI, disproportionately affects women.


Taylor Swift, who temporarily doesn’t exist in X search results. Just as well she doesn’t have a new album out… (Photo by Michael TRAN/AFP).

What about everyone else?

Most social media platforms do not allow users to upload explicit images. However, deepfake content does not have to be explicit to cause harm. It can be of any kind, and it’s becoming much harder for detection technology to spot.

While X’s move to block those search terms is welcome, what about other celebrities and individuals who have been victimized by such content? Should the platform simply ban all forms of nudity or pornographic content from being uploaded? Elon Musk, who owns X, believes the social media platform is in the business of promoting free speech – even, where it’s consensual, nude speech.

Taylor Swift is just one of many thousands of individuals who have found deepfake content of themselves circulating on platforms like X. As such content becomes harder to detect and stop, perhaps the best thing for X to do is to ban it from being uploaded – if it can.

Innovations in AI are making it harder to detect deepfake content. Apart from images, deepfake videos and audio recordings are also a big concern for many. The call for regulation continues, but how much of an impact will regulations have on such content? Only time will tell.

At the end of the day, social media platforms need to take collective responsibility for banning and blocking such content on their apps.

Here’s hoping the singer can shake off the latest form of hater.