If X doesn’t protect Taylor Swift, it won’t protect you.

Futurism: Leave it to X-formerly-Twitter to find the dumbest possible solution to a serious problem.

Last week, outrage rightfully broke out across the web when horrific AI-generated pornography of the singer and songwriter Taylor Swift went viral on the platform — the latest content moderation scandal to hit the beleaguered social media site following owner Elon Musk's many slashes to X's once-robust trust and safety workforce. The incident has been alarming enough to prompt lawmakers in Washington, and even the White House, to weigh in on the dangers of AI deepfakes and nonconsensual AI porn.

Meanwhile, what measures has Elon Musk's X, the platform where these nonconsensual images were able to go viral, taken to combat the problem? It has simply blocked searches for "Taylor Swift" across the board. Search her name, and all you get is a "something went wrong" error message. Which is... well, one way to do it.

The spread of AI-generated, nonconsensual pornography depicting Taylor Swift on Musk's vanity social media platform, and X's subsequent failure to act, raises a harrowing question: If this is the best response a giant social media platform can muster to protect a high-profile figure like Swift, a billionaire white celebrity with immense resources and public influence, what hope is there for the average person targeted by digital exploitation?

X's response, simply blocking searches for Swift's name, is superficial to the point of uselessness. It is also indicative of a larger problem in the industry's approach to content moderation and user protection.

When a platform struggles (or simply refuses) to protect someone with Swift's resources, it exposes how vulnerable the rest of us are. The risks of AI misuse, deepfakes, and nonconsensual pornography will inevitably fall disproportionately on marginalized communities, including women, people of colour, and those living in poverty. These groups often lack the resources to fight back against digital abuse, and their voices are the least likely to be heard when they seek justice or support.

Platforms like X have a social responsibility to foster healthy digital spaces where no one faces a violation of consent or dignity. Anything less than that is cowardice or complicity.

Subscribe to Joan's Index.

I publish a weekly email featuring one tech essay, plus three curations, recommendations, and ideas. Always free, never ad-supported.

I respect your privacy. I don’t respect Elon Musk.
