Stop Expecting Facebook to Fix What We Can't Fix in Real Life
There's been a lot of hand-wringing about social media moderation of late.
"Platform X isn't doing enough to stop harmful content!"
"Platform Y is censoring too much speech!"
"Platform Z needs better content moderation!"
But there's something deeply confused about these conversations. We seem to be demanding a standard of moderation from social media platforms that we don't even attempt to achieve in real-world social spaces.
The Public Square Analogy
The most popular argument seems to be that social media platforms are the new public square. And to be fair, it's an analogy that the platforms themselves, in all their egotistical puffery, happily promote. If we take the analogy seriously, we should look at how we actually moderate real public squares.
Go to any park in a major city. You'll find:
- Street preachers warning about eternal damnation
- Political extremists handing out pamphlets
- Conspiracy theorists with signs about lizard people
- People having heated arguments about controversial topics
- People confidently repeating objectively false claims about vaccines, history, and science
We generally let all of this happen. The only time we intervene is when someone crosses the line into direct harassment or explicit threats of violence. Even then, enforcement is spotty at best.
So when we demand that Twitter or Facebook eliminate all "harmful misinformation" or "toxic content," we're asking them to achieve something we've never managed in physical public spaces. This seems... optimistic.
The Private Venue Model
"But social media platforms aren't really public squares - they're more like private venues, such as bars or restaurants. Those places can and should set their own rules!"
Fair enough. No shirt, no shoes, no service. I’ll take that pivot.
But let's think about how moderation actually works in a bar:
- They kick people out for obvious bad behavior (fighting, harassment, extreme disruption)
- They don't monitor every conversation
- They don't fact-check patrons' claims
- They don't try to ensure all discussions remain civil and productive
If X, Bluesky, or Threads were actually run like a bar, moderators would only step in for the most egregious violations. They wouldn't be expected to referee every heated argument or evaluate the truthfulness of every claim.
But somehow, we expect social media platforms to:
- Monitor billions of posts in real-time
- Make perfect judgment calls about complex issues
- Consistently enforce rules across different cultural contexts
- Prevent the spread of misinformation without impeding legitimate discourse
- Keep everyone safe while preserving free expression
This is like demanding that every bar hire thousands of moderators to hover over each table with a fact-checking database and a conduct rulebook. It’s not only untenable, it’s hardly in the spirit of a good night out.
The Scale Problem
"But social media is different because of its scale and reach!"
Sure - but this cuts both ways. The massive scale of social media means:
- Perfect moderation is effectively impossible (billions of posts per day in hundreds of languages)
- Even a tiny error rate produces visible mistakes by the million (see the sketch after this list)
- Any automated system will have false positives that anger users
- Any human system will be inconsistent across moderators
- Context and nuance become exponentially harder to evaluate
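To put a number on that "tiny error rate" point, here's a quick back-of-envelope sketch in Python. Both figures are illustrative assumptions, not reported platform statistics:

```python
# Back-of-envelope: what a "tiny" error rate means at platform scale.
# Both numbers below are illustrative assumptions, not real platform data.

posts_per_day = 1_000_000_000  # assume roughly a billion posts per day
accuracy = 0.999               # assume a generous 99.9% correct-call rate

errors_per_day = posts_per_day * (1 - accuracy)
print(f"{errors_per_day:,.0f} wrong moderation calls per day")
print(f"{errors_per_day / 86_400:,.1f} wrong calls per second")
```

Even granting an implausibly accurate system, that's a million wrong calls a day - more than eleven every second - and each one is a screenshot waiting to go viral.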
The larger the platform, the more impossible it becomes to achieve the level of moderation people seem to want. It's a fundamental scaling problem.
A More Realistic Approach
Maybe we need to adjust our expectations. Instead of demanding an impossible standard of moderation, we could:
- Accept that online spaces, like physical ones, will never be perfectly "clean" or "safe"
- Focus moderation on clear, enforceable rules rather than subjective judgments
- Put more emphasis on user controls (muting, blocking, curating feeds)
- Acknowledge that different platforms can have different standards
- Most importantly, stop treating content moderation as a solution to deeper social problems we've consigned to the too-hard basket
I’m reminded of the old joke about the drunk looking for his keys under a streetlight. When asked if that's where he dropped them, he says "No, but the light is better here."
We focus on social media moderation because it seems like something we can control and fix. But maybe we're looking in the wrong place. The "toxicity" and "misinformation" we see online isn't primarily a moderation problem - it's a reflection of underlying social and cultural tensions that we haven't resolved in the real world.
Expecting content moderators to solve problems we can't solve in our schools, town halls, and public squares is a form of magical thinking. It's easier than addressing the root causes, but it's ultimately fucking futile.