Famed US film critic Roger Ebert had his Facebook page temporarily taken down earlier this week in a perplexing case of corporate censorship all the more bizarre because it began on Twitter.
Ebert, well known for his fiery opinions and blunt delivery, tweeted on Tuesday about the sudden death of Ryan Dunn, the star of the popular “Jackass” film series. Dunn died, along with another passenger, in a single-vehicle car wreck. News reports speculated that he had been both speeding and, beforehand, drinking (although police had not cited alcohol as a cause of the accident).
Tweeted Ebert (who is an admitted recovering alcoholic): “Friends don’t let jackasses drink and drive.”
The remark prompted cyber outrage from Dunn’s friends and fans, who took offense and eventually carried that outrage onto Ebert’s Facebook page. Not long after, the page was taken down, and Ebert responded:
“Facebook!” he tweeted next. “My page is harmless and an asset to you. Why did you remove it in response to anonymous jerks? Makes you look bad.”
Facebook eventually restored the page, insisting the whole episode was a mistake. Many tech bloggers, though, didn’t buy the explanation, and the incident has prompted renewed concerns over how Facebook polices “inappropriate” content.
Ebert’s page appears to have been flagged by his detractors – some of whom likely came over from Twitter – as inappropriate. But how do those flags figure into Facebook’s decisions about when to remove content? Asked Jillian York: “What triggered that error? Utter incompetence or automated systems?”
In other words, if enough of us have it out for you — over something you said on another platform, or something you said offline, or nothing you said in particular — could we coordinate a campaign to flag your page until Facebook takes it down?
As tech blog GigaOm pointed out, “Net Delusion” author Evgeny Morozov has suggested that authoritarian regimes may be using a similar tactic, flagging dissident web content as pornography to keep it off the Internet.
In this case, The Atlantic noted, Facebook’s “error” wound up shutting down a legitimate debate over the tension between condemning drunken driving and respecting the feelings of a victim’s friends and family.
Facebook is, of course, not really a public space with the free speech protections we’d expect in one (and for that reason, some observers have suggested “censorship” may not be quite the right label for what’s going on here). But the site will face major problems going forward if it turns out coordinated groups of users can get one another’s content taken down.