Facebook’s content decisions are increasingly its brand. In the Oversight Board it has a backstop for the kind of decisions the company cannot scale and smoothly enact across the world: which politicians’ accounts to take down and which to leave up; how to deal with certain persistent harassers who are ruled not to breach policy; whether to take down offensive material in one country but leave it up in another. And, most importantly, it is a mechanism through which to respond to the kind of public pressure that drains management time, provokes congressional hearings, and upsets employees. To date, Facebook’s attempts to produce a “scalable” content moderation strategy for global speech have been a miserable failure, as they were always doomed to be, because speech is culturally sensitive and context specific.

After a decade of denying Facebook was responsible for, or even capable of, making content decisions beyond the broadest sweep of generalised rules, Mark Zuckerberg has shifted the company more into the territory of all historic media powers: making arbitrary decisions on hot topics in step with the prevailing cultural and political forces of the time. Ultimately, there is no other way that Facebook can operate, but accepting the position means abandoning some core beliefs.

Facebook is not a news company – it employs no reporters – but it is a news-driven company. Two years ago, I asked a Facebook executive who was actually responsible for coming in every morning and worrying about the global news cycle, the election pressures, the trending stories, the regional sensitivities. I got a long answer that essentially boiled down to: parts of many departments, led by policy. For Facebook to maintain any pre-emptive alarm for sensitive situations was unusual, though, until the pandemic and the US election changed attitudes.
