Monday 22 May 2017

Secret Documents: Facebook Appears to Put Features, Profit Above Users’ Safety

Facebook, the 13-year-old behemoth with 1.23 billion active users, generates over $8 billion per quarter in revenue, roughly $3 billion of which is net income (i.e., profit).

But with so many users, Facebook appears to have treated its users’ safety as a secondary concern. Until earlier this year, it employed only 4,500 people to review content. That sounds like a lot of people until you realize that those 1.23 billion active users share billions of pieces of content every day, and tens of millions of those pieces are reported every day.

Does Facebook have a serious user safety problem on its hands? A just-published Guardian review of secret, internal documents suggests its problem is out of control.

Facebook acknowledged it has some sort of problem when, earlier this year, it agreed to nearly double its review staff, to 7,500, amid allegations that the company simply doesn’t do enough (or perhaps doesn’t care enough) when potentially harmful content is trafficked across its platform. You don’t double your moderation staff if everything is hunky-dory.

Now, The Guardian (UK) has published astonishing details of Facebook’s content moderation guidelines, culled from more than 100 secret, internal training manuals and other documentation. It’s a little disturbing that all of this important material isn’t in a single manual given to Facebook’s moderators when they’re hired. Instead, the company appears to take a piecemeal approach to policy, resulting in conflicting information, uneven moderation, and very little reliability in how its moderation policies actually work in the real world.

“Facebook cannot keep control of its content,” said one source. “It has grown too big, too quickly.”

Many moderators are said to have concerns about the inconsistency and peculiar nature of some of the policies. Those on sexual content, for example, are said to be the most complex and confusing.

As just one tiny example of how much content Facebook moderates, the article notes that “One document says Facebook reviews more than 6.5 million reports a week relating to potentially fake accounts – known as FNRP (fake, not real person).” And that’s just fake accounts. Imagine how many millions more reports concern actual content.

Features Before Thought

Before any thought is given to how people might use (and abuse) a new feature, Facebook appears to favor rolling it out and figuring out how to moderate it as an afterthought. Look at Facebook Live, a streaming service that lets people broadcast video of whatever is happening in their lives in the moment. What did Facebook think people would eventually use it for?

As The Guardian reports, “At the same time Facebook’s live streaming feature has been used to broadcast murders and suicides, with the company apparently unable to preemptively shut off streams.” Duh! Most companies would probably think, before launching something like this, “Hey, we need a way to immediately report such problematic videos and have them interrupted.” Apparently, not Facebook.

This shows a consistent lack of thinking through the problems and proactively addressing them before they arise. Or erring on the side of “Hey, let’s roll with this and see what kind of outcry it gets before we do anything about it.” (All in the name of “free speech,” of course, ignoring the fact that Facebook is a global platform.)1

Where Did I Put That Moderation Guideline Again?

Keeping your community safe starts with hiring enough moderators to make the job manageable rather than overwhelming. Clearly, Facebook has failed to do this while enjoying record user growth, revenues, and profits. It also means empowering community moderation leadership and ensuring that everything moderators need to do their jobs is in one place. Only now is Facebook apparently thinking more deeply about the need for better tools, according to its spokesperson: “In addition to investing in more people, we’re also building better tools to keep our community safe,” she told The Guardian. “We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.”

And maybe put them all in a single manual? So moderators know where to look and have every guideline be consistent?

Self-harm videos are perfectly fine in Facebook’s world, because the person is “in distress.” So is animal cruelty (maybe because the animal is in distress?). “Revenge porn” is also okay, if the moderator can’t confirm a lack of consent by both parties (which, I imagine, could be pretty hard to do in a timely manner).

“Some photos of non-sexual physical abuse and bullying of children do not have to be deleted or ‘actioned’ unless there is a sadistic or celebratory element.” As long as you keep sadism and celebration out of your violent imagery and videos, Facebook will apparently allow them.

According to the documents, “Videos of violent deaths are disturbing but can help create awareness. For videos, we think minors need protection and adults need a choice. We mark as ‘disturbing’ videos of the violent deaths of humans.” It’s not clear where suicidal content — such as live videos or threats — falls into Facebook’s moderation policies, but it appears to be allowed.

Facebook Needs to Prioritize User Safety

Some have called for Facebook to be more heavily regulated, since it is the gatekeeper of all of this content (much like a television broadcaster is the gatekeeper of the content on its network):

A report by British MPs published on 1 May said “the biggest and richest social media companies are shamefully far from taking sufficient action to tackle illegal or dangerous content, to implement proper community standards or to keep their users safe”.

Erring on the side of “free speech” may seem like a good idea, especially for an American company. But social networks are primarily online communities; they are not much like newspapers. Facebook’s primary purpose is to connect people with one another. It follows that its guidelines should reflect the diversity of that community and protect its safety, rather than serve some sort of news-publishing mantra or empty ideology.

We’d like to see Facebook be more transparent about how it moderates millions of people and pieces of content every day. Shedding greater light on these issues helps point out problems and gives Facebook the opportunity to improve its community’s safety. And we’d like to see Facebook plow a lot more of its income into ensuring it has sufficient staff to keep its platform safe. For all users.

 

For further information

Read the Guardian article: Revealed: Facebook’s internal rulebook on sex, terrorism and violence

Footnotes:

  1. Oh, and note that if you have 100,000 or more followers on Facebook, in the company’s eyes that makes you a “public figure” and therefore not subject to the privacy and other protections afforded to ordinary people.


from World of Psychology https://psychcentral.com/blog/archives/2017/05/22/secret-documents-facebook-appears-to-put-features-profit-above-users-safety/
