Facebook has released a rule book for the types of posts it allows on its social network.
On Tuesday the company announced plans to publish the internal guidelines, along with a global appeals process for removed content to be rolled out over the next few months. The appeals process will let users challenge decisions about which posts go too far in terms of hateful or threatening speech.
"This is part of an effort to be more clear about where we draw the line on content," Facebook public policy manager in charge of content Siobhan Cummiskey told AFP. "And for the first time we’re giving you the right to appeal our decisions on individual posts so you can ask for a second opinion when you think we’ve made a mistake."
Monika Bickert, Facebook's head of product policy and counterterrorism, described the new policy as an evolving document shaped by feedback from stakeholders. Updates go out to content reviewers every week, she said, and the company hopes the published standards will give people clarity when posts or videos they report aren't taken down. The challenge, she pointed out, is to have a single document guide enormously different "community standards" around the world: what passes as acceptable nudity in Norway may not pass in Uganda or the U.S., she said.
The world’s largest social network has become a dominant source of information in many countries and uses a combination of human reviewers and artificial intelligence to erase content that violates its policies. According to Cummiskey, nearly 7,500 content reviewers belong to a 15,000-person team devoted to the safety and security of users and their communities, a team expected to grow to 20,000 people by the end of this year.
The newly released standards are a stark departure from the social platform's prior public guidance, which had been kept brief to express the company's values and priorities without overwhelming readers.