- Facebook has revealed the previously secret guidelines it uses to decide what posts are allowed on the site.
- It is also introducing an appeals process for when people think content has been removed incorrectly.
- Facebook hopes to receive feedback to improve the guidelines.
Facebook has published the internal guidelines it uses to decide what is allowed to be posted on the site. According to an official blog post, the guidelines have been published to help people understand where the company stands on nuanced issues.
Facebook says that the standards are an evolving document and that it hopes to receive feedback to improve them. However, what will not change are the principles of “safety, voice, and equality” that the guidelines are based on.
Facebook is also going to launch an appeals process for posts that have been removed. Starting with those that were taken down due to nudity, hate speech, or graphic violence, users will be given the option to request an additional review. This review will be made by a human, not an AI system, and should take place within 24 hours. If Facebook agrees that a mistake has been made, the post will be restored.
Facebook has long been criticized for a seeming lack of consistency and transparency in what it allows on the site. It would, therefore, seem like a good thing that it is giving its users a look at the thought process that goes into banning certain types of content.
However, the guidelines are unlikely to keep everyone happy. Many of the rules seem open to interpretation. For example, in the Credible Violence section, instructions on how to make or use explosives are banned, “unless there is a clear context that the content is for a non-violent purpose.”
Nudity is also banned, but in cases where it is shared as a form of protest, to raise awareness about a cause, or for educational or medical reasons, it may be allowed. Additionally, Facebook says that photographs of paintings, sculptures, and other art that depicts nude figures are also allowed.
Facebook isn’t the only web giant making changes to the way it moderates content. A recent report suggested that YouTube removed 8.3 million videos it deemed inappropriate in the last three months.
April 25, 2018 at 01:22AM