Guidelines Revealed On How Facebook Approaches Handling Graphic Content

Facebook, one of the biggest and most widely used social media platforms of this generation, has had its guidelines for moderating graphic content revealed.  With so many users and so much content circulating every day, it is vital for such a company to have rules on what can and cannot be posted on the platform.

Andrew Liptak of The Verge writes that The Guardian “published a series of reports about Facebook’s internal rules on moderating graphic content, providing new insight into how the company determines what its users can post… The Guardian’s series, Facebook Files, reveals some of the site’s internal manuals concerning credible threats of violence, non-sexual child abuse, graphic violence, and cruelty to animals.”

Although such guidelines may seem simple, the path is hard to navigate for a platform that intends to promote free speech without broadcasting real-world harm at the same time.  Facebook uses automated systems to remove content such as child sexual abuse material or terrorist propaganda, but topics that fall into a “gray area” are left to teams of moderators to evaluate.  For instance, casual, unserious threats may be acceptable on the site, while more specific threats must be taken down: “some statements are acceptable to keep on the site (‘I’m going to kill you John!’) and [others] should be removed (‘I’m going to kill you John, I have the perfect knife to do it!’)”

Liptak adds, “The guidelines ask moderators to determine the difference between someone blowing off steam, and a serious threat, pointing to instances where posts detailing specific threats, timing, and methods are given priority over more general ones.  The site also outlines specific groups of vulnerable individuals (such as heads of state or specific police officers) and groups (homeless or Zionists), which [get] automatically deleted or escalated.”

Many other significant gray areas exist as well.  “Photos and videos documenting animal abuse” are permitted in order to raise awareness about such issues, and content related to self-harm is also permitted because Facebook “doesn’t want to censor or punish people in distress who are attempting suicide.”

With so much questionable material present on the web, Facebook urges its users to report anything suspicious.  With the “report” button available on all posts, users have the freedom to use their judgment to decide which material can stay and which should go.  As users of social media, it is ultimately up to us to take action to protect others from harmful material and posts.

Featured Image via Pixabay
