Facebook will allow users to appeal decisions to remove their content from the social network, the company has announced.
Beginning later this year, Facebook said it will enable users to lodge an appeal if a photo, video or post is removed having been judged to violate the site’s community standards.
The social network said it would initially allow appeals for content that was removed for nudity or sexual activity, hate speech or graphic violence.
Facebook said a new option would appear on removal notifications enabling users to request a review, which would be carried out within 24 hours – by a person, not an algorithm – and the content would be restored if a mistake was found to have been made.
“We believe giving people a voice in the process is another essential component of building a fair system,” said Monika Bickert, Facebook’s vice-president of global policy management.
Earlier this year, the platform removed the official pages of far-right group Britain First, as well as those of its leaders Paul Golding and Jayda Fransen, for breaching the site’s rules on hate speech.
Alongside the appeals announcement, the company also published its internal guidelines for enforcing its rules.
They include guidance on when nudity is acceptable on the platform – such as in paintings or images of breastfeeding – as well as guidance on when threats of violence can be deemed credible.
“We decided to publish these internal guidelines for two reasons. First, the guidelines will help people understand where we draw the line on nuanced issues,” Ms Bickert said.
“Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines – and the decisions we make – over time.”
The firm’s chief technology officer, Mike Schroepfer, is due to face questions from MPs on the Digital, Culture, Media and Sport select committee on Thursday over the site’s business and policy practices, as it continues to face scrutiny in the wake of the Cambridge Analytica scandal.