Facebook Apologizes For Inconsistent Hate Speech Enforcement
Facebook has apologized for its uneven enforcement of hate speech rules, which has allowed some offensive content to remain while similar material was removed. ProPublica called attention to the discrepancy after analyzing more than 900 Facebook posts as part of a crowdsourced probe. Of the posts it analyzed, the publication asked Facebook to review and explain the decisions it made on 49 of the items.
The social network defended 19 of those decisions, despite the objections of the people who submitted them, but conceded that its reviewers made mistakes in 22 instances. The remainder came down to other issues, such as the user deleting the content, the content being flagged incorrectly, or too little information being provided for a reviewer to respond.
ProPublica points to instances in which Facebook allowed hate speech to remain despite repeated attempts by users to get it removed. One example was a page called "Jewish Ritual Murder," which was taken down only after the publication questioned Facebook about it.
Critics have accused Facebook of double standards, saying it at times takes down legitimate content while allowing blatantly racist, violent, or otherwise unacceptable content to remain. In a statement to ProPublica about the matter, Facebook VP Justin Osofsky said: "We're sorry for the mistakes we have made — they do not reflect the community we want to help build. We must do better."
In an effort to fix the problem, Facebook plans to hire about 10,000 people, doubling the size of its safety and security team to 20,000 in 2018.
SOURCE: ProPublica