Facebook apologises for mistakes in removing hate speech

Facebook has apologised after an investigation exposed inconsistencies in how the social network's moderators, known as content reviewers, handle offensive posts reported by its users. The investigation was reported by ProPublica this week.
The posts were submitted to ProPublica as part of a crowd-sourced investigation into how Facebook enforces its hate-speech rules. ProPublica asked Facebook to explain its decisions on a sample of 49 items. The people who submitted these items maintained that Facebook's reviewers had erred, mostly by failing to remove hate speech and, in some cases, by deleting legitimate expression.
Facebook admitted that its reviewers had made mistakes in 22 cases but defended its rulings in 19 instances. In six cases, it said the users had not flagged the content correctly or the author had deleted it. In the remaining two cases, Facebook said it did not have enough information to respond.
"We're sorry for the mistakes we have made... They do not reflect the community we want to help build," Facebook Vice President Justin Osofsky was quoted as saying by ProPublica. "We must do better," he added.

According to Osofsky, Facebook will double the size of its safety and security team, which includes content reviewers and other employees, to 20,000 people in 2018 in an effort to enforce its rules better, the report said on Thursday.