The Oversight Board, a group Facebook created to review its policy decisions, weighed in on a case of misinformation in war-torn Ethiopia Tuesday, warning the company about the risks of allowing hate speech and unverified information to spread freely in conflict zones.
The Oversight Board reviewed a post in Amharic from an Ethiopia-based Facebook user that claimed the Tigray People’s Liberation Front (TPLF) was responsible for murder, rape and looting in Raya Kobo and other population centers in the country’s Amhara region, aided by Tigrayan civilians.
“While the user claims his sources are previous unnamed reports and people on-the-ground, he does not even provide circumstantial evidence to support his allegations,” the Oversight Board wrote in its evaluation.
“Rumors alleging that an ethnic group is complicit in mass atrocities, as found in this post, are dangerous and significantly increase the risk of imminent violence.”
The post was initially detected by Facebook’s automated content moderation tools and removed when the platform’s Amharic language content review team determined it violated the platform’s rules against hate speech. Facebook reversed its own decision and reinstated the content after the case was escalated to the Oversight Board.
The Oversight Board overturned Facebook’s decision to reinstate the post, citing a violation of Facebook’s rules against violence and incitement rather than the hate speech rules the platform had cited previously. In its decision, the group expressed concern that the spread of unverifiable rumors in violence-stricken areas like Ethiopia could “lead to grave atrocities, as was the case in Myanmar.”
This month, a group of Rohingya refugees in the U.S. filed a $150 billion class-action suit against Meta, alleging that Facebook’s entrance into the country served as a “key inflection point” in the genocide of the Rohingya people. Misinformation stoking ethnic violence in Myanmar spread broadly on Facebook, often sown by military personnel, and is widely believed to have escalated ethnic violence that targeted and displaced the country’s Muslim minority.
Facebook whistleblower Frances Haugen has cited algorithmically amplified ethnic violence in countries like Myanmar and Ethiopia — and Meta’s failure to adequately address it — as one of the platform’s biggest dangers. “What we saw in Myanmar and are now seeing in Ethiopia are only the opening chapters of a story so terrifying, no one wants to read the end of it,” Haugen told Congress in October.
The Oversight Board also instructed Meta to commission an independent human rights assessment of Facebook and Instagram’s role in exacerbating the risk of ethnic violence in Ethiopia, and to evaluate how well the company can moderate content in the country’s languages.
Last month, Meta defended the safety precautions it had taken in the country, highlighting expanded applications of some of its rules against misinformation and hate speech. The company also noted that it had improved its enforcement abilities there over the past two years and now has the capability to review content in Amharic, Oromo, Somali, and Tigrinya, the country’s four most common languages.