The son of an Ethiopian chemistry teacher who was killed during unrest in the country last year has filed a lawsuit against Meta, the parent company of Facebook, alleging that the social media platform fuels viral hate and violence, harming people in eastern and southern Africa.
Abrham Meareg Amare claims in the lawsuit that his father, Meareg Amare, a 60-year-old Tigrayan academic, was shot dead outside his home in Bahir Dar, the capital of Ethiopia’s Amhara region, in November 2021, after a series of hateful messages were posted on Facebook defaming the professor and calling for his killing.
The case is a constitutional petition filed with the Kenyan High Court, which has jurisdiction over the issue, as Facebook’s content moderation operations center for much of eastern and southern Africa is located in Nairobi.
It accuses Facebook’s algorithm of prioritizing dangerous, hateful and inflammatory content in order to drive engagement and advertising revenue.
“They have suffered human rights violations as a result of Respondent’s failure to remove Facebook posts that violated the Bill of Rights, even after filing reports with Respondent,” the complaint read.
The legal filing alleges that Facebook failed to adequately invest in content moderation in countries in Africa, Latin America and the Middle East, particularly from its hub in Nairobi.
It also claims that Meta’s failure to address these basic safety issues has fanned the flames of Ethiopia’s civil war.
In a statement to CNN, Meta did not directly respond to the lawsuit:
“We have strict rules outlining what is and is not allowed on Facebook and Instagram. Hate speech and incitement to violence are against these rules and we invest heavily in teams and technology to help us find and remove this content. Our safety and integrity work in Ethiopia is guided by feedback from local civil society organizations and international institutions.”
Meareg said his father was followed home from Bahir Dar University, where he had worked for four years running one of the country’s largest laboratories, and was shot twice at point-blank range by a group of men.
He said the men chanted “junta”, echoing a false claim circulating about his father on Facebook that he had been a member of the Tigray People’s Liberation Front (TPLF), which has been at war with the Ethiopian federal government for two years.
Meareg said he had desperately tried to get Facebook to remove some of the posts, which included a photo of his father and his home address, but says he did not receive a response until after his father was killed.
An investigation into the murder by the Ethiopian Human Rights Commission, included in the filing and seen by CNN, confirmed that Meareg Amare was killed at his residence by armed assailants, but their identities remain unknown.
“If Facebook had stopped the spread of hate and moderated the posts properly, my father would still be alive,” Meareg said in a statement, adding that one of the posts calling for his father’s death was still on the platform.
“I am taking Facebook to court so that no one ever suffers as my family has again. I seek justice for millions of my fellow Africans hurt by Facebook’s profiteering, and an apology for my father’s murder.”
Meareg is bringing the lawsuit together with Fisseha Tekle, a legal adviser and former Amnesty International researcher on Ethiopia, and the Kenyan human rights group the Katiba Institute.
The plaintiffs are asking the court to order Meta to reduce violent content, increase content moderation staff in Nairobi and create a restitution fund of around $1.6 billion for victims of hate and violence incited on Facebook.
Ethiopia is an ethnically and religiously diverse nation of approximately 110 million people who speak dozens of languages. Its two largest ethnic groups, Oromo and Amhara, make up more than 60% of the population. The Tigrayans, the third largest, are about 7%.
A Meta spokesperson said the company’s security policies and work in Ethiopia are guided by feedback from local civil society organizations and international institutions.
“We employ staff with local knowledge and experience, and continue to develop our capabilities to detect infringing content in the most widely spoken languages in the country, including Amharic, Oromo, Somali and Tigrinya,” the spokesperson said in a statement.
According to Meareg’s filing, Meta has only 25 employees moderating content in Ethiopia’s major languages. CNN was unable to independently confirm this number, and Facebook will not reveal exactly how many local-language speakers are reviewing content in Ethiopia that has been flagged as a possible violation of its standards.
The lawsuit comes after two years of a devastating conflict in Ethiopia, which has left thousands dead, displaced more than 2 million people and given rise to a wave of atrocities, including massacres, sexual violence and the use of hunger as a weapon of war. A United Nations report last year found that all parties to the conflict had, to varying degrees, “committed violations of international human rights, humanitarian and refugee law, some of which may amount to war crimes and crimes against humanity”.
The Ethiopian government and the TPLF leadership agreed to cease hostilities in November, pledging to disarm fighters, provide unimpeded humanitarian access to Tigray, and establish a framework for justice. But the surprise truce has left many questions unanswered, with few details on how it will be implemented and monitored.
It’s not the first time Meta has come under scrutiny for its handling of user safety on its platforms, particularly in countries where online hate speech is likely to spill over into offline harm. Last year, whistleblower Frances Haugen, a former Facebook employee, told the US Senate that the platform’s algorithm was “literally fanning ethnic violence” in Ethiopia.
Internal documents provided to Congress in redacted form by Haugen’s legal counsel, and seen by CNN, revealed that Facebook employees had repeatedly sounded the alarm about the company’s failure to curb the spread of posts inciting violence in “at-risk” countries like Ethiopia. The documents also indicated that the company’s moderation efforts were no match for the deluge of inflammatory content on its platform and that, in many cases, it failed to adequately expand staffing or add local-language resources to protect people in these places.
Last year, Meta’s independent Oversight Board recommended that the company commission a human rights due diligence assessment of how Facebook and Instagram have been used to spread hate speech and misinformation, increasing the risk of violence in Ethiopia.
Rosa Curling, director of Foxglove, a UK-registered legal non-profit supporting the case, compared the role Facebook has played in fanning the flames of the Ethiopian conflict to that of radio in inciting genocide in Rwanda.
“The consequences of the information on Facebook are so tragic and horrifying,” Curling said. “(Facebook) are not taking any action themselves. They are aware of the problem. They are choosing to prioritize their own benefit over the lives of Ethiopians and we hope this case will prevent that from continuing.”
Facebook has also been accused of allowing posts to stoke violence in other conflicts, notably in Myanmar, where the UN said the social media giant had promoted violence and hatred against the minority Rohingya population. A lawsuit seeking to hold Meta accountable for its role in the Myanmar crisis was filed in a California court last year by a group of Rohingya refugees seeking $150 billion in compensation.
The social media company acknowledged it did not do enough to prevent its platform from being used to fuel bloodshed, and Chief Executive Mark Zuckerberg penned an open letter apologizing to activists and vowing to step up its moderation efforts.