Facebook Parent Meta to “Assess Feasibility” of Human Rights Review of Its Ethiopia Practices

Facebook owner Meta Platforms said on Thursday it would assess the feasibility of its oversight board’s recommendation that it commission a human rights review of its practices in Ethiopia, where its Facebook and Instagram services are used.

The oversight board, which the company established to address criticism of its handling of problematic material, makes binding decisions on a small number of challenging content moderation cases and offers non-binding policy recommendations.

Meta has faced scrutiny from lawmakers and regulators over user safety and its handling of abuses on its platforms around the world, particularly after whistleblower Frances Haugen leaked internal documents, including on harms in countries such as Ethiopia.

Thousands have been killed and millions displaced during a year-long conflict between the Ethiopian government and rebel forces from the northern Tigray region.

As part of its response to the board’s December recommendations on a case involving content posted in the country, the social media giant said it had “invested significant resources in Ethiopia to identify and remove potentially harmful content.”

The oversight board last month upheld Meta’s original decision to remove a post alleging the involvement of ethnic Tigrayan civilians in atrocities in Ethiopia’s Amhara region. Because Meta had restored the post after the user appealed to the board, the ruling required the company to remove the content again.

Meta said Thursday it had removed the post, but it disagreed with the board’s reasoning that the content should have been taken down because it was an “unverified rumor” that significantly increased the risk of imminent violence. The company said such a standard would impose “a journalistic publishing standard on people.”

A spokesperson for the oversight board said in a statement: “Meta’s existing policies prohibit rumors that contribute to imminent violence that cannot be debunked in a meaningful timeframe, and the board made recommendations to ensure these policies are effectively applied in conflict situations.”

“Rumors alleging that an ethnic group is complicit in atrocities, as found in this case, have the potential to lead to grave harm to people,” the spokesperson said.

The board had recommended that Meta commission a human rights due diligence assessment, to be completed within six months, which should include a review of Meta’s language capabilities in Ethiopia and of measures taken to prevent the misuse of its services in the country.

However, the company said not all elements of this recommendation “may be feasible in terms of timing, data science, or approach.” It said it would continue its existing human rights due diligence and should have an update on whether it could act on the board’s recommendation within the next few months.

Previous Reuters reporting on Myanmar and other countries has investigated how Facebook has struggled to monitor content across the world in different languages. In 2018, UN human rights investigators said the use of Facebook had played a key role in spreading hate speech that fueled violence in Myanmar.

Meta, which has said it was too slow to act to prevent misinformation and hate in Myanmar, said it now has native speakers worldwide reviewing content in more than 70 languages, working to stop abuse on its platforms in places where there is a heightened risk of conflict and violence.

The board also recommended that Meta rewrite its value statement on safety to reflect that online speech can pose a risk to people’s physical security and right to life. The company said it would make changes to this value, partially implementing the recommendation.

© Thomson Reuters 2022

