The papers behind the biggest scandal to hit the company to date are based on interviews with five former employees and on internal documents. They are part of a series of disclosures made to the U.S. Securities and Exchange Commission (SEC) and the U.S. Congress by Frances Haugen, a former Facebook product manager who left the company in May.
Ashraf Zeitoon, Facebook’s former head of policy for the Middle East and North Africa, who left in 2017, said the company’s approach to global growth was “colonial,” focused on monetization without adequate safety measures.
For more than a decade, Facebook has pushed to become the world’s dominant online platform. It currently operates in more than 190 countries and has more than 2.8 billion monthly users who publish content in more than 160 languages. However, its efforts to prevent its products from becoming vehicles for hate speech, inflammatory rhetoric, disinformation and even incitement to violence did not keep pace with its global expansion.
Shortcomings
Internal documents from the firm – revealed by The New York Times, The Washington Post, Wired, NBC News, ABC and Reuters, among others – show that Facebook knew it had not hired enough workers with the language skills and local knowledge needed to identify objectionable posts from users in various developing countries.
The documents also revealed the lack of detection algorithms for the languages used in countries that Facebook has deemed to be at higher risk of violence.
Facebook spokeswoman Mavis Jones said in a statement that the company has native specialists around the world reviewing content in more than 70 languages, as well as experts on humanitarian and human rights issues. She said these teams work to stop abuse on the platform in places where there is a greater risk of conflict and violence.
Still, the documents offer detailed examples of how employees have raised alarms in recent years about problems with the company’s tools – both human and technological – designed to root out or block speech that violates its own standards.
Violence
As employees warned in the now-released documents, these shortcomings limited the company’s ability to deliver on its promise to block hate speech in countries such as Afghanistan, Yemen, Myanmar and Ethiopia, among many others.
Among those countries, the case of India and religious hatred stands out. Research carried out within the social network highlighted how widespread anti-Islamic material is on the platform and how content inciting “hatred and violence” was disseminated, particularly in February 2020, coinciding with the unrest in New Delhi in which 53 people died.
The findings also include those reported by The Washington Post, citing sources, that refer directly to Zuckerberg.
In late 2020, Facebook researchers concluded that efforts to curb hate speech in the Arab world were not working, a report from Politico indicated.
Ads targeting women and the LGBTQ community were rarely flagged for removal in the Middle East.
In Iraq, where violent clashes between Sunni and Shiite militias were rapidly worsening an already politically fragile country, so-called “cyber armies” fought one another by posting profane and prohibited material, including child nudity, on each other’s Facebook pages in an effort to get their rivals removed from the global platform, Politico detailed.
In 2018, UN experts investigating a brutal campaign of killings and expulsions against Myanmar’s Muslim Rohingya minority said that Facebook was widely used to spread hate speech against them.