Facebook has different standards for misinformation in India

US social media giant Facebook has invested proportionately far less in preventing hate speech and misinformation on its platform for users in India and other developing and non-English-speaking nations than it has in the United States, according to US media reports based on internal documents made public on Saturday.

The United States accounts for less than 10% of Facebook’s daily users but corners 84% of the company’s global budget for this purpose, The Washington Post reported based on a 2020 company document. Only 16% went to the “Rest of World,” which included India, France and Italy.

One Facebook document viewed and cited by The Washington Post showed that the company had not developed algorithms for Hindi and Bengali, the world’s fifth and seventh most widely used languages. The report cited a company spokesperson as saying hate-speech classifiers for Hindi and Bengali were introduced in 2018 and 2019, and systems for detecting violence and incitement in the two languages were added as recently as 2021.

Facebook was acutely aware throughout of its loose systems for monitoring and taking down hate speech and misinformation in India.

The dummy test account was called an “integrity nightmare” by Facebook in an internal document.

Documents cited by The Washington Post in this report included copies of internal papers, memos and reports that whistleblower Frances Haugen provided to the US stock market regulator, the Securities and Exchange Commission, with redacted versions given to the US Congress. Some of these papers formed the basis for earlier reports by The Wall Street Journal that the social media giant prioritised profit over public safety. Those reports led to one of the most damaging Congressional hearings yet for Facebook.

In India, Lever said, the “hypothetical test account inspired deeper, more rigorous analysis of our recommendation systems”.
