From “constantly surfacing polarizing nationalist content” to “fake or inauthentic” information, from “misinformation” to content “vilifying” minority groups, Facebook employees internally flagged several red flags for its business in India between 2018 and 2020. Yet despite these clear warnings from staff mandated to perform oversight functions, an internal review meeting with Facebook Vice President Chris Cox in 2019 found “comparatively low prevalence of problematic content (hate speech, etc.)” on the platform. Two of the reports, flagging hate speech and “problem content,” were submitted in January and February 2019, months before the Lok Sabha election. A third report, as late as August 2020, admitted that the platform’s AI (artificial intelligence) tools were unable to “identify vernacular languages” and had therefore failed to identify hate speech or problematic content. Yet the minutes of the meeting with Cox concluded: “Survey tells us that people generally feel safe.
Experts tell us that the country is relatively stable.” These glaring gaps in response were revealed in documents that form part of the disclosures made to the U.S. Securities and Exchange Commission (SEC) and were provided to the U.S. Congress in redacted form by the legal counsel of former Facebook employee and whistleblower Frances Haugen.
The redacted versions received by the U.S. Congress have been reviewed by a consortium of global news organizations including The Indian Express. Facebook did not respond to The Indian Express’s queries about the Cox meeting and these internal memos.
The review meeting with Cox was held a month before the Election Commission of India announced the seven-phase schedule for the Lok Sabha election, which began on April 11, 2019.
News Source : The Indian Express