Facebook employees have warned for years that, as the company raced to become a global service, it was failing to police abusive content in countries where such speech was likely to cause the most harm, according to interviews with five former employees and internal company documents reviewed by Reuters.
For more than a decade, Facebook has pushed to become the world’s dominant online platform. It currently operates in more than 190 countries, counts more than 2.8 billion monthly users, and hosts content in more than 160 languages. But its efforts to keep its products from becoming conduits for hate speech, inflammatory rhetoric, and misinformation, some of it blamed for inciting violence, have not kept pace with its global expansion. Internal company documents reviewed by Reuters show that Facebook knows it does not have enough staff with both the language skills and the knowledge of local events needed to identify objectionable posts from users in a number of developing countries.
The documents also show that the artificial intelligence systems Facebook uses to root out such content are often inadequate, and that the company has not made it easy for its global users to flag posts that violate the site’s rules. Employees warned in the documents that these shortcomings could limit the company’s ability to make good on its promise to block hate speech and other unlawful posts in places from Afghanistan to Yemen. In a review posted to Facebook’s internal message board last year about how the company identifies abuse on its site, an employee reported “significant gaps” in certain countries at risk of real-world violence, particularly Myanmar and Ethiopia.
News Source: Gadgets 360