Facebook brought in emergency measures to curb misinformation towards the end of the 2019 Lok Sabha elections. These measures, known internally as ‘break-the-glass’ steps in a reference to fire alarms, were introduced partly in response to a rise in user reports.
Internal documents describe how, ahead of the last round of polling, the company saw what it calls an “escalation” from Bengal: videos that spoke of an “alleged Hindu exodus from certain areas under threat from Muslims.”
“A few days before round six of polling, an out-of-context West Bengal road accident video exhibited signs of virality. Captions in the violating posts depicted Bangladeshi and Rohingya migrants in West Bengal as ‘terrorists’ and ‘intruders’ and claimed they were attacking Central Forces,” Facebook employees wrote in an internal report.
The viral video was specifically “being taken out of context in order to target vulnerable populations”, the report said.
These insights and more come from documents that are part of disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by the legal counsel of Facebook whistle-blower Frances Haugen. The documents constitute an array of internal research reports and internal corporate communications that offer a first-ever look at how Facebook and WhatsApp serve as the canvas on which deep-rooted problems of conflict play out in a country.
Over the last few years, Facebook has developed specific measures to curb misinformation during emergency situations. These steps form a key part of its “election playbook” and are based on the company’s experiences from multiple countries, including the United States and Brazil.
These ‘break-the-glass’ measures have been used in conflict-stricken countries to stop bloodshed, and have also been deployed in the US.
The ‘India Elections’ case study report noted that for the 2019 polls, the company deployed two specific measures. First, Facebook down-ranked “all civic posts in India with a re-share depth of >=2”. Second, it reduced the “thresholds” for engagement classifiers in Hindi, English, Tamil and Bengali.
The first measure essentially involves ‘demoting’ certain types of content that have been shared heavily across the platform. Practically speaking, this curbs the spread of such content by ensuring it appears less often in users’ feeds.
The second step lowers the bar for the company’s classifiers, increasing the likelihood that they act on content in those languages.
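To illustrate how these two measures work in principle, here is a minimal sketch in Python. The field names, demotion factor and threshold values are assumptions made purely for the example; the internal documents do not disclose Facebook’s actual ranking or classifier code.

```python
# Purely illustrative sketch of the two 'break-the-glass' measures described above.
# All names, scores and thresholds here are hypothetical assumptions, not Facebook's systems.

DEFAULT_THRESHOLDS = {"hi": 0.90, "en": 0.90, "ta": 0.90, "bn": 0.90}   # hypothetical values
LOWERED_THRESHOLDS = {"hi": 0.75, "en": 0.75, "ta": 0.75, "bn": 0.75}   # hypothetical values

def rank_score(post, base_score, break_glass=False):
    """Measure 1: down-rank civic posts that arrive via long re-share chains."""
    if break_glass and post["is_civic"] and post["reshare_depth"] >= 2:
        return base_score * 0.5  # demotion factor is an assumption
    return base_score

def classifier_acts(post, classifier_score, break_glass=False):
    """Measure 2: a lowered threshold makes the classifier act on more borderline content."""
    thresholds = LOWERED_THRESHOLDS if break_glass else DEFAULT_THRESHOLDS
    return classifier_score >= thresholds.get(post["language"], 0.90)

post = {"is_civic": True, "reshare_depth": 3, "language": "bn"}
print(rank_score(post, 1.0, break_glass=True))        # 0.5 -> shown lower / less often in feeds
print(classifier_acts(post, 0.80, break_glass=True))  # True only under the lowered threshold
```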
The same internal report says Facebook demoted “all content” from 583 top “civic misinforming” groups in India, an action it estimated would reduce “3% of all known misinformation” in the country.
One section of Facebook’s case study on the 2019 Indian elections also makes clear that the company is aware of how political parties may use proxies to get around the limits placed on advertisement spending. The study cites investigations by Huffington Post and Quartz into an organization called the ‘Association of Billion Minds’ (ABM) and its links to pages that appear to be run by ‘normal’ fans of the Prime Minister but are in fact run by ABM. While the BJP has officially distanced itself from ABM, Huffington Post India described it in 2019 as “Amit Shah’s personal election unit”.