Kajal Agrawal

RSS, West Bengal and Duplicate Accounts: What the Facebook Whistleblower Complaint Touches Upon

05 October 2021

Frances Haugen cited internal company documents to provide a brief glimpse into how the company is grappling with issues that have dominated debates around social media in India over the past seven years.

New Delhi: Facebook has struggled to crack down on “anti-Muslim” narratives and other similar fear-mongering content put out by “Rashtriya Swayamsevak Sangh (RSS)” users, groups and pages, according to internal company documents cited by former employee and whistleblower Frances Haugen in her complaint filed with the US Securities and Exchange Commission (SEC).

In the complaint on how the social media platform “promotes global division and ethnic violence”, lawyers for Haugen cite internal company documents to claim that “political considerations” prevented Facebook from categorising or providing a designation to “this group” – an apparent reference to whether RSS-linked content required greater monitoring.

Haugen, a data scientist who worked at Facebook until May 2021, has thrown the social media company into its biggest reputational crisis since Cambridge Analytica and will testify before US senators in Washington on Tuesday.

Broad references to India are scattered across at least four of the eight complaints filed with the SEC. These complaints were made public by CBS News on Monday night, a day after the broadcaster aired its ’60 Minutes’ interview with Haugen on Sunday night.

The complaints were written by the non-profit legal organisation Whistleblower Aid, which has identified itself as representing Haugen.

 

Put together, they provide a brief glimpse of issues at the heart of the raging debate in India over how social media platforms like Facebook influence democratic discourse. The internal company documents cited by Haugen and her lawyers are linked by a common theme – namely, that Facebook officials are aware of the structural factors that drive the spread of hate speech and harmful political rhetoric on the platform.

An examination of the complaints also leads to a sobering conclusion: the corporate documents and studies cited offer only a tantalising glimpse of what is allegedly Facebook’s own assessment of perhaps the most hotly debated issue of the social media age.

Social media apps on a phone. Representative image. Photo: dole777/Unsplash

India is Tier-0, but…

In the complaint on Facebook’s international operations, Haugen’s lawyers cite internal company documents to reveal that India is designated as a ‘Tier-0’ country when it comes to the attention that is paid during crucial election cycles. Only two other countries are classified as such: Brazil and the US.

This is good news, because Tier-2 and Tier-3 countries apparently at one point received no investment in proactive monitoring or specific attention during their electoral periods.

On the other hand, this categorisation may not be as meaningful as it appears because a separate notation on “misinformation” implies that the US receives 87% of whatever resources are available, while the “rest of the world” receives just 13%.

This is remarkable because the US and Canada together make up only 10% of the company’s daily active users.
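To put those two figures side by side, here is a rough back-of-the-envelope calculation. Pairing the 87% misinformation budget with the US-and-Canada user share is an assumption, since the complaint quotes the two numbers from separate notations, and the Python below is illustrative only:

```python
# Rough arithmetic on the cited figures; pairing the 87% budget with the
# US/Canada user share is an assumption, not something the complaint states.
us_canada_resources, rest_resources = 0.87, 0.13
us_canada_users, rest_users = 0.10, 0.90  # shares of daily active users

per_user_us_canada = us_canada_resources / us_canada_users  # 8.7
per_user_rest = rest_resources / rest_users                 # ~0.14

print(round(per_user_us_canada / per_user_rest))  # roughly 60x more per user
```

On that reading, each North American user would attract roughly 60 times the misinformation resources of a user anywhere else.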

Photo: Whistleblower complaint to SEC.

Why is this important? Perhaps because the most eye-popping statistic to emerge from the complaints is that Facebook is apparently able to act on only 3-5% of hate speech and 0.2% of ‘violence and incitement’ content on the platform.

Hate speech and classifiers

In the same complaint, Haugen’s lawyers cite an undated internal company document called ‘Adversarial Harmful Networks – India Case Study’ to show that the company is well aware of the issue of anti-Muslim content put out by pro-RSS users, pages and groups.

“Anti-Muslim narratives targeted pro-Hindu populations with [violent and incendiary] intent… There were a number of dehumanizing posts comparing Muslims to ‘pigs’ and ‘dogs’ and misinformation claiming the Quran calls for men to rape their female family members,” the internal document reportedly says.

The document also lays blame on the company’s lack of technical ability to track this type of content in local Indian languages: “Our lack of Hindi and Bengali classifiers means much of this [anti-Muslim] content is never flagged or actioned.”

Photo: Whistleblower complaint to SEC, CBS News.

‘Classifiers’ is the term used internally by Facebook to refer to its hate-speech detection algorithms. The document cited by Haugen’s lawyers is undated in the SEC complaint, so it is not clear whether this is a continuing problem. In recent years, Facebook has publicly stated that its hate-speech algorithms cover four official Indian languages – Hindi, Bengali, Urdu and Tamil.
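To make the coverage gap concrete, here is a purely illustrative sketch in Python. The classifier registry, language codes and scores are invented for the example and are not Facebook's actual system; the point is simply that content in a language without a trained classifier is never scored at all:

```python
from typing import Optional

# Hypothetical per-language registry of hate-speech classifiers (illustrative only).
HYPOTHETICAL_CLASSIFIERS = {
    "en": lambda text: 0.9 if "dehumanising" in text.lower() else 0.1,
    # No "hi" (Hindi) or "bn" (Bengali) entry: such posts fall through unscored.
}

def score_post(text: str, language: str) -> Optional[float]:
    """Return a hate-speech score, or None if no classifier covers the language."""
    classifier = HYPOTHETICAL_CLASSIFIERS.get(language)
    if classifier is None:
        return None  # "never flagged or actioned", in the internal document's words
    return classifier(text)

for lang in ("en", "hi", "bn"):
    print(lang, score_post("example post", lang))
```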

Most importantly, given the inflammatory content put out by pro-RSS users, pages and groups, the complaint hints that Facebook has struggled to categorise or propose a designation for the group due to “political considerations”.

 

This issue has been underscored in past reporting, most notably the Wall Street Journal’s exposé on how Facebook internally debated whether classifying the Bajrang Dal as a “dangerous organisation” could risk sparking physical attacks against the company’s workers or facilities in India.

Bajrang Dal activists burn tyres and shout slogans. Photo: PTI/Files

Reccos, shares and misinformation

One recurring theme in the complaints is how social media allows “misinformation to gain unobstructed virality”.

How prevalent is this in India? One internal study cited in a complaint says that there are “1 to 1.5 million predicted misinfo VPVs [viewport views, or impressions] per hour in India, Indonesia and Philippines in peak hours”.

A key component of this is what Facebook refers to as ‘deep reshares’ – posts that are re-shared many times and are therefore more likely to contain sensational, harmful or divisive content.

The concept is simple, with Facebook’s own analysis reaffirming what most would agree to be common sense: one internal company document cited in a complaint on Facebook’s algorithm claims that the ‘re-share depth’ or ‘the number of shares in a chain’ is strongly correlated with misinformation.

Another study cited claims that “reshare depth” is a “particularly good signal” for targeting “link misinfo” in India and the Philippines.
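For readers unfamiliar with the metric, the sketch below shows one way reshare depth can be computed; the post records, parent_id field and threshold are hypothetical stand-ins, not Facebook's data model:

```python
# Hypothetical post records: parent_id points at the post each one was reshared from.
posts = {
    "p1": {"parent_id": None},  # original post
    "p2": {"parent_id": "p1"},  # reshare of p1
    "p3": {"parent_id": "p2"},  # reshare of a reshare, and so on
}

def reshare_depth(post_id: str) -> int:
    """Count how many reshares separate a post from the original in its chain."""
    depth, parent = 0, posts[post_id]["parent_id"]
    while parent is not None:
        depth += 1
        parent = posts[parent]["parent_id"]
    return depth

DEPTH_THRESHOLD = 2  # hypothetical cut-off for treating a post as a 'deep reshare'

for pid in posts:
    depth = reshare_depth(pid)
    note = "deep reshare: candidate for review" if depth >= DEPTH_THRESHOLD else ""
    print(pid, depth, note)
```

In this toy chain only ‘p3’ crosses the threshold; the studies cited in the complaint suggest such deeply reshared posts are disproportionately likely to carry misinformation.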

As an example, the complaint relays what appears to be the results of a sampling survey done by Facebook in West Bengal. Again, this document is undated, so it is not clear when it was carried out.

However, it notes that “40% of Top VPV civic posters in West Bengal were inauthentic/fake” and that the “highest VPV user to be assessed as inauthentic had more than 30M accrued” in the last 28 days.

“Coordinated authentic actors seed and spread civic content to propagate political narratives…the message comes to dominate the ecosystem with over 35% of members having been recommended a cell group by our algorithms,” the document notes.

Photo: Whistleblower complaint to SEC, CBS News.

BJP and duplicate accounts?

In the complaint on Facebook’s reach and advertising, Haugen’s lawyers tackle the problem of ‘single user, multiple accounts’, or SUMA. More commonly referred to as ‘duplicate’ accounts, the issue has been a touchy one for Facebook for years.

The principal debate is whether the company is doing enough to crack down on SUMAs and, if it did, whether such a step would drastically reduce engagement on the platform.

Haugen’s assessment, based on her time at the company, is that Facebook does little to regulate these duplicate accounts.
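As a rough illustration of what detecting SUMAs can involve, the sketch below clusters accounts that share a single signal; the accounts, the phone-number field and the one-signal rule are all invented for the example and say nothing about how Facebook actually does this:

```python
from collections import defaultdict

# Hypothetical accounts sharing a registration signal (illustrative only).
accounts = [
    {"id": "a1", "phone": "+91-0000000001"},
    {"id": "a2", "phone": "+91-0000000001"},  # same phone as a1 -> possible SUMA
    {"id": "a3", "phone": "+91-0000000002"},
]

# Group account IDs by the shared signal.
clusters = defaultdict(list)
for account in accounts:
    clusters[account["phone"]].append(account["id"])

# Any signal attached to more than one account marks a candidate SUMA cluster.
suma_candidates = {phone: ids for phone, ids in clusters.items() if len(ids) > 1}
print(suma_candidates)  # {'+91-0000000001': ['a1', 'a2']}
```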

In one section on this, an internal company document called ‘Lotus Mahal’ is cited to claim that Facebook officials are well aware of how the BJP IT cell uses SUMAs to propagate narratives.

Photo: Whistleblower complaint to SEC, CBS News.

Haugen argues that Facebook’s willingness to allow political parties to use duplicate accounts exposes it to brand liability with regard to advertisers.

 
