Too little too late?


This week saw two major pre-election initiatives from the social media giant Facebook. On April 1, news that more than 700 Facebook pages had been taken down for "coordinated inauthentic behaviour" hit the headlines.

Some 687 of those pages are said to be linked to the Congress IT cell, while the rest belong to Silver Touch, an IT firm that works extensively for the NDA government and its various departments, including, allegedly, on the NAMO App.

And on April 3, WhatsApp, part of the Facebook stable, rolled out its Checkpoint tipline, with the phone number 9643000888, to fight fake news. Users can report suspicious information to this number.

Over the last six months, Facebook has also rolled out fact-check initiatives to bust misinformation circulating on its various platforms.

The US elections that brought President Trump to power, the suspected Russian cyber-intervention in them, and the Senate hearings that followed brought into sharp focus the role played by social media monopolies like Facebook and Twitter.

Other reports from Brazil and Myanmar also reinforce the suspicions about the role of social media networks in shaping political processes, including elections.

In Brazil, corporations have been accused of pumping out misinformation and fake news against left-wing candidates, which eventually led to the victory of the conservative candidate Bolsonaro as President. The platforms for this activity were the social media giants, especially Facebook and WhatsApp.

In what a UN report calls a "textbook example of ethnic cleansing", the Myanmar military flooded Facebook with hate content, leading to a genocide of the Rohingyas.

Facebook woke up to this gross abuse of its platform only after the damage was done, resulting in what human rights groups have called "the largest forced human migration in recent history" following massacres and rapes. Only later did it take down the pages run by the Myanmar military and its proxies.

Facebook's head of cybersecurity policy, Nathaniel Gleicher, said it had found "clear and deliberate attempts to covertly spread propaganda that were directly linked to the Myanmar military."

Over the last few years, India has witnessed an unprecedented spread of Islamophobia across the country, whether in the name of 'love jihad' or cow slaughter, resulting in horrific instances of mob lynching and persecution.

There was also hate-mongering against Dalits and other marginalised communities, and against those who spoke up on their behalf. The primary platforms for this project have been WhatsApp and Facebook, along with Twitter.

A simple Google search will reveal how many complaints Facebook received citing specific instances of targeted harassment, death threats, and the publishing of people's addresses and phone numbers so that potential attackers could find their targets easily, only for complainants to be told by its personnel that these "do not violate their community standards."

The police in various parts of India rarely respond to complaints of vicious targeting and threats filed even by well-known journalists and celebrities, yet they go after political critics of the ruling parties in various states, with arrests and sedition cases in some instances.

It took a great deal of public opprobrium for Facebook even to reconsider its definition of what constitutes "community standards." It was global pressure to break up the monopoly, and governments taking note of the unprecedented political influence the platform wields through its reach, that is compelling it to take small course-correcting initiatives.

But how effective these course-correction measures will be remains to be seen. Take, for instance, the fact-check initiatives.

When a wildfire rumour spreads through WhatsApp in real time, someone has to first spot that the information is suspect, then decide to complain. Fact-checkers then verify the claim and publish their findings.

The new tipline launched by WhatsApp rests on the same expectation. But if people were so alert and proactive, societies would scarcely fall victim to such technology-driven abuse of the political process.

In a recent interview to The Hindu, Shivnath Thukral was asked what the reach of fact-checking sites is compared to the viral spread of fake news and disinformation.

Thukral admitted that it is very low. How, then, is mere fact-checking going to solve the problem if the verified information never gets back to the consumers of fake news and disinformation?

Will this 'good news' spread as successfully as the 'bad news', considering that the misinformation is produced to a plan, on an industrial scale, by fake news factories enjoying heavy investment and political patronage?

Facebook and WhatsApp should be required to send fact-checked information to all their subscribers, just as they send their own notifications to all accounts.

Without this additional step of making fact-checked information available to the victims of fake news, consumers are likely to remain in the dark. They have not sought the fake news; it is being pumped to them for political reasons. The platforms must take responsibility for ensuring that correct information reaches the victims as soon as the fact-check is done.

A fact-finding report of the UN Human Rights Commission on the Rohingya genocide says: "The extent to which Facebook posts and messages have led to real-world discrimination and violence must be independently and thoroughly examined.

The mission regrets that Facebook is unable to provide country-specific data about the spread of hate speech on its platform, which is imperative to assess the adequacy of its response."

The information monopoly, combined with the power to calibrate and control information flow on its platform and the opacity of its operations, makes Facebook a formidable influence on politics at election time.

There are several instances of the platform taking down content of popular anti-establishment political commentators like Dhruv Rathee, even as it drags its feet on taking down openly violent and communal trolls and the murderous rumours spread through that safe black hole, WhatsApp.

India has 200 million Facebook subscribers and another 230 million WhatsApp users.

This makes one wonder whether the attempted corrective measures of Facebook and its subsidiaries are too little, too late.

Why was nothing done till now? Is this high-visibility action, barely ten days before the Lok Sabha election process begins, too little, too late?
