Facebook’s handling of controversial political posts and advertisements has left the social media company facing scathing criticism from those in power and the Opposition
A Wall Street Journal report that a top Facebook official turned a blind eye to hate speech by a BJP leader and three other “Hindu nationalist individuals and groups” to avoid damaging the social media platform’s business prospects has predictably caused a furore in India.
Congress leader Rahul Gandhi has alleged that the BJP and the RSS control Facebook and WhatsApp in the country, and Shashi Tharoor, chairperson of the parliamentary committee on Information Technology, has said that the committee “would certainly wish to hear” from Facebook.
What’s clear is that Facebook will, in the coming months, face increased scrutiny in India for what it does and, more importantly, what it doesn’t do.
But this isn’t the first time that the social media company has found itself in hot water. Be it in India, the United States, Sri Lanka or the Philippines, Facebook’s handling of controversial political posts and advertisements has drawn scathing criticism, driven sponsors away in droves, fomented internal revolts and invited increased scrutiny from regulators.
As per a Washington Post report, the genesis of Facebook’s approach to controversial political speech came in 2015, when company executives declined to remove a post by then US presidential candidate Donald Trump calling for a ban on Muslims entering the country.
Mounting outrage prompted the company to call a meeting in which employees slammed the post as hate speech. In the meetings that followed, Facebook founder and CEO Mark Zuckerberg professed to be ‘disgusted’ by it and wanted it removed, as per the report.
The company, after considering a slew of options, ultimately decided to carve out an exception for political speech, one which would take into account “newsworthy political discourse” when deciding whether a post violated community guidelines, as per the report.
But it is Zuckerberg’s insistence that his company ‘will not be an arbiter of truth’ and its stated policy of not fact-checking political advertisements in the United States and elsewhere that has been the cause of much outrage.
In October 2019, Facebook refused to take down a video advert from Trump’s reelection campaign that accused former vice-president Joe Biden, without any evidence, of corruption in his role in the Obama administration’s Ukraine policy.
That same month, in the midst of a heated battle for the Democratic nomination for president, Senator Elizabeth Warren, to prove a point, ran ads on Facebook that falsely claimed that Zuckerberg had endorsed Trump’s reelection for president.
And after Green New Deal champion Alexandria Ocasio-Cortez grilled Zuckerberg on whether his site would allow ads that falsely claimed Republicans had endorsed the Green New Deal, a left-leaning political group did exactly that, targeting Trump ally and staunch Republican Lindsey Graham.
Facebook, under mounting pressure to avoid a repeat of the 2016 election, during which it is thought to have played a role in spreading fake news and influencing voters, has recently begun pushing back, although in fits and starts.
On 6 August, Facebook for the first time took down a Trump post, one claiming that children are “immune” to the coronavirus. When Trump supporters put out doctored videos of House Speaker Nancy Pelosi, however, Facebook declined to take them down, although it did label one clip ‘partly false’.
In July, Facebook refused to take down a post that claimed to show violence in the United States but actually used footage from a 2014 pro-democracy protest in Ukraine.
In May, when Trump warned those protesting the death of George Floyd that “when the looting starts, the shooting starts”, a phrase from the civil rights era with racist connotations, Zuckerberg explained why Facebook would not take down the post, even as top company executives dialled the White House to ask Trump to tone down his language.
While this is at least a start, critics are far from satisfied. An NBC report last year that Trump and Zuckerberg had dined together at the White House raised many eyebrows.
“I believe they have a deal,” Roger McNamee, an early Facebook investor, told The New York Times, adding that it was “probably implied rather than explicit”.
Zuckerberg recently denied any such claim. “I’ve heard this speculation, too, so let me be clear: There’s no deal of any kind,” Zuckerberg told Axios. “Actually, the whole idea of a deal is pretty ridiculous.”
But Facebook’s efforts pale in comparison to those of Twitter, which in May began fact-checking Trump’s claims that mail-in voting would be “substantially fraudulent”, labelling them as unsubstantiated, and even hid Trump’s Minnesota tweet for “glorifying violence”.
Facebook has also been in the eye of the storm elsewhere in the world for its reticence to act on controversial political posts and curb the spread of misinformation, most notably in Sri Lanka and the Philippines.
Ahead of the 2019 presidential election in Sri Lanka, Facebook came under fire from civil society groups for refusing, once again, to fact-check politicians in advertisements. The Guardian reported that an official Facebook page affiliated with Gotabaya Rajapaksa, then the candidate of the Sri Lanka People’s Front, promoted a post featuring misinformation that had been deemed false by AFP Sri Lanka.
As per the report, some Sinhala-language posts used photographs of Buddhist statues lying on the ground to suggest that “Muslim extremists” had razed a Sri Lankan heritage site; AFP later confirmed with the temple’s chief monk that no such attack had occurred. The post was not removed.
The dearth of staff fluent in Sinhala — the language spoken by Sri Lanka’s largest ethnic group — compounded the issue, with government officials and activists saying the oversight allowed extremist content to flourish undetected on the platform. Ahuja said Facebook was committed to hiring more Sinhala speakers but declined to say how many were currently employed in Sri Lanka.
In the Philippines, where Facebook is ubiquitous (an estimated 97 percent of the populace with access to the internet have accounts) and smartphones outnumber people, President Rodrigo Duterte has weaponised the platform to slander his opponents and push his deadly war on drugs.
Duterte’s supporters have propagated several false narratives on his behalf, including a fake endorsement from the Pope, no less. His supporters also targeted a prominent senator and human rights activist who was investigating his extrajudicial killings, and claimed the country’s chief justice had tried to flee the country to escape an impeachment complaint.
That senator, Leila de Lima, was later arrested and imprisoned on drug charges after Duterte’s supporters tried to falsely implicate her in a sex scandal.
“Until we find an effective way to counter” the misinformation problem in the Philippines, Leila de Lima wrote to BuzzFeed News from Camp Crame, where she is imprisoned, “we cannot hope to repair the damage [it’s] already caused and to ensure it can never hijack our democratic way of life again.”
Facebook, for its part, has chalked up part of the problem in the Philippines to the country’s unfamiliarity with the internet, even as it has deepened its ties with the Duterte regime.
But with the 2020 US election drawing closer, the spotlight on the social media company is only likely to get brighter.
With inputs from agencies