
Ground zero: Logging into Facebook’s election war room

Days before India begins to vote in the world’s largest democratic exercise, thousands of miles away a team of experts maintains a round-the-clock vigil on election-related chatter on Facebook, ready to combat the first signs of misinformation, voter manipulation tactics, poll interference and fake news.

Facebook — which counts India as one of its biggest user bases outside the US — has put in motion an elaborate strategy at the headquarters here, with ‘information warriors’ at its election war room watching out for hoaxes, voter suppression attempts, suspicious account behaviour, hate messages, fake accounts, and spikes in spam.

Ajit Mohan, Managing Director and Vice President of Facebook India, said the platform has also expanded its partnerships with third-party fact-checkers to seven accredited organisations in India.

Battle plan

“These groups cover eight of the most spoken languages — English, Hindi, Bengali, Marathi, Telugu, Tamil, Malayalam and Gujarati — and we’re looking to add more. In a country largely driven by local and community news, we knew it was critical to have fact-checking partners who could review content across regions and languages,” Mohan said in a blogpost on Monday.

Building on the lessons learnt over the last two years (the planning began nearly 18 months ago), this week Facebook will activate new regional operations centres, focused on election integrity, in Singapore and Dublin. These teams will work closely with staff at its Menlo Park, CA, headquarters as well as with local experts in Delhi.

Once the team manning the election operations centres identifies attempts at platform abuse, it moves in swiftly to contain them. Last week, Facebook removed 687 pages and accounts linked to the Congress and a smaller but far more influential number of what appeared to be pro-BJP accounts for “coordinated inauthentic behaviour” on the social media platform.

The election operations centre supplements the platform’s ongoing efforts to crack down on misinformation during the upcoming Lok Sabha elections.

Facebook has already brought in transparency initiatives for political ads, teamed up with third-party fact-checkers for India, and rolled out features like Candidate Connect (a voter outreach tool that allows Lok Sabha candidates to record 20-second videos).

Focus

As many as 40 cross-functional teams, drawn from areas such as cybersecurity, public policy, data science, legal, engineering, threat intelligence and research, work together in sync at the command centre to coordinate, detect and respond to situations in real time.

And soon, key officials from the US office will fly down to India and Singapore to oversee operations as the seven-phase general elections get underway. The elections will begin on April 11 and continue till May 19, with counting scheduled for May 23.

“…if a report comes in that there is a piece that maybe violates our community standards…maybe people are giving wrong information on voting…we will report it and a task gets created in the war room. The community operations person then evaluates that piece of content to see if it violates our community standards and, within minutes of taking a look at that, can take the content down,” says Katie Harbath, Facebook’s Public Policy Director for Global Elections.

There could also be instances where these efforts require close coordination with fact-checkers, and the response time could be a tad longer.

“You might get reports that there is violence happening at a place. There could be violence happening, or it could be voter suppression tactics to make voters stay away (from an area). That is not a determination we can make, so we direct it to fact-checkers to review, and that can take a bit longer. As soon as they mark it as false, the content’s reach will be reduced and we will provide additional context,” says Harbath, briefing reporters during a media tour of the election operations centre here.

With nearly 900 million voters eligible to cast their votes in a nation that now has over 200 million Facebook users, the company’s election integrity strategy for India harnesses the full potential of Artificial Intelligence and Machine Learning in an attempt to defeat ‘bad actors’ who try to game the system through nefarious means.

For instance, these tools help it block or remove approximately one million accounts a day. They also help identify abusive or violating content, quickly locate it across the platform and remove it in bulk – reducing its ability to spread.

Facebook continues to expand on this initiative, adding 24 new languages — including 16 for India — to its automatic translation system, Mohan said.

Other details

“It’s not just being ready for threats we saw in 2016 or 2018; we recognise that there is always work to be done…so many of the investments we have made are particularly around videos…focused on ensuring that we are ready for the next set of challenges, including around tampered videos,” says Kaushik Iyer, Engineering Manager for Civic Integrity at Facebook.

While Facebook did not divulge the investment or the number of employees at the election operations centre, it says the money it has ploughed into the entire machinery – leveraged in the past for the American midterms and the recent Brazilian elections – is among its top priorities.

“The investment in India is the same size and level of importance that we have had for Brazil and the US…,” says Harbath.

Mohan said the work to ensure Facebook is not used to influence elections began more than 18 months ago with detailed planning and risk assessment across its platforms.

“The findings allowed us to concentrate our work on key areas, including blocking and removing fake accounts; fighting the spread of misinformation; stopping abuse by domestic actors; spotting attempts at foreign meddling; and taking action against inauthentic coordinated campaigns,” Mohan said.

Source: The Hindu