Facebook, Exploited By Influence Campaigns, Tries To Clamp Down With 'War Room'

Lexi Sturdy, election war room lead, sits at her desk during a demonstration in the war room, where Facebook monitors election-related content on the platform, in Menlo Park, Calif., on Oct. 17. (Jeff Chiu / AP)

With midterm elections just two weeks away, Facebook says it is ramping up its operations to fight disinformation.

The social media behemoth has established a "war room" at its headquarters in Menlo Park, Calif., where specialists try to detect and disrupt bad actors attempting to delegitimize elections, spread fake information and suppress the vote.

"One of the strengths of having everybody in the same place together is just the speed with which we were able to act between detecting a problem in the first place and eventually taking action," said Samidh Chakrabarti, head of the civic engagement team at Facebook.

"We can go from detecting things like spikes in hate speech or voter suppression activity that may be spreading on the platform ... to action in just a couple of hours," said Chakrabarti, whose sister, Meghna Chakrabarti, is the host of WBUR's On Point. Facebook designated him as a presenter for journalists invited to visit the war room.

The war room is on the fourth floor of a main building on the Facebook campus, where engineers, data scientists and operations specialists staff about 20 workstations. A large American flag hangs above custom-built dashboards that monitor viral news, spam and voter suppression efforts.

"These dashboards actually have alarms on them so that if there's any spike in unusual activity, the war room is alerted to them and then our data scientists will be able to look at any sort of anomalies that are detected, figure out if there are actually problems underneath those anomalies that we see — and if there are, pass them along to our operations team," said Chakrabarti.

Fox News, CNN and a rolling Twitter feed play on large screens along the wall. Other screens show graphs flagging anomalies in Facebook's networks that could signal a surge in false election information.

Still other televisions show a direct video conference feed with Facebook's elections teams in Washington, D.C., and Austin, Texas.

The wall is also covered with various brightly colored motivational posters: "The Best Way To Complain Is To Make Things," reads one. "Be the Nerd," is another.

Much of one wall is covered with a large map of Brazil, which recently held the first round of its presidential election; Facebook's specialists were watching for influence operations there as well.

The midterms

Even more attention is focused on the upcoming U.S. elections: In 2016, the social media network was caught flat-footed as foreign actors spread false narratives while masquerading as Americans.

The war room is meant to be a critical part of Facebook's plan to prevent a repeat — and public officials are watching closely.

Senate intelligence committee Vice Chairman Mark Warner, D-Va., says the big social media platforms need to do more to clamp down on disinformation associated with foreign influence campaigns. (J. Scott Applewhite / AP)

"Facebook was asleep at the switch [during the 2016 campaign]. And even in the immediate aftermath of the elections they denied that there were any foreign influencers on their platform. They were dead wrong," Sen. Mark Warner, the top Democrat on the Senate intelligence committee, told NPR.

These concerns reflect the centrality of Facebook in American civic life.

Some 43 percent of Americans get news on Facebook, making it by far the site Americans most commonly use for news, according to a recent survey by the Pew Research Center.

The social media network has been clear that, generally speaking, it does not want to police content, but Facebook officials believe that cracking down on inauthentic behavior, such as fake accounts and spamming, will help curb foreign influence operations.

"We have actually made huge advances in artificial intelligence and machine learning. And we have been able to block, in a recent six-month period, 1.3 billion fake accounts from forming," Chakrabarti said.

But Facebook faces a formidable challenge.

Kevin Mandia, the CEO of FireEye, a cybersecurity firm that counts Facebook as a client, recently told NPR that there are many foreign operations that analysts have yet to discover.

"I strongly doubt that we've caught a hundred percent of the ones Iran's doing or Russia's doing. And I would say we're more on the 3 to 5 percent on what we've found ... meaning there's a ton of it going on right now that we're wholly unaware of," he said.

Facebook told NPR that it has seen foreign information operations increasing as the midterms near, which is what it expected.

"Information operations, what we're talking about here, is a security challenge, which means that you have sophisticated adversaries that are continuously trying to find new ways to cause harm and manipulate your platform," said Nathaniel Gleicher, Facebook's head of cybersecurity policy.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Tim Mak is NPR's Washington Investigative Correspondent, focused on political enterprise journalism.