Inside Facebook’s election ‘war room’
The War Room’s half-finished state shows how nascent and hurried many of Facebook’s efforts are to curb fake news ahead of the upcoming US mid-term elections
Menlo Park (California): Sandwiched between Building 20 and Building 21 in the heart of Facebook’s campus, an approximately 25-foot-by-35-foot conference room is under construction.
Thick cords of blue wiring hang from the ceiling, ready to be attached to window-size computer monitors on 16 desks. On one wall, a half dozen televisions will be tuned to CNN, MSNBC, Fox News and other major networks. A small paper sign with orange lettering taped to the glass door describes what’s being built: “War Room”.
Although it is not much to look at now, as of next week the space will be Facebook’s headquarters for safeguarding elections. More than 300 people across the company are working on the initiative, but the War Room will house a team of about 20 focused on rooting out disinformation, monitoring false news and deleting fake accounts that may be trying to influence voters before elections in the United States, Brazil and other countries.
“We see this as probably the biggest company-wide reorientation since our shift from desktops to mobile phones,” said Samidh Chakrabarti, who leads Facebook’s elections and civic engagement team. The company, he added, “has mobilized to make this happen”.
The misuse of Facebook by foreign influence campaigns has been rampant. In July and August, the company detailed previously undisclosed efforts by Iranians and Russians to mislead users of the social network through divisive ads and posts. Now, with the mid-term elections in the United States seven weeks away, Facebook is in an all-out sprint to convince the world it is ready to handle any new attempts at such meddling. The company is under tremendous pressure to prevent a repeat of the foreign manipulation that unfolded on the social network during the 2016 presidential campaign.
Mark Zuckerberg, Facebook’s chief executive, has vowed to fix the problems, and he said this month that the company was “better prepared” to handle potential interference. But he has acknowledged that Facebook was in an “arms race” against those who were trying to manipulate the platform. The company has taken steps to build defences against spammers, hackers and foreign operatives—including hiring thousands of people to help moderate content and starting an archive to catalog all political ads—but the War Room’s half-finished state shows how nascent and hurried many of the efforts are.
Already, foreign operatives have evolved their online influence campaigns to skirt the measures Facebook has put in place ahead of the US mid-term elections, said Priscilla Moriuchi, director of strategic threat development at the cybersecurity firm Recorded Future.
“If you look at the way that foreign influence operations have changed these last two years, their focus isn’t really on propagating fake news anymore,” Moriuchi said. “It’s on augmenting stories already out there which speak to hyperpartisan audiences.” She added, “It’s going to be harder for companies like Facebook and Twitter, from my perspective, to find them now.”
Facebook invited two New York Times reporters into the War Room before it opens next week to discuss the work of the elections team and some of the tools it has developed to try to prevent interference. The company limited the scope of what The Times could see and publish out of a concern about revealing too much to adversaries who may be looking for vulnerabilities. The company said the War Room was modelled after operations used by political campaigns, which are typically set up in the final weeks before election day.
The War Room is a “proactive” way to build systems in anticipation of attacks, Greg Marra, a product manager working on Facebook’s News Feed, said in a conference call with reporters Wednesday.
One of the tools the company is introducing is custom software that helps track information flowing across the social network in real time, said Chakrabarti, who joined Facebook about four years ago from Google.
These dashboards resemble a set of line and bar graphs with statistics that provide a view into how activity on the platform is changing. They allow employees to zero in on, say, a specific false news story in wide circulation or a spike in automated accounts being created in a particular geographic area.
The dashboards were first tested before the special US Senate election in Alabama in December. Without naming the dashboards, Zuckerberg has said a new tool helped Facebook identify political interference more quickly in that election.
Since then, Facebook has tested and redesigned the software during multiple elections worldwide. This month, before Brazil’s presidential election, the company will introduce the newest versions of the dashboards, Chakrabarti said.
Facebook created what it calls its elections and civic engagement team in 2007 to work with governments and campaigns on how they could use the social network most effectively. For a long time, the team numbered just a few dozen people in Silicon Valley; it expanded in 2013 to include members in offices outside the United States.
After Facebook disclosed that agents linked to the Kremlin had manipulated the social network to spread inflammatory messages to American voters during the 2016 election, the company began to increase the team’s ranks. The group was also restructured to focus more on the security of elections.
Since then, the team has mushroomed to its current size, augmented by other people at the company whose jobs involve some aspect of stopping election interference. Facebook has said each of its units—including Instagram and WhatsApp—has been told to make election security a top priority when designing products.
Chakrabarti meets several times a month with Facebook’s top executives, engineers and product managers. The meetings often include Zuckerberg and Sheryl Sandberg, the chief operating officer.
Facebook decided this year to create a War Room so a core group of engineers, data scientists and executives could sit together in the same space before the mid-terms. They chose an empty conference room off the hallway that connects Building 20 and Building 21, a central point on Facebook’s campus that is easy for employees to get to.
Construction began a few months ago and the room, with its whiteboard walls and clusters of long tables, is set to open for operations Monday. It has been refitted with cables and internet boosters, and new wiring was installed for the monitors and other equipment.
What happens in the War Room will be a “last line of defence” for Facebook engineers to quickly spot unforeseen problems on and near election days in different countries, Chakrabarti said. Many of the company’s other measures are meant to stop disinformation and other problems long before they show up in the War Room.
Once a problem reaches the War Room, the dashboards will be set to spot and track unusual activity, while data scientists and security experts take a closer look. Chakrabarti said the team was particularly on guard for posts that manifested “real-world harm”, and planned to remove posts that tried to disenfranchise voters by giving incorrect polling data or spreading hoaxes like encouraging people to vote by text message.
“The best outcome for us is that nothing happens in the War Room,” he said. “Everything else we are doing is defences we are putting down to stop this in the first place.”