Facebook Opens a Command Post to Thwart Election Meddling in Europe

DUBLIN — Inside a large room in Facebook’s European headquarters in Ireland’s capital, about 40 employees sit at rows of desks, many with two computer screens and a sign representing a country in the European Union.

Large screens at the front display charts and other information about trends on the social network’s services, including Instagram and the messaging app WhatsApp. In the back, muted televisions broadcast BBC and other European news stations.

The cramped space is home to Facebook’s newly opened operations center to oversee the European Union’s parliamentary election, which will be held May 23 to May 26 in 28 countries.

Modeled after the “war room” that the Silicon Valley company created before last year’s midterm elections in the United States, the center tasks its staff with scrubbing Facebook of misinformation, fake accounts and foreign meddling that could sway European voters. A similar command post was set up in Singapore for elections in India.

Eager to show it is taking threats seriously as it faces pressure from governments across Europe to protect the integrity of the election, Facebook invited about two dozen journalists to visit its hub last week.

“We are fundamentally dealing with a security challenge,” said Nathaniel Gleicher, Facebook’s head of cybersecurity policy. “There are a set of actors that want to manipulate public debate.”

The social network has good reason to be proactive after Russians used the platform to influence American voters in the 2016 presidential election. The company has since taken down several networks of accounts linked to foreign-influence campaigns, including some targeting users in Europe.

The European election will determine who controls the European Parliament and sets the agenda of the European Union for the next five years. It will influence how the region grapples with issues like Britain’s exit from the European Union, immigration, income inequality and the rise of extremist ideologies. European leaders have warned that foreign groups will use social media to manipulate public opinion.

Facebook is becoming more aggressive in regulating content after initially trying to avoid entanglement in free speech issues. Last week, it barred the conspiracy theorist Alex Jones and several other divisive figures from its platforms.

The company is also under pressure from regulators. European leaders are considering new policies to force tech giants to rid their platforms of misinformation, hate speech and extremist content. Facebook faces several investigations related to its handling of user data.

The election center in Dublin will be open through this month’s vote. Data analysts, content moderators, engineers and attorneys were flown in from Facebook offices around the world. All 24 of the European Union’s official languages are represented.

Inside the room, employees — many in jeans, T-shirts and hoodies — appeared to be mostly in their 20s or 30s. Many seemed to be browsing news articles from the country they were overseeing.

It’s hard to know what effect their work may have beyond public relations. Facebook allowed journalists to observe for only a few minutes. Citing security concerns, the company didn’t allow employee interviews, and it limited what could be photographed.

Facebook also wouldn’t say what actions the European team had taken since the center opened last week, other than that it had reviewed “hundreds” of pieces of material. The team is alerted either by an automated system that detects problematic content or by a surge in users flagging a piece of content, said Lexi Sturdy, a public policy manager brought to Dublin from Facebook’s headquarters in Menlo Park, Calif., to oversee the effort.

Depending on the language or the country where the post, video or photo originated, a team member reviews the material and decides whether to recommend its removal. An employee then makes the final call on whether the content violates the company’s user guidelines. In some instances, a single flagged item leads to a bulk takedown of posts and accounts.

But even as Facebook outlined how ready it was, the platform remains vulnerable. Researchers recently highlighted the use of WhatsApp to spread misinformation ahead of elections in Spain last week. Another persistent problem is material about news events or politics that doesn’t technically violate Facebook’s policies but is used by far-right and other groups to exaggerate divisions in countries.

And as Facebook clamps down on certain forms of bad behavior, like the use of fake accounts, new methods always emerge.

The goal over time, Mr. Gleicher said, is to “make the platform much more resistant to the kind of manipulation they are trying to use.”