What Twitter does not address is that the trolls will keep popping up. The Kremlin is unlikely to have concentrated all of its information warriors in one building. And according to researchers, up to 15 percent of all active Twitter accounts might be bots, Russian or otherwise. Twitter has a great deal more work to do on the automation front, but Whack-a-Troll is not where its disinformation problem ends.
Twitter does not seem to comprehend that non-automated content, including from American accounts, is a huge part of the disinformation ecosystem. Its amplification of neo-Nazi rhetoric, abusive content and false or misleading stories spread by accounts with huge numbers of followers all shaped the discourse surrounding the 2016 election. Other than supporting a few small media literacy programs, Twitter doesn’t seem to have a plan for eliminating the problematic content that originates within America’s own borders.
Facebook, which played its own round of Whack-a-Troll last year, scrubbing its platform of Internet Research Agency-created pages and accounts, doubled down on efforts to abdicate responsibility for defining high-quality content. On Jan. 19, Facebook announced that it will poll users to determine which outlets they trust and with which they are most familiar, giving those outlets priority in the News Feed.
These polls seem likely to reinforce existing biases, privilege well-known media giants and leave small outfits in the dust. After a year in which the United States witnessed a 37 percent decline in trust in institutions, the country’s dwindling local newsrooms are more important than ever, interpreting and contextualizing events from Congress to the City Council for their readers in a way that larger news organizations struggle to do. Facebook should place local news in a prominent location, prioritize content from high-performing smaller outlets, perhaps through a competitive application system, and make serious investments in local journalism.
Facebook will also continue to rely on users to report false stories, which it will work with third-party fact-checkers to confirm and label. It claims this process reduces the spread of untruths by 80 percent. Conveniently, Facebook ignores decades of research demonstrating that those who have consumed untruths are unlikely to buy into the corrected version.
Like Twitter, Facebook is doing little to address problematic narratives that may not be patently false. After all, most of the infamous Russian election ads purchased on Facebook in 2016 were not “fake.” They were highly inflammatory messages aimed at the specific populations with whom they would most resonate. How might a concerned citizen or outside fact-checker respond to an ad comparing Hillary Clinton to Satan, or to a meme arguing that blue lives matter more than black lives?
For both platforms, and for the internet as a whole, this is admittedly a hard question. But it’s not one without an answer. As private entities, these companies have the right — and the obligation — to update their terms of service to reflect the realities of the disinformation era, defining in plain English what content is permissible on their platforms and actively enforcing those definitions. Yes, it would be costly, but you can’t put a price tag on democracy.
By narrowly focusing on eliminating Russian accounts and posts, relying on users to determine which content is trustworthy and blindly believing that fact-checking will improve the rapidly worsening level of civil discourse in America, social media companies are relinquishing their role as today’s most powerful gatekeepers of information. If these tech giants want to contribute to democracy instead of help to tear it down, they need to recognize that homegrown threats to civil discourse exist among the very users to whom they are bequeathing more responsibility. In a world where every person with a smartphone is a citizen journalist, a threat to democracy is a threat to democracy, no matter the post’s country of origin.
An earlier version of this article misstated a detail of how Facebook will now treat local news. Facebook says it will continue to appear in the News Feed; it will not be confined to a dedicated section.