Facebook Moves to Stop Election Misinformation

Thursday’s changes, which are a tacit acknowledgment by Facebook of how powerful its effect on public discourse can be, are unlikely to satisfy its critics. Some of its measures, such as the blocking of new political ads a week before Election Day, are temporary. Yet they demonstrate that Facebook has sweeping abilities to shut down untruthful ads should it choose to do so.

Facebook said it would begin barring politicians from placing new ads on Facebook and Instagram, the photo-sharing service it owns, starting on Oct. 27. Existing political ads will not be affected. Political candidates will still be able to adjust both the groups of people their existing ads are targeting and the amount of money they spend on those ads. They can resume running new political ads after Election Day, the company said.

In another change, Facebook said it would place what it calls its voting information center — a hub for finding accurate, up-to-date information on how to register to vote, and when and where to do so — at the top of its News Feed, which is seen daily by millions, through Election Day. The company had rolled out the voter information center in June and has continued promoting it to users, with a goal of registering four million people and encouraging them to vote.

To curb misinformation about voting, Facebook said it would remove posts that tell people they will catch Covid-19 if they take part in voting. For posts that use the coronavirus to discourage people from voting in other, less obvious ways, the company said it would attach a label and link to its voter information center.

Facebook also plans to remove posts that explicitly or implicitly aim to disenfranchise or prohibit people from voting; previously, the company removed only posts that actively discouraged people from voting. Now, a post that causes confusion about who is eligible to vote or about some part of the voting process — such as a misstatement about what documentation is needed to receive a ballot — would also be removed.

The company also said it would limit message forwarding in its Messenger app to no more than five people at a time, down from more than 150 previously. The move mirrors a 2018 change by WhatsApp, the messaging app also owned by Facebook, which limited message forwarding to 20 people from a previous maximum of 250.

Misinformation is much more difficult to tackle on private communication channels than on public social networks because it spreads out of view. Limiting message forwarding could slow that spread.