YouTube on Wednesday announced changes to how it handles videos about the 2020 presidential election, saying it would remove new videos that mislead people by claiming that widespread fraud or errors influenced the outcome of the election.
The company said it was making the change because Tuesday was the so-called safe harbor deadline — the date by which all state-level election challenges, such as recounts and audits, are supposed to be completed. YouTube said that enough states had certified their election results to determine that Joseph R. Biden Jr. was the president-elect.
YouTube’s announcement is a reversal of a much-criticized company policy on election videos. Throughout the election cycle, YouTube, which is owned by Google, had allowed videos spreading false claims of widespread election fraud under a policy permitting commentary on the outcome of an election. Under the new policy, election videos uploaded before the safe harbor deadline would remain on the platform, with YouTube appending an information panel linking to the Office of the Federal Register’s election results certification notice.
In a blog post on Wednesday, YouTube pushed back on the idea that it had allowed harmful and misleading election-related videos to spread unfettered on its site. The company said that since September, it had shut down over 8,000 channels and removed “thousands” of election videos that violated its policies. Since Election Day, the company said, it had also displayed fact-check panels over 200,000 times above search results for voter fraud narratives such as “Dominion voting machines” and “Michigan recount.”