YouTube follows Twitter and Facebook with QAnon crackdown

YouTube is following the lead of Twitter and Facebook, saying that it is taking more steps to limit QAnon and other baseless conspiracy theories that can lead to real-world violence.

The Google-owned video platform said Thursday it will now prohibit material targeting a person or group with conspiracy theories that have been used to justify violence.

Pizzagate is another internet conspiracy theory — essentially a predecessor to QAnon — that would fall in the banned category. Its promoters claimed children were being harmed at a pizza restaurant in Washington, D.C. A man who believed in the conspiracy entered the restaurant in December 2016 and fired an assault rifle. He was sentenced to prison in 2017.

YouTube is the third of the major social platforms to announce policies intended to rein in QAnon, a conspiracy theory they all helped spread.

Facebook, meanwhile, announced last week that it was banning groups that openly support QAnon. It said it would remove pages, groups and Instagram accounts for representing QAnon — even if they don’t promote violence.

Facebook’s move came two months after it announced a softer crackdown, saying it would stop promoting the group and its adherents. But that effort faltered due to spotty enforcement.

YouTube said it had already removed tens of thousands of QAnon videos and eliminated hundreds of channels under its existing policies — especially those that explicitly threaten violence or deny the existence of major violent events.

“All of this work has been pivotal in curbing the reach of harmful conspiracies, but there’s even more we can do to address certain conspiracy theories that are used to justify real-world violence, like QAnon,” the company said in Thursday’s blog post.