YouTube Unleashed a Conspiracy Theory Boom. Can It Be Contained?

Mr. Chaslot noted that this algorithm — which was once trained to maximize the amount of time users spend on the site — often targeted vulnerable users by steering them toward other conspiracy theory videos it predicted they would watch.
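In mechanical terms, his description amounts to a ranking objective: score each candidate video by how long the viewer is predicted to watch it, then recommend the top scorers. Below is a minimal sketch of that objective, not YouTube's actual system; the `predict_watch_seconds` model is a hypothetical stand-in for a trained watch-time predictor.

```python
# A minimal sketch of a watch-time ranking objective, NOT YouTube's actual code.
# `predict_watch_seconds` is a hypothetical stand-in for a trained model that
# estimates how long a given user would watch a given video.

from typing import Callable, List


def rank_recommendations(user: str,
                         candidates: List[str],
                         predict_watch_seconds: Callable[[str, str], float]) -> List[str]:
    """Order candidate videos by predicted watch time, highest first.

    An objective like this never asks whether a video is accurate, only
    whether this user is likely to keep watching it. That is how highly
    engaging material, conspiracy theories included, can dominate the list.
    """
    return sorted(candidates,
                  key=lambda video: predict_watch_seconds(user, video),
                  reverse=True)
```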

The change “will save thousands from falling into such rabbit holes,” he wrote.

In an interview last week, Mr. Chaslot was more circumspect, saying YouTube’s move may have amounted to a “P.R. stunt.” Because the change will affect only which videos YouTube recommends — conspiracy theories will still show up in search results, and they will still be freely available to people who subscribe to the channels of popular conspiracy theorists — he called it a positive but insufficient step.

“It will address only a tiny fraction of conspiracy theories,” he said.

Last year, Mr. Chaslot built a website, AlgoTransparency.org, to give outsiders a glimpse of YouTube’s recommendation algorithms at work. The site draws from a list of more than 1,000 popular YouTube channels, and calculates which videos are most often recommended to people who watch those channels’ videos.
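The site's method boils down to a counting exercise: for each tracked channel, collect the videos YouTube recommends alongside that channel's uploads, then count how many channels each video was recommended from. The Python sketch below illustrates that kind of tally; it is not Mr. Chaslot's actual code, and the `recommended_videos` helper that fetches a channel's recommendations is assumed.

```python
# An illustration of the tally AlgoTransparency.org describes, not the site's
# actual code. `recommended_videos(channel)` is an assumed helper returning
# the videos YouTube recommends alongside that channel's uploads.

from collections import Counter
from typing import Callable, Iterable, List, Tuple


def most_recommended(channels: Iterable[str],
                     recommended_videos: Callable[[str], List[str]],
                     top_n: int = 10) -> List[Tuple[str, int]]:
    """Return the videos recommended from the most tracked channels."""
    counts: Counter = Counter()
    for channel in channels:
        # Count each video at most once per channel, so a score of 138 means
        # "recommended to viewers of 138 channels," as the site reports it.
        counts.update(set(recommended_videos(channel)))
    return counts.most_common(top_n)
```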

On many days, conspiracy theories and viral hoaxes top the list. One recent day, the most frequently recommended video was “This Man Saw Something at Area 51 That Left Him Totally Speechless!,” which was recommended to viewers of 138 channels. The second most recommended video, which linked a series of recent natural disasters to apocalyptic prophecies from the Book of Revelation, was recommended to viewers of 126 of those top channels.

In our conversation, Mr. Chaslot suggested one possible solution to YouTube’s misinformation epidemic: new regulation.

Lawmakers, he said, could amend Section 230 of the Communications Decency Act — the law that prevents platforms like YouTube, Facebook and Twitter from being held legally liable for content posted by their users. The law now shields internet platforms from liability for all user-generated content they host, as well as for the algorithmic recommendations they make. A revised law could cover only the content and leave platforms on the hook for their recommendations.

“Right now, they just don’t have incentive to do the right thing,” Mr. Chaslot said. “But if you pass legislation that says that after recommending something 1,000 times, the platform is liable for this content, I guarantee the problem will be solved very fast.”
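The rule he proposes is simple to state mechanically: count how many times each video has been recommended, and treat crossing the threshold as the point where liability attaches. The sketch below is a hypothetical illustration of the bookkeeping such a law would push platforms to do; the class name and the threshold are illustrative, with the 1,000 figure taken only from his example.

```python
# Hypothetical bookkeeping under the rule Mr. Chaslot describes, purely
# illustrative. The threshold comes from his example quoted above.

from collections import defaultdict
from typing import Dict, List

LIABILITY_THRESHOLD = 1_000  # the figure from Mr. Chaslot's example


class RecommendationLedger:
    """Track how often each video has been recommended, and flag the ones
    that have crossed the point where, under the proposed rule, the
    platform could no longer disclaim responsibility for them."""

    def __init__(self) -> None:
        self.counts: Dict[str, int] = defaultdict(int)

    def record(self, video_id: str) -> None:
        """Log one recommendation of the given video."""
        self.counts[video_id] += 1

    def needs_review(self) -> List[str]:
        """Videos the platform has amplified past the liability threshold."""
        return [video for video, n in self.counts.items()
                if n >= LIABILITY_THRESHOLD]
```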