But, as Ms. Tufekci and other researchers stress, such experiments are anecdotal.
Mr. Serrato wanted to get a fuller picture of how YouTube shapes perceptions of events. So he conducted a network analysis, applying techniques he had used in his day job as an analyst with Democracy Reporting International, a respected global governance monitor, where he tracked hate speech in Myanmar.
Using YouTube’s public developer interface, Mr. Serrato plugged in a dozen recent videos related to Chemnitz. For each, he scraped YouTube’s recommendations for what to watch next. Then he did the same for those videos, and so on. Eventually, he identified a network of about 650 videos, nearly all from this year.
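The crawl described here amounts to a breadth-first walk over the recommendation graph: start from seed videos, collect each one's "watch next" suggestions, then repeat for those, until a network emerges. A minimal sketch, assuming a hypothetical `fetch_recs` callable that returns the recommended video IDs for a given video (in Mr. Serrato's case, that data came from YouTube's public developer interface):

```python
from collections import deque

def crawl_recommendations(seed_ids, fetch_recs, max_videos=650):
    """Breadth-first crawl of a recommendation graph.

    seed_ids   -- initial video IDs (e.g., a dozen videos about an event)
    fetch_recs -- hypothetical callable: video ID -> list of recommended IDs
    max_videos -- stop expanding once this many videos have been discovered

    Returns the set of discovered video IDs and the recommendation edges,
    which together define the network to be analyzed for clustering.
    """
    seen = set(seed_ids)
    edges = []
    queue = deque(seed_ids)
    while queue and len(seen) < max_videos:
        vid = queue.popleft()
        for rec in fetch_recs(vid):
            edges.append((vid, rec))       # record who recommends whom
            if rec not in seen:            # only expand unvisited videos
                seen.add(rec)
                queue.append(rec)
    return seen, edges

# Usage with a toy recommendation graph standing in for the real API:
toy_graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
nodes, edges = crawl_recommendations(["a"], lambda v: toy_graph.get(v, []))
```

The resulting edge list is what makes cluster analysis possible: densely interlinked groups of videos show up as tight regions of the graph.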
The results, he said, were disturbing. The network showed a tight cluster of videos that Mr. Serrato identified as predominantly conspiracy theorist or far right.
This clustering was the first sign that YouTube's algorithm systematically directs users toward extremist content. A more neutral algorithm would most likely produce a few distinct clusters of videos — one of mainstream news coverage, another of conspiracy theories, another of extremist groups. Those who began in one cluster would tend to stay there.
Instead, the YouTube recommendations bunched them all together, sending users through a vast, closed system composed heavily of misinformation and hate.
Viewers who come to YouTube for down-the-middle news may quickly find themselves in a world of extremists, Mr. Serrato said.
“That’s what I found bizarre,” he said. “Why are they so close together, unless the aim is to trigger a reaction?” Content that engages viewers’ emotions or curiosity, he suspected, would hook them in.