Stopping Online Vitriol at the Roots

This article is part of the On Tech newsletter. You can sign up here to receive it weekdays.

America, it’s one day before a pivotal election, and we’re awash in a lot of garbage information and online vitriol. It comes from strangers on the internet, scammers in our text messages, disreputable news organizations and even our friends and family.

Whitney Phillips, an assistant professor in the department of communication and rhetorical studies at Syracuse University who writes about polluted information, says that all of this is making our brains go haywire.

With the U.S. election ginning up misleading information and the nonstop political discussions online wearing many of us out, I spoke to her about how we can individually and collectively fight back. Here are edited excerpts from our discussion:

You’ve written that angry conversations online and misleading information essentially short-circuit our brains. How?

When our brains are overloaded, and we’re confronted constantly with upsetting or confusing information, it sends us into a state in which we’re less capable of processing information. We say things we probably shouldn’t. People get retweet-happy. It’s not productive, even when people have good intentions and think they’re helping.

How do we stop that process?

I’ve been researching how mindfulness meditation processes can help us navigate this information hellscape. When you see or read something that triggers that emotional reaction, take a moment to breathe and try to establish some emotional space. It doesn’t mean you shouldn’t say the critical thing you’re thinking, but you should first reflect on the most constructive thing to do next.

But we don’t tend to think that we’re the ones acting irresponsibly or irrationally. We think the people who disagree with us are irrational and irresponsible.

Most people think if they’re not setting out to do damage or don’t have hate in their hearts, then they don’t have to consider what they do. But even if we aren’t vicious ourselves, we’re still fundamentally a part of what information spreads and how.

We all affect the ecology around us. Bad actors like high-profile influencers can scar the land, but everyone else does, too. The more information pollution there is in the landscape, the less functional our democracy is. If you feel that everything is terrible and everyone lies, then people don’t want to engage in civic discourse.

This imposes a lot of personal responsibility on a problem that is much bigger than us as individuals.

Yes, individual solutions are not enough. We all can make better choices, but that means nothing if we’re not also thinking about structural, systemic reasons that we’re forced to confront bad information in the first place.

What are those structural forces? What can be done to make the information environment better at the structural level?

For us to understand how bad information travels, we have to think about all the forces that contributed to it — decisions made by the internet platforms, broader capitalist forces, local and national influences. And that includes you. All of them feed into each other.

Part of the problem is that people haven’t understood how information travels online, or how the recommendation algorithms of social media companies influence what we see. If people understand, they can imagine a different world, and they can fight to change the system.

I’m tempted to unplug the internet and go live in a cave. Should I?

We need to find a way to balance between evacuating from the hurricane and running toward the hurricane. If we only evacuate, we’re not doing our part as citizens, and we force people on the informational front lines to bear that burden. If we only run toward the storm, we’ll burn out.

SEND US YOUR QUESTIONS: We want to hear your election tech questions. What are you curious or concerned about related to how tech companies are handling election-related misinformation, or how secure America’s election technology is?

Send your questions to ontech@nytimes.com, and we’ll answer a selection. Please include your full name and location.


As Whitney Phillips said, all of us have a role to play in reducing the spread of garbage information. Brian X. Chen, a consumer technology columnist for The New York Times, walks us through how to handle a particularly nefarious form of misinformation: the false or misleading meme.

Misinformation comes in many forms, but one big culprit to look out for this election season is the meme, which is typically a photo or screenshot with text superimposed over it.

Memes are dangerous because it takes only a few seconds for someone to create one and share it on social media. And it’s easy for images to be doctored and for quotes to be ripped out of context.

So think twice before you re-share a meme — and when in doubt, check the source. A quick way to do that is to look into the origins of the image with Google’s reverse image search tool.

Here’s how: On Google.com, click Images in the top right-hand corner of the page, then upload the photo or paste the web address of the photo into the search bar. That will show where else the image has appeared on the web, which can help you verify whether the version you have seen has been manipulated.

Beyond that, keep these three questions in mind when you’re scrolling through news articles and social media posts related to the election:

Mindfulness will be one of our best weapons against the spread of misinformation this election.

“The No. 1 rule is to slow down, pause and ask yourself, ‘Am I sure enough about this that I should share it?’” Peter Adams, a senior vice president at the News Literacy Project, a media education nonprofit, told me. “If everybody did that, we’d see a dramatic reduction of misinformation online.”


  • Holding back the misinformation tide: My colleagues write about what Facebook, Twitter and YouTube are doing during and after Election Day — like banning political ads or flagging that no presidential winner had been chosen until results are verified — to clamp down on election-related falsehoods and highlight accurate and helpful information.

  • Counterpoint: Fears about misinformation are overblown: Slate writes that while false or misleading information may entrench existing political and social divides, there’s not much evidence that it sways voters’ attitudes or behaviors. Instead, it states that people’s beliefs and choices at the polls are largely shaped by their social identities around race and class, gender, geographic location and religion.

  • Career development in 60-second video bites: Career coaches are offering advice about writing résumés, finding a job and more on TikTok. It’s a low-cost, accessible alternative to what are often expensive consulting services, Yasemin Craggs Mersinoglu writes for The Times.

Newborns in the intensive care unit at the Children’s Hospital of Philadelphia were dressed up in Halloween costumes as Apollo Creed from the “Rocky” movies, a Subway sandwich and a little pig in a blanket. The hospital said it organized this to give the families a moment of normalcy.


We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at ontech@nytimes.com.

If you don’t already get this newsletter in your inbox, please sign up here.