How to be a voice of reason in an age of misinformation

February 9, 2022

You’ve heard the myth, in one form or another: Clean energy is unreliable. At this time last year, Texas Gov. Greg Abbott and others hawked a version of that theme when winter storms plunged the state into deadly, widespread power outages. The problem, you’d hear if you were watching Fox News, was “a reckless reliance on windmills.”

Explainer after explainer debunked the false message that frozen wind turbines drove the Texas grid’s disastrous failure—several factors, including lost generation from natural gas and lack of winterized equipment, played a role. As the state faced more icy weather this month, Abbott shied away from bashing wind again—but he’d already provided fodder for others interested in undermining renewable energy momentum.

Even a modest amount of complexity seems to invite the spread of false information. The U.S. electric grid, and the infrastructure that serves it, is run by an alphabet soup of decision-makers, from the local level on up: IOUs and POUs, RTOs and ISOs, PUCs, FERC… don’t get us started! 


So it’s not surprising to see people gravitate toward easy answers when a crisis hits, even if those answers are wrong. Climate-related issues can be scary, and there are plenty of unknowns, so it’s not hard to find all kinds of myths about climate change.

But on another level, it’s simple: We know what’s causing climate change, and we know what to do about it. How do we cut through fear-based falsehoods and stick to the facts? The answer isn’t simply to push our glasses up on our noses and provide a correction.

“Fact and ‘alternative fact’ are like matter and antimatter. When they collide, there’s a burst of heat followed by nothing,” writes communications researcher John Cook, adding that people will just lose faith in facts. “Fittingly, science holds the answer to science denial.”

Here’s a look at the science on why misinformation is so rampant these days, as well as what tools we have to deal with it. 

Why is this happening?

Brendan Nyhan, a political scientist at Dartmouth College, has outlined three reasons why truth seems to be going off the rails, and they aren’t likely to surprise you. One, misinformation thrives in a polarized society where people feel the need to “ingroup,” or identify with a particular tribe. “Greater partisan divisions in social identity,” Nyhan has written, “have seemingly increased the political system’s vulnerability to partisan misinformation.”

Two, this desire for identity-affirming messages attracts political and media personalities who will say what they know polarized people want to hear, regardless of whether it is true. And finally, social media amplifies and reinforces false beliefs, rewarding the purveyors with approval and/or attention.




The waters become even murkier when you consider how much vested economic interest there is in the status quo. Attempts to downplay, or outright deny, the negatives of fossil fuel dependence have been happening since way before Facebook was even a twinkle in Mark Zuckerberg’s eye.

Another contributor to misinformation? The Dunning-Kruger effect, which describes how people get way overconfident about what they know. “The first rule of the Dunning-Kruger club is you don’t know you’re a member of the Dunning-Kruger club,” says psychology professor David Dunning in this entertaining Q&A about his work.

What can we do about it?

This may be the least appealing advice of the lot, but when it comes to countering misinformation, we would do well to start by looking at ourselves. “Think about what you don’t know. That is, check your assumptions,” Dunning counsels. “Be a little bit more careful about what pops out of your head or what pops out of your mouth.” 

He points to the “superforecasters” chronicled by University of Pennsylvania psychologist Philip Tetlock. “The people with the best judgment,” Tetlock has found, are “gathering evidence from a variety of sources, thinking probabilistically, working in teams, keeping score, and being willing to admit error and change course.” Public accountability, he adds, also boosts forecasting performance: Even the most opinionated types will become more cautious if they know their accuracy will be scrutinized against others.

We all possess the ability to scrutinize our own assumptions, and exercising it can keep bad information from spreading. A 2021 paper published in the journal Nature found that people generally don’t think about accuracy when deciding whether to share an article online, but that “subtly shifting attention to accuracy increases the quality of news that people subsequently share.”

Too many times, though, an inaccurate or misleading narrative does get shared—a lot. In those cases, it’s better to “redirect” a myth than to argue with it head on, according to a 2018 study from Princeton University. That research showed people can be swayed by hearing repeated, related truths that effectively replace the previous bad information. Their example: “If a policymaker wants people to forget the inaccurate belief that ‘Reading in dim light can damage children’s eyes,’ they could instead repeatedly say, ‘Children who spend less time outdoors are at greater risk to develop nearsightedness.'” (You could argue the news these days basically refutes the idea that “global warming is a hoax” with headline after headline about the “related truths” of wildfires, 100-year weather events, floods, etc.)

But this tactic only holds true when people are on the fence about an idea. A false belief, once it is deeply held, is difficult to dislodge. At a certain point, combating misinformation becomes less about changing minds and more about speaking up. As discussed in our earlier post on six ideas for talking about climate change, standing up for facts is about two things: providing solid sources for anyone who is actively evaluating information and affirming that false narratives should not go unchallenged.


Another idea to prevent misinformation from taking hold, inoculation theory, suggests that we can use aspects of fake stories to “prebunk” them. Researcher Cook explains, “Inoculating text requires two elements. First, it includes an explicit warning about the danger of being misled by misinformation. Second, you need to provide counterarguments explaining the flaws in that misinformation.” Cook and his co-authors describe this approach in the Debunking Handbook 2020—the idea is essentially to forewarn people about common misinformation tactics, such as trotting out fake experts to endorse a false notion. (Cook also provides a handy chart showing how to neatly refute 50 common climate myths, in part by pointing to the logical flaws in them.)

Finally, don’t forget the power of humor. “Greentrolling,” for instance, means relentlessly calling out tone-deaf corporate campaigns on social media, and laughing in the process. “Climate action often looks depressing and sad, but this is fun!” said “Godmother of Greentrolling” Mary Heglar in The Washington Post.

Humor is also a way to open up truths that often feel intimidating, notes Esteban Gast, who hosts our new podcast, Comedians Conquering Climate Change. Humor can invite people into a conversation. And when people feel invited in, Gast says, they are inspired to learn more. “If you’re going to make change,” he says, “it is going to be easier with a little bounce in your step. With, like, a little small smile on your face. That is an easier way to fight this fight.”