After examining more than 40 common climate change myths pushed by those who are hell-bent on discrediting scientific conclusions about the global crisis, three researchers teamed up to create a six-step critical thinking tool that helps people combat misinformation by "neutralizing" the lies.
John Cook, Peter Ellerton, and David Kinkead detailed their strategy in "Deconstructing climate misinformation to identify reasoning errors," published Tuesday in Environmental Research Letters. The researchers also released a video demonstrating what battling climate crisis lies can look like in everyday life.
"We offer a strategy based on critical thinking methods to analyze and detect poor reasoning within denialist claims," the paper explains. "This strategy includes detailing argument structure, determining the truth of the premises, and checking for validity, hidden premises, or ambiguous language."
Step 1: Identify the claim being made. For example, the most popular contrarian argument: "Earth's climate has changed naturally in the past, so current climate change is natural."
Step 2: Construct the argument by identifying the premises leading to that conclusion. In this case, the first premise is that Earth's climate has changed in the past through natural processes, and the second premise is that the climate is currently changing. So far, so good.
Step 3: Determine whether the argument is deductive, meaning its conclusion is presented as following definitively from its premises rather than merely being made more probable. In our case, "current climate change is natural" qualifies as a definitive conclusion.
Step 4: Check the argument for validity; does the conclusion follow from the premises? In our example, it doesn’t follow that current climate change must be natural because climate changed naturally in the past. However, we can fix that by weakening the conclusion to "the current climate change may not be the result of human activity." But in its weakened state, the conclusion no longer refutes human-caused global warming.
Step 4a: Identify hidden premises. By adding an extra premise to make an invalid argument valid, we can gain a deeper understanding of why the argument is flawed. In this example, the hidden assumption is "if nature caused climate change in the past, it must always be the cause of climate change." Adding this premise makes the argument logically valid, but also makes clear why the argument is unsound: it commits the single-cause fallacy, assuming that only one thing can cause climate change.
Step 5: Check whether the argument relies on ambiguity. For example, the argument that human activity is not necessary to explain current climate change, because both natural and human factors can cause climate change, trades on an ambiguity in the phrase "climate change." Not all climate change is equal: the current rate of change is more than 20 times faster than natural climate changes. Once that ambiguity is resolved, human activity is necessary to explain current climate change.
Step 6: If the argument hasn't yet been ruled out, determine whether its premises are true. In our example, the hidden premise that "if something caused climate change in the past, it must be the cause now" is false, because the effect has multiple plausible causes and mechanisms (as climate change does). This is where the myth most obviously falls apart (although it had already failed in Step 4).
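The validity check at the heart of Steps 4 and 4a can be made concrete: an argument is deductively valid exactly when no assignment of truth values makes all premises true and the conclusion false. The sketch below (a hypothetical illustration, not code from the paper) encodes the contrarian argument this way and shows that it only becomes valid once the false hidden premise is added.

```python
from itertools import product

def valid(premises, conclusion, variables):
    """An argument is deductively valid iff no assignment of truth values
    makes every premise true while the conclusion is false."""
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False  # counterexample found: premises true, conclusion false
    return True

# Hypothetical encoding: N = "nature caused past climate change",
# C = "climate is currently changing", X = "current change is natural".
variables = ["N", "C", "X"]

# The myth as stated (Step 4): premises N and C, conclusion X.
myth = valid([lambda e: e["N"], lambda e: e["C"]],
             lambda e: e["X"], variables)

# With the hidden premise "if nature caused change before, it is always
# the cause" (N implies X), the argument becomes valid (Step 4a) --
# but that added premise is itself false (Step 6).
patched = valid([lambda e: e["N"], lambda e: e["C"],
                 lambda e: (not e["N"]) or e["X"]],
                lambda e: e["X"], variables)

print(myth)     # False: the conclusion does not follow from the premises
print(patched)  # True: valid only once the false hidden premise is added
```

The truth-table enumeration is exhaustive over the three variables, so the counterexample in the first case (nature caused past change, climate is changing, yet current change is not natural) is found mechanically, mirroring the reasoning in Step 4.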
The paper notes that "social media presents one potent option" for deploying their strategy, as does the classroom. Acknowledging "there is in general a dearth of misconception-based learning resources for educators"—particularly when it comes to climate education—the paper emphasizes "this research is designed to act as a building block for developing educational material that teaches critical thinking through the examination of misinformation and evaluation of arguments."
"This approach is practical, achievable, and potentially impactful in both the short-term (e.g., in social media applications) and long-term (incorporating this kind of content into curriculum)," Cook, the lead author, told the Guardian. "Misinformation needs short, sharp, immediate inoculation. Our paper provides a blueprint into how to write these inoculations."