Here are the first few paragraphs of an entry on Judith Curry’s blog. Curry is a respected scientist who is criticized by some for her work and views when they don’t conform to climate-change orthodoxy. Her bio: “I am President (co-owner) of Climate Forecast Applications Network (CFAN). Previously, I was Professor and Chair of the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology.”
This blog post isn’t about forestry or climate change specifically, but about science and research in general. She discusses an article whose authors write that “Scientists are not immune to confirmation biases and motivated reasoning.” Neither, at times, are foresters, environmentalists, and others who post on this blog, myself included. Interesting insights! The entire post is worth reading and discussing.
Posted on June 19, 2019 by curryja | 99 Comments
by Judith Curry
Insights into the motivated reasoning of climate scientists, including my efforts to sort out my own biases and motivated reasoning following publication of the Webster et al. (2005) paper
A recent twitter thread by Moshe Hoffman (h/t Larry Kummer) reminded me of a very insightful paper by Lee Jussim, Joe Duarte and others entitled Interpretations and methods: Towards a more self-correcting social psychology.
Despite its rather innocuous title, the paper provides massively important insights into scientific research in general, with substantial implications for climate science.
The Jussim et al. paper is the motivation for this blog post, which addresses the motivated reasoning of individual climate scientists. It is also the motivation for my next post, which will address the broader ‘masking’ biases in climate science.
<begin quote>
“Getting it right” is the sine qua non of science. Science can tolerate individual mistakes and flawed theories, but only if it has reliable mechanisms for efficient self-correction. Unfortunately, science is not always self-correcting. Indeed, a series of threats to the integrity of scientific research has recently come to the fore across the sciences, including questionable research practices, failures to replicate, publication biases, and political biases.
Motivated reasoning refers to biased information processing that is driven by goals unrelated to accurate belief formation. A specific type of motivated reasoning, confirmation bias, occurs when people seek out and evaluate information in ways that confirm their pre-existing views while downplaying, ignoring, or discrediting information of equal or greater quality that opposes their views. People intensely scrutinize counter-attitudinal evidence while easily accepting information supporting their views. People generate convincing arguments to justify their automatic evaluations, producing an illusion of objectivity.
Scientists are not immune to confirmation biases and motivated reasoning. Values influence each phase of the research process, including how people interpret research findings. Reviewers’ theoretical and ideological views can influence their evaluation of research reports, leading them to judge studies that oppose their beliefs more critically than studies supporting their views. Consequently, they are then less likely to recommend publication of studies with undesired findings or funding for studies based on undesirable theories or hypotheses.
Scientists face powerful incentives to present a strong, compelling story when describing their research. Most of us are motivated to get the science right, but we are also motivated to get the studies published and our grants funded. We want our colleagues to find our research sufficiently interesting and important to support publishing it, and then to cite it, preferably a lot. We want jobs, promotions, and tenure. We want popular media to publicize our research and to disseminate our findings beyond the confines of our lab. We might even hope to tell a story so compelling we can produce a bestselling popular book and receive lucrative consulting and speaking engagements, or have our findings influence policy decisions.
In brief, powerful incentives exist that motivate us to achieve — or, at least, appear to achieve — a “Wow Effect”. A “Wow Effect” is some novel result that comes to be seen as having far-reaching theoretical, methodological, or practical implications. It is the type of work likely to be emulated, massively cited, and highly funded.
….