I have a sinking sense that schools don’t teach much “evidence-based thinking” anymore. They do teach critical thinking, which is related, but students remain remarkably poor at defending propositions and at recognizing which thoughts and beliefs are worth holding. (There is, at least, a blog site devoted to evidence-based thinking run by two energetic young fellows.) This deficiency spills over into the deliberative process, because many citizens and political activists suffer from some of the same weaknesses. For some time now we have watched the influence of Enlightenment thinking and science diminish. From religious extremists to Tea Party members, there is plenty of anti-rationalist thinking and pseudo-intellectual discourse.
But things get worse. There is a clear disdain for logic and reasoning in some circles, along with a toxic dependence on popular culture. This is not a particularly new phenomenon: Richard Hofstadter’s Anti-Intellectualism in American Life, published in 1963, began to chart the decline of a principled, evidentiary intellectual culture and its replacement – the persuasion of single-minded people tenaciously holding onto a belief, opinion, or feeling and doing nothing but looking for support for that belief rather than for its improvement, accuracy, or truth value. I term this backside of evidence-based thinking “confirmatory thinking,” in honor of the confirmation bias – the tendency to favor information that confirms what one already believes. I count four ways in which thinking falls short of being evidence-based.
The first is failing to distinguish coincidence from causality – a difference that is sometimes not completely clear but is important to understand. Those who oppose vaccinations believe that all drugs have negative effects and that any new vaccine will be the same, thereby confirming their impoverished knowledge of medicine. They mistrust government, and consequently any government program – even a highly evidence-based program that saves children’s lives – is rejected. The consistency of their own beliefs matters more to them than the evidence supporting the value of vaccination.
A second sign of diminished capacity for evidence is a failure to understand how science and other methods for making decisions actually work. The history of knowledge is strewn with failures and disappointments. Even studies that are unsuccessful or wrongheaded yield a certain amount of information that is still of scientific value. It is comparable to the quip attributed to Edison that, after failing 200 times to make a light bulb, he was not frustrated because he had learned 200 ways not to make one. Evidence-based thinking requires a developmental, evolutionary attitude toward the unfolding of better and more precise information. Even when information is wrong and leads you down wasteful paths, those paths are part of the process.
A third issue is a basic misunderstanding of the logic of research. I’m not talking about formal logic or sophisticated mathematics but about basic principles of research and of conclusions drawn from quantitative data: what it means for something to have a mean (average) and variation around that mean, or a sense of why and how numbers are influenced and change over time. This includes the logic of the experiment and the quality of conclusions when conditions are controlled and only a single experimental factor could have caused the variation.
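The point about a mean and the variation around it can be made concrete with a minimal sketch in Python (the classroom scores below are invented for illustration): two groups can share exactly the same average while telling very different stories once the spread around that average is examined.

```python
import statistics

# Hypothetical test scores for two classrooms.
class_a = [70, 71, 69, 70, 70]   # everyone scores close to the mean
class_b = [40, 100, 55, 85, 70]  # same mean, wildly different students

mean_a = statistics.mean(class_a)
mean_b = statistics.mean(class_b)
sd_a = statistics.stdev(class_a)  # small spread around the mean
sd_b = statistics.stdev(class_b)  # large spread around the mean

print(f"Class A: mean={mean_a}, sd={sd_a:.1f}")
print(f"Class B: mean={mean_b}, sd={sd_b:.1f}")
```

Both classes have a mean of 70, but the standard deviations differ enormously – reporting the average alone would hide the fact that one class is uniform and the other is split between strong and struggling students. That is the kind of basic quantitative literacy the paragraph above has in mind.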
Lastly, one of the easiest ways to never get out of your own head and to hold fast to wrongheaded beliefs is to dispute, challenge, and dismiss those who are credible – in other words, to care more about maintaining your own consistency than about learning from experts and those more knowledgeable. We cannot all be experts on scientific and political matters, so we must often rely on the expertise of others. And although challenging and checking the credentials of others to ensure source reliability is an important critical stance, it is not the same as knee-jerk rejection of experts. It seems as if those on the conservative end of the spectrum are quick to label environmental, climate, or financial science as biased because they inherently mistrust the source of any information, and they gravitate toward rejecting whatever is inconsistent with their own ideas rather than truly exploring and integrating new information.
There are all sorts of ways to distort information or disengage from it, and even the most conscientious thinker lets biases creep in. But the attitude and willingness to engage, to integrate new information on the basis of sound evidentiary principles, and to change as a result of that evidence is what makes for more rigorous thinking.