Barriers to critical thinking

It is always useful to remember that engaging in critical thinking, like any skill, does not come naturally to everyone and can require strategy and practice. This is especially true because we regularly face barriers to critical thinking, such as our own biases and assumptions. Being aware of these biases and assumptions, however, brings us one step closer to overcoming them. Here are just a few examples that we may encounter as researchers.

Logical Fallacies – Errors in our reasoning

  1. Ad Hominem Fallacy: This is when we attack the person rather than addressing their argument or conclusion.
    As researchers, this may happen when: We have had a personal disagreement with a fellow researcher or colleague, and instead of resolving it, we become unfairly over-critical of their research findings at their next presentation.
  2. Single Cause Fallacy: This is when we attribute a complex outcome to a single cause, or jump to a conclusion or generalisation based on too little evidence.
    As researchers, this may happen when: We over-attribute one research finding as being able to explain a complex, multifaceted phenomenon.
  3. Bandwagon Fallacy: This is when we agree with or assume something is true because it’s in accordance with the beliefs of the majority.
    As researchers, this may happen when: We fail to scrutinise a study, a theory or a methodology because everyone in our research group favours it.

Graphic: Illustrated example of single cause fallacy, source: https://crankyuncle.com/cranky-uncle-cartoons/.

Heuristics – Mental shortcuts that can result in cognitive biases

  1. Availability Heuristic: When we assume or favour something based on information that is more easily accessible to us or comes to mind more quickly.
    As researchers, this may happen when: Rather than evaluating the best methodology for answering our research question, we choose the methodology that is most convenient to use.
  2. Familiarity Heuristic: This describes our tendency to prefer familiar information over novel information, and to regard the familiar as more likely to be better or truer.
    As researchers, this may happen when: We choose research for our literature review that we have already read before, rather than searching for new pieces of research.
  3. Representativeness Heuristic: This happens when we make judgments or categorise things based on how similar they are to what we already know.
    As researchers, this may happen when: We do not have enough information about a subject matter to make an informed conclusion, so we rely on stereotypical judgments.

Cognitive Biases – Deviation from rational judgment

  1. Confirmation Bias: This is when we selectively seek or favour evidence that aligns with our existing beliefs.
    As researchers, this may happen when: We decide not to acknowledge any findings or data that are inconsistent with our predictions, in order to make it look as though the findings were all as predicted.
  2. Hindsight Bias: This involves believing, after an event has happened, that we could have predicted it all along.
    As researchers, this may happen when: We claim that everything we found in our study’s results was what we had predicted all along, even when that is not 100% true!
  3. Fundamental Attribution Error: This describes our tendency to blame individuals, whether others or ourselves, when things go wrong, rather than considering the situation objectively.
    As researchers, this may happen when: We blame a co-author, a colleague, or even ourselves for a paper being rejected by a publisher, or for an experiment not going to plan, when there may be very valid reasons for those outcomes that are not linked to any one person.

It is quite common and very easy to engage in the above thinking patterns. We are all susceptible to making errors in our reasoning in certain situations. Mental shortcuts are convenient when you are constantly inundated with information, and you may not always have the time or mental resources to make more informed and reflective decisions. And the cognitive biases that occur as a result of those shortcuts we take can make rational judgment that much more difficult.

As researchers, we need to be especially careful of these thinking patterns, as they can compromise the objectivity, reliability and validity of our research process. They can also lead to faulty techniques and inaccurate, unjustified conclusions. So what can we do to make sure we are fair and objective in our research process? Here are some suggestions for how we can overcome these thinking patterns and better apply critical thinking:

  1. Exercise intellectual courage: This is one of the intellectual traits from the Paul-Elder framework covered in Section 1.2. It involves being brave enough to question the information you are presented with, question your own beliefs, and encourage others to challenge you. With intellectual courage, if you have a strongly held belief that the evidence does not support, you are able to loosen that belief. We can also take inspiration from intellectually courageous role models like Galileo and Sojourner Truth.
  2. Invite doubt: Taking something at face value, without investigating it yourself first, can lead to faulty, misguided conclusions. When we make decisions or judgments quickly, without the chance to gather evidence, we can easily succumb to our mental shortcuts and cognitive biases. Inviting doubt, and slower, more reflective thinking based on evidence, can save you from making irrational, ill-informed decisions.
  3. Be comfortable with contradiction: Sometimes, especially as researchers, we become preoccupied with providing answers and simple truths. A critical thinker understands that the world is complex and ever-changing, and that we need to be able to change our minds in light of new evidence. This can happen either when our own research presents us with contradictory findings, or when a new piece of evidence contradicts a viewpoint we hold ourselves. The psychological discomfort this causes can even result in further erroneous reasoning, or cognitive dissonance. But we can become comfortable with contradiction, even holding contradictory beliefs, and we can admit that we do not have all the answers, because having them all is impossible!

At the beginning of this resource, in Section 1.1, we covered the history of critical thinking. We now return to the origins of critical thinking and to Socratic questioning. Here is an infographic that you can download as a PDF and use whenever you want to apply the Socratic method of inquiry to unlock your critical thinking.

Activity: Remember the critical thinking self-evaluation spreadsheet at the beginning of this resource? You can either complete the self-evaluation again, or review your scores to see how you feel now in relation to the same statements. Do you feel you have developed your critical thinking skills in some way since starting the resource? Do you feel there is still room for improvement? Perhaps you can make a note of what you want to work on. As individuals who strive to become better critical thinkers, we are constantly self-evaluating and finding new ways to improve. Identify 3 key areas of improvement, or action points, that you will aim to work on going forward. Good luck incorporating critical thinking into the rest of your research journey!

We value your feedback, and would appreciate it if you could take 2 minutes to share your thoughts on this resource.
