It is always useful to remember that critical thinking, like any skill, does not come naturally to everyone and can require strategies and practice. This is especially true because we face barriers to critical thinking on a regular basis, such as our own biases and assumptions. However, being aware of those biases and assumptions brings us one step closer to overcoming them. Here are just a few examples that we may encounter as researchers.
Logical Fallacies – Errors in our reasoning
- Ad Hominem Fallacy: This is when we attack the person rather than addressing their argument or conclusion.
As researchers, this may happen when: We have had a personal disagreement with a fellow researcher or colleague, but instead of settling it, we become unfairly over-critical of their research findings at their next presentation.
- Single Cause Fallacy: This is when we jump to a conclusion or generalisation based on a single instance or too little evidence.
As researchers, this may happen when: We over-attribute one research finding as being able to explain a complex, multifaceted phenomenon.
- Bandwagon Fallacy: This is when we agree with or assume something is true because it’s in accordance with the beliefs of the majority.
As researchers, this may happen when: We fail to scrutinise a study, a theory or a methodology because everyone in our research group favours it.
Graphic: Illustrated example of single cause fallacy, source: https://crankyuncle.com/cranky-uncle-cartoons/.
Heuristics – Mental shortcuts that can result in cognitive biases
- Availability Heuristic: When we assume or favour something based on information that is more easily accessible to us or comes to mind more quickly.
As researchers, this may happen when: Rather than evaluating the best methodology for answering our research question, we choose the methodology that is most convenient to use.
- Familiarity Heuristic: This describes our tendency to prefer familiar information over novel information, and to regard the familiar as more likely to be better or truer.
As researchers, this may happen when: We choose research for our literature review that we have already read before, rather than searching for new pieces of research.
- Representativeness Heuristic: This happens when we make judgments or categorise things based on how similar they are to what we already know.
As researchers, this may happen when: We do not have enough information about a subject matter to make an informed conclusion, so we rely on stereotypical judgments.
Cognitive Biases – Deviation from rational judgment
- Confirmation Bias: This is when we selectively seek or favour evidence that aligns with our existing beliefs.
As researchers, this may happen when: We decide to not acknowledge any findings or data that is not consistent with our predictions, in order to make it look like the findings were all as predicted.
- Hindsight Bias: This involves over-estimating, after an event has occurred, how predictable that event was all along.
As researchers, this may happen when: We claim that everything we found in our study’s results was what we had predicted all along, even when that is not 100% true!
- Fundamental Attribution Error: This describes our tendency to blame individuals – others or even ourselves – when things go wrong, rather than considering the situational factors at play.
As researchers, this may happen when: We blame a co-author, a colleague, or even ourselves for a paper being rejected or an experiment not going to plan – when there may be perfectly valid reasons for those outcomes that are not attributable to any one person.