It’s hard to deny that, over the last two years, we have all become more directly acquainted with conspiracy theories. According to CNN, we’re living in the “golden age of conspiracy theories” – although some scholars argue that they’ve always been around, and that it’s just technology and journalism that are drawing more attention to them.
But why do we get the feeling that they are so widespread that even our relatives end up believing them? A relatively new branch of psychological research might have the answer. Several researchers have found that people who commit certain errors in logic are more likely to believe conspiracy theories. These mistakes are called cognitive biases. A cognitive bias is a systematic error – one that occurs predictably, under certain conditions – that people commit while reasoning or making predictions and choices. It’s a human flaw: it belongs to all of us.
There are several types of cognitive biases, but let’s consider an example. For simplicity, suppose one has to choose between two possible explanations for the COVID-19 pandemic:
The most evidence-based and most accredited explanation so far posits that the virus comes from a spillover: it probably passed to humans from bats or another mammal. The other explanation posits that the virus is a bioweapon, manufactured to force people to get vaccinated and to depopulate the planet – which, in all seriousness, is a well-known conspiracy theory, called The Zero Carbon Solution theory.
Now, if someone is patient enough to gather information from scientific sources, and they trust those institutions, they will probably support the spillover hypothesis, while keeping other possible explanations open. But someone who doesn’t meet these requirements (because of limited cognitive resources or a lack of trust) may well end up supporting the bioweapon hypothesis because of a phenomenon called proportionality bias: the tendency to think that big events must have big causes. It’s undeniable that the pandemic has had a disruptive impact on our lives on a global scale, and for those who make this cognitive mistake, it’s hard to believe that such a huge catastrophe happened just because of a small bat. It must be bigger than that. The psychologist Patrick Leman has found experimental evidence that conspiracy theorists tend to make this logical mistake.
But that’s not the only possible error contributing to such beliefs. One feature common to all conspiracy theories is agency: nothing happens randomly; everything happens because it was someone’s plan. The psychologist Karen M. Douglas and her colleagues have found that people who endorse conspiracy theories tend, on average, to over-attribute random or natural events to human intentionality. In our case, this cognitive mechanism would lead someone to endorse the bioweapon hypothesis rather than the spillover hypothesis. This phenomenon is called intentionality bias, and it’s consistent with the so-called Cheating Detection Theory, which posits that humans have a natural, adaptive ability to detect cheaters in society.
But imagine that the individual in our example has found out that, in 2018, Bill Gates said during a conference that the world needed an emergency plan to be prepared for a pandemic. Now they are aware of two unrelated facts: a) the existence of a pandemic that started at the end of 2019; b) the fact that Bill Gates was well aware of the global risks of a pandemic back in 2018. Don’t you feel the temptation to join the dots? Indeed, another common reasoning error that humans make is perceiving patterns and causal links where there are none. This phenomenon is known as the causal conjunction fallacy, which happens when two co-occurring but unrelated events are taken to be causally connected. In 2014, the psychologist Robert Brotherton found that conspiracy theorists are more susceptible to this kind of cognitive bias.
It must be noted that these pattern-perception mechanisms are not based on evidence, probabilities or data; they are driven instead by a judgement of similarity: the conspiracy theory is considered a more representative explanation of the pandemic, based on stereotypes we might already hold (such as racist prejudices, anti-establishment attitudes, and so on). This process is a mental shortcut known as the representativeness heuristic, proposed by the famous psychologists Daniel Kahneman and Amos Tversky.
Finally, another fundamental cognitive bias for conspiracy theorists is confirmation bias. While the first three biases we reviewed operate during the formation of beliefs, confirmation bias kicks in afterwards. It occurs when individuals systematically seek corroborating evidence consistent with their prior beliefs, thus solidifying their endorsement of conspiracy theories. It’s a well-known phenomenon that is very relevant to conspiracy theories, as it might make believers more immune to debunking and to facts.
Now, all of the biases reviewed so far are quite widespread (they’re supposedly universal). If it’s true that these mechanisms can lead people to believe conspiracy theories, then we would expect to see conspiracy theorists everywhere. Yet, despite the prevalence of these biases, that is not the case. To explain this, we need to take a couple of things into account. First, different studies show that there are individual differences in susceptibility to cognitive biases. Some people are simply more prone to commit such errors, depending on factors such as cognitive abilities, age, culture, and so on. Also, recent research suggests that, on average, people with anomalistic beliefs – not just conspiracy theories, but also paranormal, superstitious and pseudoscientific beliefs – tend to make more cognitive mistakes than others. Several studies show that all of these “epistemically suspect beliefs” are correlated. This suggests that it’s not simply a complex system in which beliefs mutually reinforce one another; rather, there seems to be a general underlying cognitive mechanism that makes certain individuals – those who rely more on intuitive than on analytic thinking – more prone to gullibility.
Second, susceptibility to cognitive biases can be mitigated, as several studies on debiasing have shown. This means that people are not bound to be endlessly biased: we can be trained to correct our mistakes. Simply asking individuals to reflect on why their initial judgements could be wrong (a strategy called “consider the opposite”) seems to mitigate cognitive biases, and there are much more sophisticated tools that serve the same purpose, including video games.
Nevertheless, the problem is that it’s not easy to interact with conspiracy theorists, especially the most radicalised ones. For them, it’s not simply a matter of logical errors in calculating probabilities: ideology plays a big part too.
Still, unless your uncle or aunt is already too far down the rabbit hole, you could try to talk them out of their biases – once you have recognised your own.
As humans, we are all susceptible, to varying degrees, to the biases that lead some to believe in conspiracy theories.
The post Could any one of us become a conspiracy theorist? appeared first on The Skeptic.