Welcome to part 5 of my series on cognitive bias, the traps that any rational, critical-thinking individual must be aware of in order to avoid falling into unsound thinking patterns. If you missed the first four parts of the series, check the last four Mondays and you’ll find them.
It’s really a shame that so many atheists, so-called critical thinkers and skeptics, fall into these traps so often. Many times they are simply unaware that these biases exist; other times, they are programmed by their worldviews to ignore them entirely because the biases get in the way of a favored emotional conclusion.
Therefore, please read through this set of 10 cognitive biases and see if you can find them playing havoc in your own mind. The only way to be rational is to recognize your own irrational failings.
Pro-Innovation Bias – This is very common; I see it among liberals and so-called “progressives” a lot. It is the belief that a social innovation should be accepted by society immediately, without question or alteration, because its supporters see it as important and imperative. Those behind an idea tend to be blind to its weaknesses or limitations and therefore push it regardless of how well it might actually work.
Procrastination – We all know what this is: the tendency to put off less pleasurable tasks in order to avoid hard work or undesirable activities as long as possible. However, there is a cognitive element here as well; we often put off difficult thought processes, or avoid thinking about uncomfortable things, for as long as possible. Unfortunately, this doesn’t stop us from making decisions without considering all of the factors; we simply make them ignorantly, based on incomplete evidence or ideas uncritically accepted. We may also put off making a decision as long as we can, exacerbating other factors and making the process more difficult in the long run.
Reactance – When people feel they are being pushed to accept a particular idea or concept as true, they may overreact and adopt the opposite position simply out of obstinacy, rather than because they are convinced that the opposite position is actually valid. Many people who understand reverse psychology attempt to exploit this bias by pushing the position they do not favor in hopes that the individual will adopt the position that they do.
Recency – An idea encountered recently will be favored over one encountered less recently. As ideas are encountered less and less often, they become less favored because they are not reinforced in the mind. This has nothing whatsoever to do with whether the idea is valid; we’re just programmed to think that ideas we encounter consistently are more important. This may be why religion relies so heavily on memorization and recitation: it keeps reinforcing the same ideas over and over again so that the individual automatically sees those ideas as true.
Reciprocity – In the cognitive sense, this is the belief that other people will think in similar ways: if you find a particular idea or belief compelling, then everyone else ought to as well. This is a very serious problem that I see constantly, people acting as though an idea must be true simply because they like it and everyone ought to think exactly the way they do. Ideas are not true because you favor them; they stand or fall on their own merits, on independent evidence and critical evaluation of their validity.
Regression Bias – Also known as the regression fallacy, this is the failure to account for natural regression toward the mean: extreme results tend to be followed by more ordinary ones, yet people assume a trend will continue simply because it is a trend. This ignores the natural fluctuations that occur in any system, which throw claims of tendencies out of balance. Trends only show where things may go, not where they absolutely will go.
Restraint Bias – Restraint bias is the tendency for people to overestimate their ability to control their impulses, such that they are convinced they can avoid the potential downsides of actions or beliefs because they think they can override those impulses better than others can. Physical examples include addictive behaviors such as drug use, alcohol, or smoking, where individuals think they could never become addicted and therefore believe it is safe to experiment. The same applies to psychologically addictive behaviors, where people may underestimate how likely they are to be negatively affected.
Salience – In the realm of cognition, salience refers to things that stand out from those around them being given extra attention, just because they stand out. Some people will give extra importance to an object or an idea just because it differs in some way from its neighbors, even if that difference has no bearing on the question being considered. This tends to lead people astray by focusing on the variation instead of on the actual content. I see this in Christians who claim that Christianity is special because it is “different” in a few minor ways from other religions, while ignoring the overwhelming majority of ways in which it is identical. They will also entirely refuse to acknowledge differences between other religions and their own that might lead adherents of those religions to claim their beliefs are special.
Seersucker Illusion – First described by J. Scott Armstrong, who summarized it as, “No matter how much evidence exists that seers do not exist, suckers will pay for the existence of seers.” People want to believe there are others who can provide information about the future or reassure them that their actions will lead to positive results. This isn’t just about psychics and other hucksters; it also applies to stock analysts and political pundits whose track records are really no better than random chance, they just present themselves as professional prophets for a fee.
Selective Perception – Again, this is extremely common: the tendency to ignore or quickly forget stimuli that cause emotional pain or contradict existing beliefs. This is very true in religion, politics, and social belief structures; an individual may notice in the media those things that support their pre-existing beliefs and forget everything that disagrees with them. Many psychologists think this bias is automatic in the brain, but that doesn’t mean it isn’t hazardous to critical thinking. We have to be careful to see all of the evidence and evaluate it rationally, regardless of whether it supports our views or not. To do otherwise is irrational.
And there you have it, another ten cognitive biases to be aware of and to consciously avoid. Next week, I’ll present the last seven biases and wrap this series up. Do you remember them all? Have you noticed yourself being affected by any of them? Let me know in the comments.