Why People Believe What They Believe Part 1

I talk a lot about stupid and irrational beliefs, about people who believe things for bad reasons and refuse to think critically or rationally about the ideas they allow into their heads, but it’s relatively rare that I look at the actual causes of these bad beliefs.  There are a lot of reasons why people give credence to these ideas, and I thought it was high time I took an extended look at many of them.  There’s an article over on Business Insider, 57 Behavioral Biases that Make Us Think Irrationally.  There isn’t much detail there, so I’m going to address all of them over a series of 6 posts, one for each of the next 6 Mondays, and try to spell out some of the problems and why we, as rational human beings, need to be aware of them and know how to avoid falling into their traps.

Attention Bias – Attention bias refers to our tendency to keep paying close attention to things we already think are important.  This is true of everyone, myself included.  To use myself as an example, I spend a considerable amount of time thinking about religious horrors because of the Religious Horror Show, so I tend to pay closer attention to such things than someone who doesn’t find the subject matter interesting.  That doesn’t mean religious horrors are somehow overblown just because I pay attention to them; clearly they exist in great quantities.

Availability Heuristic – This refers to our reliance on easily recalled examples, assuming that because we can remember something, it must be important.  It has the side effect of making us place more importance on recent events simply because they are fresh in our minds and more easily recalled.  This skews our perception of statistical averages: what we can most easily recall comes to be viewed as the norm, which simply may not be the case.

Backfire Effect – The Backfire Effect is a specific form of Confirmation Bias seen especially strongly in the religious, although it applies to any strongly held belief, such as strong political convictions.  It takes place when a believer is presented with evidence that strongly refutes their beliefs, yet instead of adapting their beliefs to the new data, they reject the data and end up holding their previous belief even more strongly than before.

Bandwagon Effect – The Bandwagon Effect refers to the belief that a position is more likely to be true based on how many people already accept it.  It is a variation on the argumentum ad populum fallacy and comes from a desire to be socially acceptable by holding beliefs similar to your neighbors’.  In fact, this is really why we have so many people claiming to be religious: they think it makes them look better to the people around them, whether or not they actually hold those beliefs.  I’ve labeled these people “social Christians” for that reason.  The Bandwagon Effect can come either from within or without; it can be imposed on people through social pressure, or it can be individuals trying to fit in with the group.

Belief Bias – People are naturally biased to accept evidence when the conclusion is plausible or desirable.  I see this all the time in religion, where Christians, say, will accept claimed evidence from personal experiences with their God while rejecting the exact same kind of evidence from personal experiences with other gods, just because they want their belief in God to be true.  So long as the conclusion is either desirable or sounds reasonable, people tend not to pay much attention to the evidence claimed in support of it; they just want the conclusion to be true, whether it’s actually well supported or not.

Bias Blind Spots – This may be one of the most important biases, and one that most people suffer from unless they are very careful.  It refers to our tendency to spot biases in others while remaining blind to the ones operating in ourselves.  Avoiding it requires constant introspection about what you believe and why you believe it, and a willingness to test all beliefs in light of new information.  It’s a never-ending process, and one that I’ve spoken about before because it is so important.

Choice-Supportive Bias – Choice-Supportive Bias is the tendency to go back and assign positive characteristics to something you’ve already selected, a choice you’ve already made and now want to make look better to yourself and those around you.  It is a form of cognitive bias closely related to confirmation bias.  For example, a person who buys an Apple iPhone instead of a comparable Android phone is more likely to ignore or forgive any faults with the Apple product and instead build up and overstate its positive aspects, while ignoring or downplaying the positive aspects of the Android product.  How we remember the choices we make is shaped by our desire to have made the correct decision, so we tend to remember, or invent, the best consequences of our choices and ignore the worst.

Clustering Illusion – Humans inherently seek patterns; the Clustering Illusion is our tendency to give those patterns meaning even when they arise from nothing more than random data.  We may see patterns in a random string of numbers or points on a graph and believe they have some significance that, in reality, they do not.  We already recognize specific pattern-recognition errors such as pareidolia and apophenia; the Clustering Illusion is the broader internal bias to which they are related.

Confirmation Bias – Confirmation Bias is extremely common; it is the tendency for people to accept only information that confirms what they already believe and to reject any other data out of hand because it doesn’t support their preconceived position.  The more strongly the belief is held and the more emotional the issue, the more likely Confirmation Bias is to take place.  Several causes have been suggested for it, including wishful thinking, the tendency for people to examine their beliefs only from a self-indulgent perspective, and the social and personal costs of being wrong.  Many people are so averse to the public humiliation and embarrassment of having a strongly held belief shown to be wrong that they’d rather continue to believe a false position than admit they were ever wrong in the first place.

Conservatism Bias – This is the tendency of humans to over-emphasize the importance or relevance of past events and the status quo while under-estimating the importance of new events and ideas.  It has the effect of slowing the change of ideas and opinions in response to new evidence, even when that evidence is compelling.  There is also a reverse of this bias, in which people vastly over-estimate the importance of new data just because it is new.  Either extreme keeps people from judging the actual weight of new data against the old; they simply leap from conclusion to conclusion, chasing the hottest new thing on the block.

Please come back in a week to see the next ten cognitive biases that we, as rational people, need to be aware of.  If we hope to be intellectual, rational, critical thinkers, we cannot allow ourselves to fall victim to any of these problems.  Far too many people do, unfortunately, and perhaps by making people aware that these biases exist, we can help them override the problems that our brains may cause.


1 thought on “Why People Believe What They Believe Part 1”

  1. I THOUGHT THIS MAY BE OF INTEREST TO YOU

    “I know that most men, including those at ease with problems of the greatest complexity, can seldom accept even the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have delighted in explaining to colleagues, which they have proudly taught to others, and which they have woven, thread by thread, into the fabric of their lives.”

    Leo Tolstoy
