Why People Believe What They Believe Part 3


Here’s part three of my 6-part Monday series on cognitive biases that we all should be aware of.  I got the idea from here, but it’s something we should all strive to understand and be careful with if we want to be as rational as we can.  Just because we’re programmed to believe certain things doesn’t mean those things are actually good to believe. And so, here’s my next set of ten biases to think about, things that come up all the time and that we need to consider before acting.

Hindsight Bias – This is the tendency to see events, after the fact, as having been predictable all along, to “discover” in retrospect that an idea is true and feel you knew it the whole time.  I would argue that the tendency of Islamic apologists to declare that modern scientific discoveries are found in the Qur’an is an excellent example of this bias.  It springs from selective memory, and it has been studied in connection with conditions like schizophrenia and PTSD.

Hyperbolic Discounting – When given a choice between two similar rewards, people tend to discount the later reward and favor the earlier one, simply because it comes first.  Even if waiting yields a greater reward, people tend to place more value on the smaller one, just because it comes sooner.  As the delay before the first reward grows, however, tolerance for a slightly longer wait grows with it.  If people are told they can have ten dollars today or thirty dollars tomorrow, a significant number will choose the ten dollars today.  But if they can have ten dollars a month from now or thirty dollars a month and a day from now, far fewer will pick the ten dollars, preferring to wait the additional day for the larger amount.
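
For the mathematically inclined, this behavior is often modeled with the hyperbolic discount function V = A / (1 + kD), where A is the reward, D is the delay, and k is an impatience constant.  A minimal Python sketch (the value k = 2.5 is purely illustrative, chosen to make the preference reversal visible) shows how the same person can prefer $10 today over $30 tomorrow, yet prefer $30 in 31 days over $10 in 30 days:

```python
# Toy sketch of the standard hyperbolic discounting model, V = A / (1 + k*D).
# k = 2.5 per day is an illustrative, not empirical, value.

def discounted_value(amount, delay_days, k=2.5):
    """Perceived present value of a reward delayed by delay_days."""
    return amount / (1 + k * delay_days)

# Choice 1: $10 today vs. $30 tomorrow -> the immediate $10 "feels" bigger.
print(discounted_value(10, 0))   # 10.00
print(discounted_value(30, 1))   # ~8.57

# Choice 2: $10 in 30 days vs. $30 in 31 days -> now the $30 wins,
# because both rewards sit on the flat tail of the hyperbola.
print(discounted_value(10, 30))  # ~0.13
print(discounted_value(30, 31))  # ~0.38
```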

Ideomotor Effect – The human body sometimes takes action without any conscious command from the brain.  This likely explains things like Ouija boards and dowsing, where people subconsciously move objects without being consciously aware that they are doing it.  In fact, people are often so unaware of it that they flatly refuse to acknowledge the possibility that they are influencing the outcome of the experiment.

Illusion of Control – I’ve talked about people’s tendency to underestimate the control they have over a particular situation, but the illusion of control is the opposite: people can vastly overestimate how much control they have, to the point of claiming responsibility for things they simply cannot be responsible for.  This is especially true of many superstitions and of belief in extrasensory perception and the like.  People think they have special powers that let them change the results of a dice roll, or that performing a ritual will help their sports team win the big game.  Feedback loops play a particularly strong part in this: if a person takes a particular action and a favorable result follows, they may link the action to the result and keep performing one in order to get the other.  Of course, this is reinforced by confirmation bias, the tendency to remember the hits and forget the misses, even when the results are no better than random chance.

Illusion of Validity – We find this commonly when people think that gathering additional data points which provide no new evidence will somehow produce a better result.  Essentially, it’s data for data’s sake.  If 50 points of data are enough to reach a conclusion, having 100 points doesn’t make the conclusion any better.  However, some people insist on ever-larger data sets simply because they never want to accept the conclusion the data already indicates.  This is very true of the religious, especially creationists, who demand an ever-increasing data set for evolution because they do not want evolution to be demonstrated, even though it clearly has been.
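
A quick illustration of “data for data’s sake” (the numbers below are made up): counting the same observations twice doubles the size of the data set without adding a shred of new evidence, yet naive statistics will happily report more confidence.

```python
# Sketch: redundant data adds volume, not evidence. The estimate is
# unchanged, but the naive standard error shrinks, producing
# unwarranted confidence in the same conclusion.
import statistics

sample = [4.8, 5.1, 5.0, 4.9, 5.2] * 10   # 50 measurements
doubled = sample * 2                       # same 50, counted twice

def naive_stderr(xs):
    return statistics.stdev(xs) / len(xs) ** 0.5

print(statistics.mean(sample), naive_stderr(sample))    # mean 5.0
print(statistics.mean(doubled), naive_stderr(doubled))  # same mean, smaller "error"!
```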

Information Bias – Information bias results when an individual insists that more data, even irrelevant or potentially faulty data, is always better to acquire before coming to a conclusion than less data, even when the smaller set is entirely relevant.  It is similar to the illusion of validity, in that people guilty of it think more data is better just because there’s more of it.  In a rational study, though, only valid data, data that directly speaks to the thing being studied, makes any difference at all.

Inter-Group Bias – Also called in-group favoritism, this is the tendency to give members of your own group, whether social, religious, sexual, racial or whatnot, more credence than people who fall outside of it.  The reasons for this are numerous: throughout our evolution, competitive pressure between groups has been commonplace, and those who are with you typically aren’t against you, so you work harder to protect and defend those on your side of the conflict.  There is also a modicum of self-esteem involved: you tend to see yourself in those who are similar to you, and therefore feel better about supporting those who look most like you.  The lower a person’s self-esteem, the more likely they are to rely on the group as a substitute for their own self-worth, and the more strongly they favor the in-group over the out-group.

Irrational Escalation – This is sometimes referred to as escalation of commitment, or the sunk-cost fallacy.  As people become invested in a particular proposition, they grow more strongly committed to their position, even when it becomes clear that they’re wrong.  The more wrong they appear to be, the harder they cling to the belief, doubling down on the losing side, as it were.  In U.S. politics, many think that recent military actions owe much to the sunk costs of building up the military during the Cold War era: we’ve already spent the money, we might as well use the hardware!

Less-is-More Effect – Sometimes called the less-is-better effect, this is people choosing a lesser option over a clearly better one because of how the options are presented.  Dan Ariely, whose book I reviewed recently, has written about this effect.  For example, when offered two cups of ice cream, people will often take a smaller amount of ice cream served in a small cup over a larger amount served in a large cup.  The relative size of the cups distorts the judgment: a small cup filled to the brim looks like more ice cream than a large cup that is only partly full.

Negativity Bias – Many people recall negative experiences more easily and more strongly than positive ones, and thus work much harder to avoid future negative consequences than to seek out future positive experiences.  We know there is more electrical activity in the cerebral cortex when viewing negative images than when viewing positive ones.  We know that learning happens much faster when a negative influence is used than when a positive one is.  We also know that when humans distinguish things or people from one another, it is the negative aspects that stand out, not the positive or neutral ones; your brain looks for what is wrong with a face in order to tell it apart from a different face.  We even find negative information more credible than positive information.  In marketing terms, a good experience may cause a person to tell one or two friends about it, while a bad experience will cause them to tell upwards of ten people.

That’s it for this week.  I know this is difficult, because a lot of these things are evolutionary relics of our human brains: they once served a purpose in our survival, but today most are simply useless vestigial patterns in our heads.  However, we have the ability to override these patterns and recognize them for what they are, so that we can make better, more rational decisions.

