Why People Believe What They Believe Part 4


Welcome to part 4 of my series on cognitive biases and other irrational things that go on in our heads, things we need not only to be aware of but to know how to counter.  As in past installments, I'll give a short explanation of each and hope that you try to recognize these problems in your own life and how they affect your rationality.  As critical thinking individuals, we have to be aware of potential problems that may arise in our thought processes.  Just wanting to be rational isn't enough; we have to continually test our positions to make sure they are arrived at through rational means.

So on with the show!

Observer Expectancy Effect – Here, an observer influences subjects by holding and/or expressing an expectation of how they will respond.  One example of this was backmasking, the belief that Satanic messages are recorded backwards in songs.  Because people are told that there are messages in the songs, many hear those messages where, before they were told what to listen for, they heard only random noise.  This is also common in irrational and undemonstrated practices like dowsing.

Omission Bias – Under the Omission Bias, people tend to judge a harmful action as worse or more immoral than an equally harmful failure to act, because action is easier to see than inaction.  A good example would be a politician who knows his political rival is allergic to a particular food.  Asked whether it is worse that the politician knowingly gave his rival that food, or that he deliberately failed to warn his rival that the food he was about to eat contained the allergen, people tend to identify the first option as more immoral even though both lead to the same outcome.

Ostrich Effect – Those guilty of the Ostrich Effect attempt to avoid risky or dangerous situations by simply pretending they do not exist.  While the term is usually applied to financial decisions, it also fits cases like faith healing, where parents simply refuse to acknowledge the possibility that prayer doesn't work and that their child may die from lack of proper medical attention.

Outcome Bias – When considering a past decision whose ultimate outcome is known, individuals will often judge that decision not on the basis of the decision itself, but on the outcome, positive or negative.  Information that only came to light after the decision was made should have no bearing on whether the decision was good or bad at the time.  A doctor, for instance, should decide whether a particular treatment is warranted based on the prognosis at the time.  If an operation had a good chance of success for the particular ailment, it was a good decision to make, even if the patient ultimately died.

Overconfidence Effect – Some people are extremely confident of their abilities, even when those abilities are demonstrably faulty.  People tend to overestimate their own accuracy when the truth is notably less flattering.  Someone who claims to be 99% confident of answering questions correctly, but whose answers turn out to be right only 40% of the time, has a problem with the calibration of their subjective probabilities.  We need to understand the reasonable limits of our abilities and not pretend, through ego or unwarranted confidence, that we are better than we actually are.

Overoptimism – Also called the Optimism Bias, this is the belief that one is less likely than others to suffer adverse events, simply because one is optimistic about the world.  For example, there are smokers who are convinced that they are less likely to contract lung cancer than other smokers, people who think that they are less likely to be victims of violent crime than others in similar situations, and so on.  Much of this comes from self-presentation, the desire to present a specific image of oneself, whether that image is realistic or not.  It can also come from a desire to impose personal control on the world, whether or not that control is possible.  In effect, it asserts that the individual is special and thus at less risk than anyone else, a demonstrably faulty assumption.

Pessimism Bias – This is the opposite of Overoptimism: the assumption that bad things are more likely to happen to oneself than to other, similar individuals under similar conditions.  Everything I said about Overoptimism applies here as well.

Placebo Effect – In medicine, the placebo effect is a demonstrable change in health or behavior that cannot be attributed to actual medication or treatment, only to the patient's assumption that such treatment has been given.  Our brains have a wonderful capacity to heal, and sometimes the mere assumption that we are being treated, together with confidence that the treatment is effective, is enough to trigger these healing properties and allow the body to fix itself without outside intervention.  Of course, this doesn't just apply to medicine; people who believe they are being prayed for, for instance, can experience dramatic changes in attitude or outlook if they believe prayer helps.  It is important to recognize that, beyond a desire to improve, a placebo doesn't actually do anything; it just provides an occasion for people to improve their own situation.

Planning Fallacy – People often have difficulty estimating how long it will take to complete a particular project, even when they have experience with how long similar projects actually take.  When students were asked to estimate how long it would take to complete their theses, only 30% accurately predicted the timetable; most vastly underestimated how much time it would take.

Post-Purchase Rationalization – Especially with expensive purchases, or decisions perceived as costly, the decision maker will talk up the purchase or decision in order to rationalize its cost.  This isn't just a matter of cars and boats, where a buyer may disregard any shortcomings because they paid a lot for the product.  People who make weighty decisions that carry a lot of personal cost may ignore problems with those decisions in order to feel better about having made them.  People tend to want to avoid admitting that they have made bad decisions for bad reasons.

Next Monday, ten more!  We're coming down to the wire; I hope you're enjoying this look at things we need to avoid in order to make good decisions.

1 thought on “Why People Believe What They Believe Part 4”

  1. This is a useful series. But in this installment I think both you and the author of the original piece have mistakenly characterized the Placebo Effect as a cognitive bias. I have examined a number of lists of cognitive biases at websites much more credible than the one from which you drew the list for this series, and none of those lists, other than the one you reference, lists the Placebo Effect as a cognitive bias.
    https://sites.google.com/site/skepticalmedicine/
    http://en.wikipedia.org/wiki/List_of_cognitive_biases
    http://rationalwiki.org/wiki/List_of_cognitive_biases
