Here’s the last of my 6-part series detailing 57 cognitive biases that everyone ought to be aware of, and that anyone who aspires to be rational has to address and purposely avoid. As intelligent people, we need to know why people believe what they believe and how to improve our own cognitive functioning.
So how did you do? Did you find any of these that you were guilty of? I found a couple that I fall into now and then, and I’m going to be more cognizant of them and avoid them in the future. So without further ado, here are the last seven. Enjoy.
Self-Enhancing Transmission Bias – People tend to talk about things that make them look good and, conversely, to avoid talking about things that make them look bad. If you’re talking to someone online whom you’ve never met, you can be sure they will describe themselves in glowing terms and gloss over their shortcomings, because they want you to think well of them. That doesn’t necessarily mean they’re lying, but humans tend to exaggerate when talking about themselves, even subconsciously. The same goes for ideas and beliefs: if someone believes a thing, they will tend to present it in a more favorable light than it may deserve. People know they are judged by their beliefs, so they want everyone to think those beliefs are absolutely true and valid, even when they’re not.
Status Quo Bias – Many people take the current situation as the baseline, and any change from that baseline is seen as a loss or a failure. This bias runs strong among the neo-cons, who want no change from the current situation even when the change is demonstrably superior. That does not mean it is automatically irrational to maintain the status quo until something better comes along; in fact, that’s how rational change happens: we stick with what works until something that works better is found. There is an opposite bias, which I discussed in a previous post, where people want change for the sake of change, whether or not it’s superior. That is equally fallacious.
Stereotyping – Stereotyping is not necessarily bad, as some people seem to think; it is a means for humans to take large amounts of data and boil it down to a few simple, fast general rules. The problem comes when the stereotypes become more important than the facts, as is too often the case. We should remember that at the core of most stereotypes is a kernel of truth, a reason the stereotype arose in the first place. So long as we remember that these are general tendencies, not specific facts that apply equally to every member of the set, we can avoid this bias.
Survivorship Bias – This is the logical error of paying attention to the survivors of a process and ignoring those who did not survive it; in other words, seeing only the successes and dismissing the failures. It can lead to overly optimistic conclusions, because you are looking only at those who succeeded and paying no attention to those who did not. This is very common in advertising, where companies will purposely tell you all the good things a product does but none of the negative things unless forced to. Prescription drug commercials on TV show how all commercials ought to work if they were even-handed: they tell you the benefits, then the side effects, so you can make a rational decision. You don’t hear many breakfast cereal commercials telling you their cereal might kill you, do you?
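The statistical effect behind survivorship bias is easy to demonstrate. Here’s a minimal sketch in Python with entirely made-up numbers: a pool of hypothetical funds earns random annual returns, funds that drop too far are closed, and then we compare the average final value of the survivors against the average over every fund that started.

```python
import random

random.seed(42)  # fixed seed so the demonstration is repeatable

def simulate_fund(years=10):
    """One hypothetical fund: random annual returns; the fund is closed
    (does not 'survive') if it ever falls below half its starting value."""
    value = 1.0
    for _ in range(years):
        value *= random.uniform(0.7, 1.4)  # made-up annual return range
        if value < 0.5:
            return value, False  # closed: a non-survivor
    return value, True

results = [simulate_fund() for _ in range(10_000)]

all_mean = sum(v for v, _ in results) / len(results)
survivors = [v for v, alive in results if alive]
surv_mean = sum(survivors) / len(survivors)

print(f"mean final value, all funds:      {all_mean:.2f}")
print(f"mean final value, survivors only: {surv_mean:.2f}")
```

Because the funds that disappear are exactly the low performers, averaging only the survivors always paints a rosier picture than averaging the whole starting pool, which is precisely the error this bias describes.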
Tragedy of the Commons – People tend to put their own self-interest ahead of the long-term good of the group. It also shows up when people take short-term gains in the face of long-term losses, because immediate gratification matters more to them than long-term survival. There is a reversal of this, where an individual is so worried about potential long-term losses that they won’t permit any short-term gains, just in case it might interfere with something down the road.
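That short-term versus long-term trade-off can be made concrete with a toy model (all of the numbers here are invented for illustration): a shared resource pool that regrows in proportion to what is left, harvested by a group at two different rates.

```python
def run_commons(take_per_person, people=10, capacity=100.0, growth=0.25, years=50):
    """Each year the group harvests from a shared stock, then whatever
    remains regrows by `growth`, capped at the pool's carrying capacity."""
    stock = capacity
    total_harvest = 0.0
    for _ in range(years):
        harvest = min(stock, take_per_person * people)  # can't take more than exists
        stock -= harvest
        total_harvest += harvest
        stock = min(capacity, stock * (1 + growth))  # regrowth
    return total_harvest, stock

# Greedy harvesting empties the pool within a few years...
greedy_total, greedy_stock = run_commons(take_per_person=5)
# ...while modest harvesting is sustainable indefinitely.
modest_total, modest_stock = run_commons(take_per_person=2)

print(f"greedy: harvested {greedy_total:.1f} total, {greedy_stock:.1f} left")
print(f"modest: harvested {modest_total:.1f} total, {modest_stock:.1f} left")
```

With these particular numbers the greedy group collapses the pool in three years and ends up with far less in total than the restrained group, which is the tragedy in a nutshell: everyone grabbing the larger short-term share destroys the long-term payoff for all of them.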
Unit Bias – Unit bias is the tendency to see a particular defined unit of something as the “correct” amount, regardless of whether it actually is. Take a bag of potato chips: people have no problem sitting down and eating the entire bag because they see it as a single unit. Whether they’re that hungry doesn’t matter; they eat it because it seems like the acceptable amount. The same is true of things we buy: we’ll purchase quantities we won’t use and don’t need simply because of how they’re packaged. Advertisers often exploit unit bias to get us to buy more than we normally would.
Zero Risk Bias – People tend to choose zero-risk options, or options that merely seem zero-risk, over alternatives that may be less risky overall but are presented in a way that is harder to grasp. Humans are good at thinking in proportions, and much less good at thinking in absolute differences. Given the choice, in a situation where 20 deaths occur, between reducing deaths by 50% or by 11, many people will choose the 50% reduction, even though the 50% cut saves only 10 lives while the other option saves 11.
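The arithmetic behind that example is worth spelling out, using the same numbers as the paragraph above:

```python
baseline_deaths = 20

lives_saved_by_percentage = baseline_deaths * 0.50  # "cut deaths by 50%"
lives_saved_by_count = 11                           # "prevent 11 deaths"

print(f"50% reduction saves {lives_saved_by_percentage:.0f} lives")
print(f"fixed reduction saves {lives_saved_by_count} lives")

# The proportional framing sounds bigger, but the absolute option wins.
assert lives_saved_by_count > lives_saved_by_percentage
```

A 50% cut feels dramatic, but on a baseline of 20 it saves only 10 lives; the less intuitive “reduce by 11” option is strictly better.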
So there you go, that’s the list. There are more cognitive biases out there, but this will get you started. Think about what goes on in your head and realize that your brain isn’t really wired to work rationally most of the time; you have to take control of your thought processes and be responsible for how your cognition works. It’s your responsibility to be rational. Saying “nobody else is rational, why should I be?” is asinine and defeatist. That’s not how critical thinking works, it’s not how intellectual people operate, and it shouldn’t be how you work. Be the best you can be and stop letting cognitive biases get in the way of that.