Decision-Making: Cognitive Bias

Doctor Karl

Welcome to our discussion of cognitive bias, part of a series on decision-making. An introductory paper gave an overview of the decision-making process and the importance of a scientific approach. Here we consider some cognitive biases that can affect our decisions. Remember that rigorous decision-making is demanding; it need not be applied to every life decision, but it should be applied to the important ones. Watch out for emotional decisions: our emotions are often short-sighted, over-optimistic, and not sufficiently risk-averse.

The concept of cognitive bias was studied in depth by Nobel laureate Daniel Kahneman and his longtime collaborator Amos Tversky, and it remains under study by scientists and economists worldwide. These biases influence decision-making and pull us away from a scientific approach to logical decisions.

Is cognitive bias always bad?

Not at all. Some cognitive biases allow us to make decisions quickly. An example I like to use is that of a fighter pilot who hears an alarm indicating that an incoming missile is just seconds away. The pilot could weigh the options and work through the physics of two high-speed aircraft on an adaptive intercept course, but would likely be destroyed before completing an evasive maneuver. Instead, the pilot relies on long-learned skills and previous experience, simulated and real: banking hard to one side, preferably closing the angle of attack, and releasing a decoy. This response, learned through years of training and practice, might not be appropriate in every context, but it saved the pilot's life this time. However, many cognitive biases do not work in our interest.

We have briefs on select biases, but let’s discuss a few. 

  • Anchoring is our tendency to rely on the same answer across different questions. A surgeon who feels that every patient they see needs the one procedure that surgeon knows how to perform is anchored.

  • Framing is when someone shapes our response by presenting additional, possibly irrelevant, information. For example, if someone offers to sell us a widget for $25, we are less likely to buy it than if they offer us the same widget for only $25 while claiming it normally costs $50.

  • Confirmation bias is our tendency to agree with a statement or concept when it is consistent with our prior beliefs or decisions. For example, if a news article is critical of a person or group we identify with, we are more likely to brand it “fake news.”

  • Implicit bias is attributing characteristics to entire groups where doing so is not justified. For example, we might assume that people of a specific political party are all stupid or dishonest.

  • Lastly, overconfidence is a serious and pervasive bias. Many of us assume that our thought processes and decisions are correct, and when outcomes go wrong, we blame others for supplying bad information or influencing the consequences. We start out thinking we are perfect and spend the rest of our lives learning otherwise.

Why do we have cognitive biases? 

Part of it is evolution, and not all biases are harmful. From youth, we believed much of what our mother taught us: “Don’t jump down that cliff.” If we had tested every one of her cautions, we would never have made it to adulthood. We were right to accept many of her teachings without further evidence.

So how do we deal with cognitive biases? 

We cannot reject them all, but we should be aware of them. If we belong to a group, political party, or family, we should not assume that everyone in our group is a flawless, omniscient being, incapable of error or deception. Similarly, we should not assume that a differing group is entirely evil. Life and humans are not so simple.

We should be keenly aware of personal bias when we detect it. Look beyond it, especially when mission-critical issues are at stake. Keep an open mind. Use rigorous decision-making. 

And if we do make a mistake or show bad judgment, we should admit it to ourselves and others, fix the outcome if possible, and not make the same mistake again.

  • Be smart!

  • Take an analytic approach to decision-making!

  • Be the scientist!

Thank you for your time and attention to our material. Subscribe to our media and leave constructive comments so we can do even better. And if you would like to contribute, contact us. We would love to hear from you and learn what you have to say.  

…………………………………………………………………………………….

Doctor Karl

Doctor Karl is Karl Edward Misulis, a physician, scientist, educator, and university professor. He holds a BSc from Queen’s University in Canada, an MD from Vanderbilt University, and a PhD from SUNY Syracuse. He has authored more than 20 books, some in multiple languages, and lectured worldwide to the public and fellow academics.
