This post continues the lessons learned from Daniel Kahneman’s book “Thinking Fast and Slow”. The previous post, “Intuition gives confidence, but not a predictor of accuracy”, discussed the concepts of “System 1” and “System 2” and their influence on biases. Kahneman also discussed confident decisions and expert forecasts, where decisions are not rational and we fail to distinguish confidence from overconfidence. We can find experts everywhere who claim that they make better decisions intuitively than by applying statistical formulas. Research suggests that most of these claims are self-delusion. In situations where stable regularities exist and experienced experts have opportunities to test and validate their assumptions, System 1 can develop the capability to recognize matching patterns and make judgments. In such situations, experienced experts’ intuitions can be trusted. This post discusses the reasons for and pitfalls of confidence, and what we can do to distinguish confidence from overconfidence.

Narrative fallacy – We like to hear and learn from success stories. Learning from success stories is great, but it is also how we fall into the narrative fallacy. Most success stories skip the role of luck in the narration – call it uncertainty or randomness if you prefer mathematical terms. We build simple narratives from the stories we know and relate them to the situations we face. For example, we develop a narrative that good people always do good and bad people always do bad. We believe in this narrative because that is what we learned from the stories and examples we have heard. We feel good and confident applying these simple narratives to the situation at hand even when the narrative is incomplete. If there is no direct relationship, our brain finds a matching substitution to fill the gap. This illusion of understanding makes us believe that because we understand the past, the future is also knowable. It feels good to believe that if we understand the past, we can control the future. Digging deeper to gain full knowledge of the story, and factoring uncertainty into the situation, brings a feeling of anxiety.

Halo effect – Beyond the narrative fallacy, we are also influenced by the halo effect, sometimes called the “first impression”. The halo effect builds causal relationships even where none exist, and it influences how we interpret information. One of Kahneman’s examples stuck in my mind: a successful CEO is described as flexible, methodical, and decisive, but when things start falling apart, the same characteristics are relabeled as confused, rigid, and authoritarian.

Role of uncertainty – We look for a simple message of triumph or failure that identifies the cause. We forget a statistical fact of life, regression to the mean, and nature’s rule of uncertainty. In hindsight everything makes sense and comes with a fully understandable explanation. You can find many experts who justify that what happened yesterday was predictable; they have nice explanations showing it was knowable the day before yesterday. But the reality is that the world is unpredictable. High subjective confidence should not be trusted as an indicator of a prediction’s accuracy; low confidence can be more informative because it includes uncertainty. Short-term forecasts of trends and behavior may work, but the longer the time span, the higher the probability that uncertainty and random events will completely change the course.
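
A quick way to see regression to the mean is to simulate outcomes that mix stable skill with random luck: the most extreme performers in one round tend to produce more ordinary results in the next, because part of their extremity was luck. Here is a minimal sketch; the half-skill/half-luck split and sample sizes are my own illustrative assumptions, not from the book.

```python
import random

random.seed(42)

# Each performer's observed outcome = stable skill + random luck.
skills = [random.gauss(0, 1) for _ in range(10_000)]

def outcome(skill):
    """One observed result: equal parts skill and luck."""
    return skill + random.gauss(0, 1)

round1 = [outcome(s) for s in skills]
round2 = [outcome(s) for s in skills]

# Pick the top ~1% of round-1 performers...
cutoff = sorted(round1, reverse=True)[100]
top = [i for i, r in enumerate(round1) if r > cutoff]

avg1 = sum(round1[i] for i in top) / len(top)
avg2 = sum(round2[i] for i in top) / len(top)

# ...their round-2 average falls back toward the population mean,
# with no causal story needed to explain the "decline".
print(f"round 1 average of top group: {avg1:.2f}")
print(f"round 2 average of same group: {avg2:.2f}")
```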

Planning fallacy – We take risks when the odds are favorable – we accept some probability of a costly failure because the probability of success seems sufficient. The general tendency is to overestimate the benefits and underestimate the cost and the probability of loss. With this attitude, it is easy to become a victim of the planning fallacy. In such cases, estimates are made with best-case scenarios in mind and miss the opportunity to analyze the many ways the project can fail. We feel confident about our plan until we hit the first obstacle, where it starts falling apart. Setting a baseline from past cases of similar projects can be a savior here. If your estimates are extremely rosy compared to previous similar projects, there is a good chance that you are missing the full picture and not accounting for the uncertainty and surprises you are going to face.
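
One way to apply that baseline check, sketched below with entirely made-up numbers: compare your estimate against the outcomes of past similar projects and treat a large gap as a warning sign. The project durations and the red-flag rule here are hypothetical illustrations, not a prescribed method.

```python
# Hypothetical durations (in weeks) of past projects similar to ours.
past_durations = [14, 18, 22, 16, 30, 20, 25, 17]

our_estimate = 10  # our best-case plan

baseline = sum(past_durations) / len(past_durations)
optimism_gap = (baseline - our_estimate) / baseline

print(f"reference-class baseline: {baseline:.1f} weeks")
print(f"our estimate is {optimism_gap:.0%} below the baseline")

# An estimate that beats every comparable past project is a red flag:
# it likely reflects a best-case narrative, not the odds of surprises.
if our_estimate < min(past_durations):
    print("warning: estimate is rosier than every past similar project")
```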

Can we rationalize it? – In an uncertain world, with the fear of overconfidence and the influence of the halo effect, can we still make a rational decision? Kahneman’s recommendation in a low-validity environment is that we can maximize predictive accuracy by leaving the final decision to formulas. Research suggests that in such environments, these effects can be compensated for by evaluating objective data instead of relying on confident subjective judgments. He discussed his work redesigning the Israeli military’s interview process. Instead of making decisions based on the interviewer’s intuition, his team defined a set of traits and scored each candidate on every trait separately. Candidates with the highest overall scores were selected. Basing the decision on the overall score rather than the interviewers’ intuitive judgments produced better selections.
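
A minimal sketch of that kind of formula-driven selection: score each trait independently, sum the scores, and rank by the total rather than by an overall impression. The traits, scale, and scores below are illustrative placeholders, not Kahneman’s actual list.

```python
# Hypothetical traits, each scored independently on a 1-5 scale,
# instead of a single intuitive verdict per candidate.
TRAITS = ["responsibility", "sociability", "punctuality", "self-reliance"]

candidates = {
    "A": {"responsibility": 4, "sociability": 3, "punctuality": 5, "self-reliance": 2},
    "B": {"responsibility": 3, "sociability": 5, "punctuality": 2, "self-reliance": 4},
    "C": {"responsibility": 5, "sociability": 4, "punctuality": 4, "self-reliance": 5},
}

def overall_score(scores):
    """The final decision uses the sum of the separate trait scores."""
    return sum(scores[t] for t in TRAITS)

# Rank candidates by total score; the formula, not intuition, decides.
ranked = sorted(candidates, key=lambda c: overall_score(candidates[c]), reverse=True)
for name in ranked:
    print(name, overall_score(candidates[name]))
```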

When can we trust experts? – Experts’ judgments are better when the environment is sufficiently regular and the expert has had opportunities to learn its regularity patterns. Here the associative machinery recognizes situations and can generate accurate predictions and decisions. Examples of such situations are the quick intuitive decisions made by experienced firefighters or hospital nurses in emergencies. In a low-validity environment filled with noisy signals, we run the risk of “System 1” taking over, building unintended similarity relationships, and making us act irrationally. Bad predictions also happen when experts have not had the opportunity to learn regularities from experience. The questions we have to ask: Does the situation occur in an environment with regularities? How much expertise does the person have? Did he or she have the opportunity to experience the various patterns of those regularities? How subjective is the explanation behind the judgment?

Overconfidence is our belief that we know more than we actually do. Limited experience can build confidence even in an unpredictable environment filled with noisy signals. A bias toward optimism, supported by overconfidence, leads us to ignore the obstacles we are going to face.