What Is Hindsight Bias?

The tumor that appeared on a second scan. The fourth-quarter comeback to win the game. The guy in human resources who was secretly accepting bribes. The situation may vary each time, but we hear ourselves say it over and over again: “I knew it all along.”

The problem is that too often we actually didn’t know it all along; we only feel as though we did. The phenomenon, which researchers refer to as “hindsight bias,” is one of the most widely studied decision traps and has been documented in various domains, including medical diagnoses, accounting and auditing decisions, athletic competition, and political strategy.

Hindsight bias, also known as the knew-it-all-along effect or creeping determinism, is the inclination, after an event has occurred, to see the event as having been predictable, despite there having been little or no objective basis for predicting it. It is a multifaceted phenomenon that can affect judgments at different stages of designs and processes, and across many contexts and situations.

Hindsight bias may cause memory distortion, in which the recollection and reconstruction of content leads to mistaken conclusions about what was originally known. It has been suggested that the effect can cause serious methodological problems when researchers try to analyze, understand, and interpret results in experimental studies.

Examples appear in the writings of historians describing the outcomes of battles, in physicians’ recollections of clinical trials, and in judicial systems attempting to attribute responsibility for, and the predictability of, accidents.

New Perspectives on Hindsight

In an article in the September 2012 issue of Perspectives on Psychological Science, Neal Roese of the Kellogg School of Management at Northwestern University and Kathleen Vohs of the Carlson School of Management at the University of Minnesota reviewed the existing research on hindsight bias, exploring the various factors that make us so susceptible to the phenomenon and identifying a few ways we might be able to combat it. It is one of the first overviews to draw insights together from across different disciplines.

Roese and Vohs propose that there are three levels of hindsight bias that stack on top of each other, from basic memory processes up to higher-level inference and belief.

  • The first level of hindsight bias, memory distortion, involves misremembering an earlier opinion or judgment (“I said it would happen”).

  • The second level, inevitability, centers on our belief that the event was inevitable (“It had to happen”).

  • And the third level, foreseeability, involves the belief that we personally could have foreseen the event (“I knew it would happen”).

The researchers argue that certain factors fuel our tendency toward hindsight bias. Research shows that we selectively recall information that confirms what we know to be true and we try to create a narrative that makes sense out of the information we have.

When this narrative is easy to generate, we interpret that to mean that the outcome must have been foreseeable. Furthermore, research suggests that we have a need for closure that motivates us to see the world as orderly and predictable and to do whatever we can to promote a positive view of ourselves.

Ultimately, hindsight bias matters because it gets in the way of learning from our experiences.

“If you feel like you knew it all along, it means you won’t stop to examine why something really happened,” observes Roese. “It’s often hard to convince seasoned decision makers that they might fall prey to hindsight bias.”

A Brief History

The hindsight bias, though not yet named as such, was not a new concept when it emerged in psychological research in the 1970s. In fact, it had been indirectly described numerous times by historians, philosophers, and physicians.

In 1973, Baruch Fischhoff attended a seminar at which Paul E. Meehl observed that clinicians often overestimate their ability to have foreseen the outcome of a particular case, claiming to have known it all along. Fischhoff, a psychology graduate student at the time, saw an opportunity to explain these observations through psychological research.

In the early seventies, investigation of heuristics and biases was a large area of study in psychology, led by Amos Tversky and Daniel Kahneman. Two heuristics identified by Tversky and Kahneman were of immediate importance in the development of the hindsight bias; these were the availability heuristic and the representativeness heuristic.

In an elaboration of these heuristics, Fischhoff and Beyth devised the first experiment directly testing the hindsight bias. They asked participants to judge the likelihood of several possible outcomes of US president Richard Nixon’s upcoming visits to Beijing (then romanized as Peking) and Moscow.

Some time after President Nixon’s return, participants were asked to recall (or reconstruct) the probabilities they had assigned to each possible outcome, and their recalled probabilities were higher for the events that had actually occurred. This study is frequently cited in definitions of the hindsight bias, and the title of the paper, “I knew it would happen,” may have contributed to the hindsight bias being used interchangeably with the “knew-it-all-along” hypothesis.

In 1975, Fischhoff developed another method for investigating the hindsight bias, which at the time was referred to as the “creeping determinism hypothesis”. In this method, participants are given a short story with four possible outcomes, told that one of them is true, and then asked to assign a likelihood to each outcome.

Participants frequently assign a higher likelihood of occurrence to whichever outcome they have been told is true. Largely unmodified, this method is still used in psychological and behavioural experiments investigating aspects of the hindsight bias.

Having evolved from the heuristics of Tversky and Kahneman into the creeping determinism hypothesis and finally into the hindsight bias as we now know it, the concept has many practical applications and is still at the forefront of research today. Recent studies involving the hindsight bias have investigated the effect age has on the bias, how hindsight may impact interference and confusion, and how it may affect banking and investment strategies.

Factors Affecting the Bias

Hindsight bias is affected not only by whether an outcome is favorable or unfavorable, but also by the severity of a negative outcome. In malpractice suits, the more severe the negative outcome, the more dramatic the jurors’ hindsight bias.

In a perfectly objective case, the verdict would be based on the physician’s standard of care instead of the outcome of the treatment; however, studies show that cases that end in severe negative outcomes such as death result in higher levels of hindsight bias.

In 1996, LaBine proposed a scenario in which a psychiatric patient told a therapist that they were contemplating harming another individual, whom the therapist did not warn of the possible danger.

Participants were each given one of three possible outcomes, in which the threatened individual received no injuries, minor injuries, or serious injuries, and were then asked to determine whether the doctor should be considered negligent. Participants in the serious-injuries condition not only rated the therapist as negligent but also rated the attack as more foreseeable. Participants in the no-injuries and minor-injuries conditions were more likely to see the therapist’s actions as reasonable.

In tests for hindsight bias, a person is asked to remember a specific event from the past or recall some descriptive information that they had been tested on earlier. In between the first test and final test, they are given the correct information about the event or knowledge.

At the final test, participants report that they knew the answer all along, when in truth they have changed their answer to fit the correct information they were given after the initial test. Hindsight bias has been found to occur both in memory for experienced situations (events the person is familiar with) and in hypothetical situations (made-up events in which the person must imagine being involved).
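The test procedure above can be pictured as a simple scoring problem: compare each initial answer with the answer recalled after feedback, and measure how far the recalled answer drifted toward the correct value. The sketch below is purely illustrative (the function name, scale, and numbers are hypothetical, not from any study described here).

```python
# Illustrative sketch of scoring a memory-design hindsight test;
# all names and numbers are hypothetical.

def hindsight_shift(initial, recalled, correct):
    """Return how far each recalled answer moved from the initial
    answer toward the correct value (1.0 = recalled exactly the
    truth, 0.0 = no shift)."""
    shifts = []
    for i, r, c in zip(initial, recalled, correct):
        gap = c - i                        # distance from initial answer to truth
        if gap == 0:
            shifts.append(0.0)             # initial answer was already correct
        else:
            shifts.append(round((r - i) / gap, 3))
    return shifts

# Example: numeric estimates given at the first test, the correct
# values revealed in between, and what participants later "recall"
# having said the first time.
initial  = [2.0, 5.0, 1.0]
correct  = [3.0, 4.0, 1.0]
recalled = [2.8, 4.2, 1.0]

print(hindsight_shift(initial, recalled, correct))  # [0.8, 0.8, 0.0]
```

A mean shift well above zero for items where feedback was given would be the signature of hindsight bias in this kind of design.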

More recently, hindsight bias has also been found in recall of visual material. Subjects who were initially tested on blurry images and later learned what the true image was subsequently remembered having seen a clear, recognizable picture. There has been very little research on this phenomenon of visual hindsight bias.

One experiment, performed by Muhm et al., spanned a six-year period and involved 4,618 participants. Each participant received a chest radiograph every four months.

Each radiograph was reviewed by two radiologists and a respiratory physician to determine whether there were any problems. Over the course of the experiment, 92 chest tumors were found in some of the participants.

When physicians reviewed the previous radiographs of the participants who developed tumors, they determined that evidence of the tumor was present even before it had been identified. In other words, after finding the tumor, physicians determined the presence of the tumor was obvious in previous radiographs, even though they had not noticed it before.

While our inclination to believe that we “knew it all along” is often harmless, it can have important consequences for the legal system, especially in cases of negligence, product liability, and medical malpractice. Studies have shown, for example, that hindsight bias routinely afflicts judgments about a defendant’s past conduct.

Considering the Opposite

And technology may make matters worse.

“Paradoxically, the technology that provides us with simplified ways of understanding complex patterns – from financial modeling of mortgage foreclosures to tracking the flow of communications among terrorist networks – may actually increase hindsight bias,” says Roese.

So what, if anything, can we do about it?

Roese and Vohs suggest that considering the opposite may be an effective way to get around this cognitive flaw, at least in some cases.

When we are encouraged to consider and explain how outcomes that didn’t happen could have happened, we counteract our usual inclination to throw out information that doesn’t fit with our narrative. As a result, we may be able to reach a more nuanced perspective of the causal chain of events.