The planning fallacy is a form of cognitive bias affecting one’s ability to predict the time needed to complete a future task. Despite past experience suggesting longer timelines, individuals often underestimate the duration necessary to finish a project. Essentially, this type of optimism bias reflects a disconnect between forecasting and actual outcomes in decision-making and has been observed across different contexts, from everyday tasks to complex project management.
Psychologists Daniel Kahneman and Amos Tversky first identified this phenomenon. Their work within the field of psychology has been instrumental in understanding how people plan and make predictions about the future. The planning fallacy falls within their broader research on judgment under uncertainty and heuristics and biases.
Factors contributing to the planning fallacy include:
- Overconfidence: The belief that one can overcome obstacles more efficiently than others or than one has in the past.
- Motivational Reasons: Desires and motivations that color the planning process with undue optimism.
Typical outcomes of the planning fallacy involve:
- Missed Deadlines: Projects often get delivered later than initially planned.
- Budget Overruns: An optimistic forecast can lead to underestimating costs and resource allocation.
In addressing the planning fallacy, experts suggest employing more realistic forecasting methods and considering external factors and historical outcomes. Acknowledging the planning fallacy can improve decision-making and lead to more accurate project planning, ultimately reducing the discrepancy between expectations and reality.
Causes of the Planning Fallacy
Cognitive psychologists Daniel Kahneman and Amos Tversky first proposed the concept in the late 1970s. They identified several psychological mechanisms that contribute to the planning fallacy, which they documented and analyzed through empirical research.
Lovallo and Kahneman offered an expanded definition in 2003: the tendency to underestimate the time, costs, and risks of future actions while overestimating their benefits. Under this definition, the planning fallacy produces not only time and cost overruns but also benefit shortfalls.
Optimism bias plays a central role in the planning fallacy, where individuals present an unrealistic positive outlook on their tasks. They tend to take an inside view, focusing on the project at hand and disregarding historical performances in similar situations.
This bias leads to underestimation of the necessary resources or time required. Conversely, a pessimistic perspective might overcompensate, considering setbacks and complications to an excessive, often demotivating degree.
Kahneman and Tversky first described the fallacy as being caused by planners focusing on the most optimistic scenario for the task rather than using their full experience of how long similar tasks take. This insight was part of their broader investigation into systematic human cognitive biases and heuristics, which has had a profound impact on psychology, economics, and related fields.
Buehler and colleagues propose an explanation based on the self-serving bias that influences how people assess their prior performance. By taking credit for tasks that went well while blaming delays on external factors, people can discount historical evidence of how long similar tasks actually took.
One investigation discovered that when respondents made predictions anonymously, they did not exhibit the optimistic bias. This implies that people make optimistic estimates in order to make a good impression on others, which is consistent with the ideas presented in impression management theory.
Roy and colleagues propose another explanation: people do not accurately recall how long similar tasks took in the past; instead, they routinely underestimate the duration of those earlier episodes. Predictions of future event duration are therefore skewed by biased memories of past event duration. Roy and colleagues note that this memory bias does not rule out other causes of the planning fallacy.
Another possible explanation is the “authorization imperative”: most project planning occurs in a context where financial approval is required to proceed, and the planner often has a vested interest in getting the project approved.
This dynamic can lead the planner to deliberately underestimate the effort required: it is easier to obtain forgiveness (for overruns) than permission (to begin a project whose realistic effort estimate would deter approval). Jones and Euske call such deliberate underestimation “strategic misrepresentation”.
Empirical Evidence and Case Studies
The planning and construction of the Sydney Opera House serves as a textbook example of the planning fallacy. Initially estimated to be completed in 1963 with a budget of 7 million AUD, it concluded in 1973, overrunning its forecast by ten years and costing 102 million AUD. This underscores the tendency to underestimate both time and financial resources in large-scale projects.
The Denver International Airport opened in 1995, sixteen months later than planned, at a cost of $4.8 billion, more than $2 billion over the original estimate.
Another egregious example is Berlin Brandenburg International Airport. After 15 years of preparation, work began in 2006, with an October 2011 launch date. There were several delays. It officially launched on October 31, 2020. The original budget was €2.83 billion, but current forecasts are close to €10.0 billion.
Big Dig Tunnel Project
For the Big Dig Tunnel megaproject in Boston, the original budget projection stood at 2.8 billion USD in 1982. The project rerouted the then-elevated Central Artery of Interstate 93, which ran through Boston, into the O’Neill Tunnel and constructed the Ted Williams Tunnel to connect Interstate 90 to Logan International Airport.
Upon its completion in 2007, expenditures had escalated beyond 14.6 billion USD. This case epitomizes the planning fallacy, as well as highlights the complexities involved in urban infrastructure development, which often leads to significant cost overruns and extended timelines.
The state of Massachusetts filed over 200 complaints as a result of leaks, cost overruns, quality concerns, and safety violations. In total, the state sought approximately $100 million from the contractors ($1 for every $141 spent).
Canadian Pacific Railway
The Canadian Pacific Railway provides a historical perspective on the planning fallacy. The railway, built to connect eastern Canada with British Columbia and unite the country from coast to coast, faced numerous obstacles and revisions from its inception.
The initial estimate and timeline were vastly exceeded, illustrating that the planning fallacy is not a modern phenomenon, but one that has affected projects throughout history.
The planning fallacy often causes project managers to set deadlines that are too optimistic. This can be seen when managers allocate time to future tasks without considering past experiences with similar projects.
These optimistic predictions disregard potential delays and ignore historical performance data. When these unrealistic timeframes are not met, project momentum and stakeholder confidence can suffer.
Similarly, cost overruns are a common consequence of the planning fallacy in project management. A tendency to underestimate costs and overlook risks during the planning phase often leads to budgets that are not reflective of true project needs.
Consequently, a reevaluation of resources might be necessary, which can impede the progress of the project and necessitate additional financial investment. The management of the project team, along with sound time management practices, plays a crucial role in countering the negative impacts of these optimistic financial forecasts.
A 1997 poll of Canadian taxpayers revealed that they mailed in their tax forms roughly a week later than expected. They were fully aware of their track record of late submissions, yet expected to file more promptly the next time. This exemplifies a key element of the planning fallacy: people acknowledge that their previous predictions were overly optimistic while believing that their present predictions are realistic.
In a 1994 study, 37 psychology students were asked to estimate how long it would take to complete their senior thesis. The average projection was 33.9 days.
They also calculated how long it would take “if everything went as well as it possibly could” (average 27.4 days) and “if everything went as poorly as it possibly could” (average 48.6 days). The average actual completion time was 55.5 days, with around 30% of students completing their thesis within the timeframe they expected.
Social and Psychological Consequences
- Decreased Enthusiasm: Workers may become demotivated when projects continually miss deadlines due to poor time estimates.
- Increased Social Pressure: There can be considerable social pressure on project managers and team members to meet deadlines, which may compromise the quality of work or well-being of the workers.
- Impaired Decision-Making: Persistent underestimation of time can lead to poor decision-making in future projects as past experiences are disregarded.
Overcoming the Planning Fallacy
Taking an outside view is one effective strategy to overcome the planning fallacy. This approach involves assessing a project by considering the actual outcomes of similar tasks rather than solely relying on the project’s unique aspects. It forces one to look beyond the optimistic scenarios that are often anticipated.
For instance, a manager could analyze data from comparable projects to gauge a more accurate timeline for the current project, considering the resources and worst-case scenarios that impacted similar undertakings in the past.
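The outside view described above can be sketched as a small calculation: rather than reasoning from the current project’s specifics, the estimate is pulled from the distribution of observed durations in a reference class of similar past projects. The durations and percentile choices below are hypothetical illustrations, not figures from the text.

```python
# A minimal sketch of reference-class ("outside view") forecasting.
# The historical durations below are hypothetical illustration data.
from statistics import quantiles

past_durations_weeks = [10, 12, 9, 15, 22, 11, 14, 18, 13, 25]

# Cut the historical distribution into deciles (9 cut points).
deciles = quantiles(past_durations_weeks, n=10)

median_estimate = deciles[4]  # 50th percentile: the typical outcome
pessimistic = deciles[8]      # 90th percentile: allows for contingency

print(f"Outside-view median estimate: {median_estimate:.1f} weeks")
print(f"Outside-view 90th percentile: {pessimistic:.1f} weeks")
```

Reporting both a typical and a pessimistic percentile makes the forecast a range grounded in history rather than a single optimistic point estimate.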
Implementation intentions are specific plans that outline how, when, and where one will act. Several studies have shown that forming implementation intentions makes people more aware of the overall task and its possible outcomes; initially, this makes predictions even more optimistic.
However, forming implementation intentions is thought to “explicitly recruit willpower” by committing the individual to completing the activity. Participants who formed implementation intentions in these studies began working on the task sooner, experienced fewer interruptions, and showed less optimistic bias in subsequent predictions than those who did not. The reduction in optimistic bias was also found to be mediated by the reduction in interruptions.
Buffer Time Allocation
Strategic allocation of buffer time is another critical step in addressing the planning fallacy. Project planners should allocate extra time beyond what they initially think is necessary to account for unforeseen delays and issues. This can be done by adding a percentage of the original estimate as a buffer or by including additional weeks or months.
Assigning buffer time can help absorb the impact of unexpected events, ensuring that deadlines are met without a frantic rush, and possibly, with better quality outputs. Allocating this time is not an indication of poor planning but recognizing the inherent uncertainty in projects.
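The percentage-based buffer described above can be expressed as a one-line calculation; the helper name and the 25% default below are assumptions for illustration, not figures from the text.

```python
# Hypothetical helper: pad a base estimate with a contingency buffer
# to absorb unforeseen delays. The 25% default is an arbitrary choice;
# planners would calibrate it against historical overruns.
def with_buffer(base_estimate_days: float, buffer_fraction: float = 0.25) -> float:
    """Return the base estimate inflated by the buffer fraction."""
    return base_estimate_days * (1.0 + buffer_fraction)

print(with_buffer(40))  # a 40-day plan becomes a 50-day plan
```

A fixed fraction is the simplest scheme; projects with more uncertainty would warrant a larger buffer fraction.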
Time and Its Psychological Effects
When evaluating the impact of time on task performance, two critical concepts emerge: affective forecasting and the role of procrastination and deadlines. These psychological phenomena deeply influence how individuals predict and manage time for future tasks, often leading to underestimation and poor time management.
Affective forecasting is the process where individuals predict their future emotional states, which inevitably impacts their time predictions. Research shows that when people expect to feel good about completing a task, they are prone to underestimate the time required, anchoring too optimistically on favorable outcomes. This overly positive anticipation can skew time estimations for future tasks, disregarding potential obstacles.
Procrastination and Deadlines
Procrastination directly affects time management by delaying task initiation, which can create a disconnect between the planning and execution stages. Deadlines often serve as the only anchoring point, leading to last-minute efforts and rushed work. Here is a brief overview:
- Procrastination: Deliberate delay in starting or completing a task.
- Deadlines: Specified times by which a task must be completed.
Procrastination is a habitual tendency for some, but even those who typically manage time well can fall prey to it under certain conditions, such as when facing large, unstructured tasks. This behavior, coupled with the innate tendency to underestimate time, exacerbates the planning fallacy.
References
- Bent Flyvbjerg; Nils Bruzelius; Werner Rothengatter (2003). Megaprojects and Risk: An Anatomy of Ambition. Cambridge University Press. ISBN 978-0521009461
- Buehler, Roger; Griffin, Dale; Ross, Michael (1994). Exploring the “planning fallacy”: Why people underestimate their task completion times. Journal of Personality and Social Psychology. 67 (3): 366–381. doi: 10.1037/0022-3514.67.3.366
- Buehler, Roger; Griffin, Dale, & Ross, Michael (2002). Inside the planning fallacy: The causes and consequences of optimistic time predictions. In Thomas Gilovich, Dale Griffin, & Daniel Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment, pp. 250–270. Cambridge, UK: Cambridge University Press. doi: 10.1017/CBO9780511808098.016
- Koole, Sander; Van’t Spijker, Mascha (2000). Overcoming the planning fallacy through willpower: Effects of implementation intentions on actual and predicted task-completion times. European Journal of Social Psychology. 30 (6): 873–888.
- Jones, Larry R; Euske, Kenneth J (October 1991). Strategic misrepresentation in budgeting. Journal of Public Administration Research and Theory. 1 (4): 437–460.
- Kahneman, D., & Tversky, A. (1977). Intuitive Prediction: Biases and Corrective Procedures. Decision Research, Perceptronics
- Lovallo, Dan; Kahneman, Daniel (July 2003). Delusions of Success: How Optimism Undermines Executives’ Decisions. Harvard Business Review. 81 (7): 56–63
- Pezzo, Stephanie P.; Pezzo, Mark V.; Stone, Eric R. (2006). The social implications of planning: How public predictions bias future plans. Journal of Experimental Social Psychology. 42 (2): 221–227. doi: 10.1016/j.jesp.2005.03.001
- Roy, Michael M.; Christenfeld, Nicholas J. S.; McKenzie, Craig R. M. (2005). Underestimating the Duration of Future Events: Memory Incorrectly Used or Memory Bias?. Psychological Bulletin. 131 (5): 738–756
- Sanna, Lawrence J.; Parks, Craig D.; Chang, Edward C.; Carter, Seth E. (2005). The Hourglass Is Half Full or Half Empty: Temporal Framing and the Group Planning Fallacy. Group Dynamics: Theory, Research, and Practice. 9 (3): 173–188. doi: 10.1037/1089-2699.9.3.173