Planning fallacy


The planning fallacy, first proposed by Daniel Kahneman and Amos Tversky in 1979,[1][2] is a phenomenon in which predictions about how much time will be needed to complete a future task display an optimism bias: people underestimate the time needed.

This phenomenon occurs regardless of the individual's knowledge that past tasks of a similar nature have taken longer to complete than generally planned.[3][4][5] The bias affects only predictions about one's own tasks; when outside observers predict task completion times, they show a pessimistic bias, overestimating the time needed.[6][7] The planning fallacy requires both that predictions of a current task's completion time are more optimistic than beliefs about past completion times for similar projects, and that they are more optimistic than the actual time needed to complete the task. In 2003, Lovallo and Kahneman proposed an expanded definition: the tendency to underestimate the time, costs, and risks of future actions while at the same time overestimating the benefits of those same actions. Under this definition, the planning fallacy results not only in time overruns, but also in cost overruns and benefit shortfalls.[8]

Empirical evidence

For individual tasks

In a 1994 study, 37 psychology students were asked to estimate how long it would take to finish their senior theses. The average estimate was 33.9 days. They also estimated how long it would take "if everything went as well as it possibly could" (averaging 27.4 days) and "if everything went as poorly as it possibly could" (averaging 48.6 days). The average actual completion time was 55.5 days, with only about 30% of the students completing their thesis in the amount of time they predicted.[9]

Another study asked students to estimate when they would complete their personal academic projects. Specifically, the researchers asked for estimated times by which the students thought it was 50%, 75%, and 99% probable their personal projects would be done. Only 13% of participants finished their project by the time they had assigned a 50% probability, and only 45% had finished by the time of their 99% probability estimate.[7]

A survey of Canadian taxpayers, published in 1997, found that they mailed in their tax forms about a week later than they predicted. They had no misconceptions about their past record of getting forms mailed in, but expected that they would get it done more quickly next time.[10] This illustrates a defining feature of the planning fallacy: people recognize that their past predictions have been over-optimistic, while insisting that their current predictions are realistic.[6]

For group tasks

Carter and colleagues conducted three studies in 2005 that provide empirical support that the planning fallacy also affects predictions concerning group tasks. This research emphasizes how temporal frames and thoughts of successful completion contribute to the planning fallacy.[11]

Additional studies

Bent Flyvbjerg and Cass Sunstein argue that Albert O. Hirschman's Hiding Hand principle is the planning fallacy writ large, and they tested the empirical validity of the principle.[12] See the Further reading section below for additional studies.

Proposed explanations

Methods for counteracting

Segmentation effect

The segmentation effect is the phenomenon whereby the time allocated to a task as a whole is significantly smaller than the sum of the times allocated to its individual sub-tasks. In a study performed by Forsyth in 2008, this effect was tested to determine whether it could be used to reduce the planning fallacy, and in three experiments segmentation was shown to reduce the bias. However, segmentation demands a great deal of cognitive resources and is not very feasible to use in everyday situations.[18]
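To illustrate with hypothetical numbers (these figures are not from the cited study), the effect amounts to comparing a single holistic estimate for a task with the sum of estimates made separately for its sub-tasks:

```python
# Hypothetical time estimates (in days) for writing a report.
whole_task_estimate = 10.0  # a single, holistic estimate for the entire task

# The same task, estimated piece by piece.
subtask_estimates = {
    "outline": 2.0,
    "first draft": 6.0,
    "revisions": 3.5,
    "formatting and references": 1.5,
}

segmented_total = sum(subtask_estimates.values())  # 13.0 days
# The segmented total exceeds the holistic estimate, partially
# counteracting the optimism of the single whole-task estimate.
```
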

Implementation intentions

Implementation intentions are concrete plans that specify how, when, and where one will act. It has been shown through various experiments that forming implementation intentions helps people become more aware of the overall task and see all possible outcomes. Initially, this actually causes predictions to become even more optimistic. However, it is believed that forming implementation intentions "explicitly recruits willpower" by having the person commit to completing the task. Participants who had formed implementation intentions began work on the task sooner and experienced fewer interruptions, and their later predictions showed less optimistic bias than those of participants who had not. The reduction in optimistic bias was found to be mediated by the reduction in interruptions.[5]

Reference class forecasting

Reference class forecasting predicts the outcome of a planned action based on the actual outcomes of a reference class of similar past actions.
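A minimal sketch of the idea, with hypothetical figures: collect the actual/estimated outcome ratios from comparable past projects, then scale the new inside-view estimate by the ratio at a chosen confidence level. The percentile rule below is one simple way to pick the uplift, not the specific procedure used in the cited work.

```python
def reference_class_forecast(estimate, past_ratios, confidence=0.8):
    """Adjust an inside-view estimate using the empirical distribution of
    actual/estimated outcome ratios from a reference class of past projects.

    Returns the estimate scaled by the uplift ratio at the requested
    confidence level, i.e. a ratio that most past projects stayed within.
    """
    ranked = sorted(past_ratios)
    # Index of the ratio covering roughly `confidence` of past projects.
    idx = min(len(ranked) - 1, int(confidence * len(ranked)))
    return estimate * ranked[idx]

# Hypothetical reference class: actual cost / estimated cost for ten projects.
past_ratios = [0.95, 1.0, 1.1, 1.15, 1.2, 1.3, 1.4, 1.5, 1.8, 2.2]

# An initial $7M inside-view estimate, adjusted at 80% confidence.
adjusted = reference_class_forecast(7_000_000, past_ratios, confidence=0.8)
```

Here the outside view (the distribution of past overruns) replaces the planner's own optimistic inside view; the higher the required confidence, the larger the uplift applied.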

Real world examples

The Sydney Opera House

The Sydney Opera House was expected to be completed in 1963. A scaled-down version opened in 1973, a decade later. The original cost was estimated at $7 million, but its delayed completion led to a cost of $102 million.[11]

Eurofighter Typhoon

The Eurofighter Typhoon defense project took six years longer than expected, with a cost overrun of €8 billion.[11]

Boston's Central Artery/Tunnel

The Boston Central Artery/Tunnel Project was completed seven years later than planned, at an additional cost of $12 billion.[19]

Denver International Airport

The Denver International Airport opened sixteen months later than scheduled, with a total cost of $4.8 billion, over $2 billion more than expected.[20]

See also

Notes

  1. Pezzo, Mark V.; Litman, Jordan A.; Pezzo, Stephanie P. (2006). "On the distinction between yuppies and hippies: Individual differences in prediction biases for planning future tasks". Personality and Individual Differences 41 (7): 1359–1371. doi:10.1016/j.paid.2006.03.029. ISSN 0191-8869.
  2. Kahneman, Daniel; Tversky, Amos (1979). "Intuitive prediction: biases and corrective procedures". TIMS Studies in Management Science 12: 313–327.
  3. "Exploring the Planning Fallacy" (PDF). Journal of Personality and Social Psychology. 1994. Retrieved 7 November 2014.
  4. "If you don't want to be late, enumerate: Unpacking Reduces the Planning Fallacy". Journal of Experimental Social Psychology. 15 October 2003. Retrieved 7 November 2014.
  5. "Overcoming the Planning Fallacy Through Willpower". European Journal of Social Psychology. November 2000. Retrieved 22 November 2014.
  6. Buehler, Roger; Griffin, Dale; Ross, Michael (2002). "Inside the planning fallacy: The causes and consequences of optimistic time predictions". In Thomas Gilovich, Dale Griffin, & Daniel Kahneman (Eds.), Heuristics and Biases: The Psychology of Intuitive Judgment, pp. 250–270. Cambridge, UK: Cambridge University Press.
  7. Buehler, Roger; Griffin, Dale; Ross, Michael (1995). "It's about time: Optimistic predictions in work and love". European Review of Social Psychology (American Psychological Association) 6: 1–32. doi:10.1080/14792779343000112.
  8. Lovallo, Dan; Kahneman, Daniel (July 2003). "Delusions of Success: How Optimism Undermines Executives' Decisions". Harvard Business Review: 56–63.
  9. Buehler, Roger; Griffin, Dale; Ross, Michael (1994). "Exploring the "planning fallacy": Why people underestimate their task completion times". Journal of Personality and Social Psychology (American Psychological Association) 67 (3): 366–381. doi:10.1037/0022-3514.67.3.366.
  10. Buehler, Roger; Griffin, Dale; Peetz, Johanna (2010). "The Planning Fallacy: Cognitive, Motivational, and Social Origins" (PDF). Advances in Experimental Social Psychology (Academic Press) 43: 9. doi:10.1016/s0065-2601(10)43001-4. Retrieved 15 September 2012.
  11. "The Hourglass Is Half Full or Half Empty: Temporal Framing and the Group Planning Fallacy". Group Dynamics: Theory, Research, and Practice. September 2005. Retrieved 22 November 2014.
  12. Flyvbjerg, Bent; Sunstein, Cass R. (2015). "The Principle of the Malevolent Hiding Hand; or, the Planning Fallacy Writ Large". Rochester, NY.
  13. Pezzo, Stephanie P.; Pezzo, Mark V.; Stone, Eric R. (2006). "The social implications of planning: How public predictions bias future plans". Journal of Experimental Social Psychology: 221–227.
  14. "Underestimating the Duration of Future Events: Memory Incorrectly Used or Memory Bias?". American Psychological Association. September 2005. Retrieved 21 November 2014.
  15. "Focalism: A source of durability bias in affective forecasting". American Psychological Association. May 2000. Retrieved 21 November 2014.
  16. Jones, Larry R.; Euske, Kenneth J. (October 1991). "Strategic misrepresentation in budgeting". Journal of Public Administration Research and Theory (Oxford University Press) 1 (4): 437–460. Retrieved 11 March 2013.
  17. Taleb, Nassim (2012-11-27). Antifragile: Things That Gain from Disorder. ISBN 978-1-4000-6782-4.
  18. "Allocating time to future tasks: The effect of task segmentation on planning fallacy bias". Memory & Cognition. June 2008. Retrieved 7 November 2014.
  19. "No Light at the End of his Tunnel: Boston's Central Artery/Third Harbor Tunnel Project". Project on Government Oversight. 1 February 1995. Retrieved 7 November 2014.
  20. "Denver International Airport" (PDF). United States General Accounting Office. September 1995. Retrieved 7 November 2014.

References

Further reading

This article is issued from Wikipedia (version of Wednesday, April 06, 2016). The text is available under the Creative Commons Attribution-ShareAlike license; additional terms may apply for media files.