416. Neglecting the risks of a project

There is a well-documented trend for people to neglect downside risks when developing and evaluating a new project. This is part of a general tendency for people to be overly optimistic about new projects, including over-stating the likely benefits, under-stating the costs, and neglecting risks that could cause the project to fail. 

Kahneman calls this tendency the “planning fallacy”, but more commonly it’s referred to as “optimism bias”, which to me seems a better description of it.

I’ve observed optimism bias in action many times, particularly when I’ve reviewed the assumptions people have made for a Benefit-Cost Analysis of a potential new project. I’ve done many such reviews, and in almost every case I’ve identified assumptions that seem too optimistic, sometimes much too optimistic. The probability that the project in question might fail is often set at a level that seems too low, particularly when you look at how often projects do fail in practice.
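To see why an under-stated failure probability matters so much, here is a minimal sketch of how that probability feeds into a Benefit-Cost Ratio. All of the numbers are hypothetical, chosen purely for illustration.

```python
# Minimal sketch: how the assumed probability of project failure feeds into
# a Benefit-Cost Ratio. All figures are hypothetical, for illustration only.

def expected_bcr(benefit, cost, p_fail):
    """Expected BCR when the project either succeeds fully or fails completely
    (a simplification; partial failure is also possible in practice)."""
    expected_benefit = (1 - p_fail) * benefit
    return expected_benefit / cost

# An optimistic analyst assumes only a 5% chance of failure...
optimistic = expected_bcr(benefit=3.0, cost=2.0, p_fail=0.05)

# ...but base rates for similar projects might suggest something like 40%.
realistic = expected_bcr(benefit=3.0, cost=2.0, p_fail=0.40)

print(f"Optimistic BCR: {optimistic:.2f}")  # 1.42 -> looks worthwhile
print(f"Realistic BCR:  {realistic:.2f}")   # 0.90 -> fails the test
```

The same project can look clearly worthwhile or clearly not, depending on nothing more than the assumed failure probability.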

As an agricultural example, suppose that Dave the risk-averse farmer and a group of his farmer friends are considering investing in a new processing plant in the wheatbelt to turn lupin seeds into flour for human consumption. They are inspired by the idea of adding value to the seeds they produce, and by the potential to create jobs in their local town. They prepare a business plan based on what they consider to be realistic assumptions, and this indicates that the new plant will be highly profitable.

However, when a business consultant comes in and starts asking hard questions, their excitement starts to fade. It turns out that they’ve failed to consider a set of risks that could easily affect their project, perhaps including the following.

  • The equipment that they are planning to use was developed for other grains and might not work as well for lupins.
  • The initial market research that has been done might not translate into actual demand for the product. After all, lupin flour is quite different from the flours that consumers are used to.
  • The prices of other grains might increase, causing farmers to substitute their production away from lupins, meaning that there is not sufficient supply of lupin seed.
  • It might prove difficult to get sufficient labour to operate the plant in the small town where they are planning to set up the plant.
  • They might not be able to attract an experienced and skilled manager to live in the town.
  • The local water supply, which comes from a small local groundwater aquifer, might not be able to cope with the increased usage of water required for the plant, resulting in additional costs to supply water in some other way.

Kahneman provides some remarkable examples of optimism bias in action. I’ve quoted the following two examples quite often.

  • In 1997, the budget for a new Scottish Parliament building was set at £40 million. As the building was constructed over the next seven years, the actual cost blew out and the budget increased almost every year until finally the project cost £431 million. The original decision to proceed with the project was based on costs that were too small by a factor of more than 10.
  • In 2005, a study of all rail projects conducted worldwide over 30 years found that planners had over-estimated passenger numbers by an average of 106% and under-estimated project costs by 45%. On average, decisions about these projects were based on assumptions that produced a Benefit-Cost Ratio that was more than four times too high.
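The scale of the first example is easy to check with simple arithmetic, using the budget and final cost quoted above.

```python
# Scottish Parliament building: original budget vs final cost, in millions
# of pounds (figures as quoted in the text above).
budget = 40
actual = 431

overrun_factor = actual / budget
print(f"Actual cost was {overrun_factor:.1f} times the original budget")
# -> 10.8 times, consistent with "too small by a factor of more than 10"
```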

From my experience, it seems that people who are developing a new project become personally invested in it. They get excited about the prospects and they really want it to succeed. They also feed off each other – if everybody else around the table is excited about the project, it’s easier to get excited yourself.

They may also be affected by “anchoring”, the phenomenon where, once people have heard an estimate for a number, their own estimate for that number tends to be closer to the original estimate than it would have been if they’d never heard that estimate. If someone makes an optimistic statement about the project, it becomes harder for others to be realistic about the risks of failure. Anchoring seems to be a phenomenon to which we are highly prone, and it is quite difficult to counter.

Strategies to reduce optimism bias could include the following.

  • Explicitly focus on the risks that the project could fail. Brainstorm possible reasons for failure.
  • Look at examples of other similar projects and learn what has gone wrong. What proportion of them failed?
  • Bring in an external reviewer who isn’t so committed to the success of the project and get them to review your assumptions about the project.
  • Try to avoid anchoring. For example, if multiple people are making estimates of benefits, costs or risks, get them to do so independently, without seeing what the other people are coming up with. Then share the results, discuss reasons for variation within the group of people, and allow them to update their estimates. This is the approach taken in the IDEA protocol for eliciting information from a group of experts (Hemming et al. 2017).
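The independent-then-discuss step can be sketched as a simple two-round elicitation. The structure below follows the description above (IDEA stands for "Investigate, Discuss, Estimate, Aggregate"); the expert names, the numbers, and the unweighted-mean aggregation rule are illustrative assumptions, not a full implementation of the published protocol.

```python
# Sketch of a two-round elicitation in the spirit of the IDEA protocol.
# Names, numbers and the averaging rule are illustrative assumptions only.

# Round 1: each expert estimates the probability of project failure
# independently, without seeing anyone else's number (to avoid anchoring).
round1 = {"Ann": 0.10, "Ben": 0.35, "Cass": 0.20}

# The group then discusses the reasons for the spread in the estimates,
# after which each expert privately revises their own estimate.
round2 = {"Ann": 0.18, "Ben": 0.30, "Cass": 0.22}

# Aggregate the revised estimates; a simple unweighted mean is used here.
aggregate = sum(round2.values()) / len(round2)
print(f"Group estimate of failure probability: {aggregate:.2f}")
```

The key design choice is that the first round is blind: the discussion happens only after everyone has committed to an initial number, so no single optimistic voice can anchor the whole group.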

Further reading

Kahneman, D. (2011). Thinking, Fast and Slow, Farrar, Straus and Giroux, New York.

Hemming, V., Burgman, M.A., Hanea, A.M., McBride, M.F., and Wintle, B.C. (2017). A practical guide to structured expert elicitation using the IDEA protocol, Methods in Ecology and Evolution 9, 169-180. Full paper

This is #10 in my RiskWi$e series. Read about RiskWi$e here or here.

The RiskWi$e series:

405. Risk in Australian grain farming
406. Risk means probability distributions
408. Farmers’ risk perceptions
409. Farmers’ risk preferences
410. Strategic decisions, tactical decisions and risk
412. Risk aversion and fertiliser decisions
413. Diversification to reduce risk
414. Intuitive versus analytical thinking about risk
415. Learning about the riskiness of a new farming practice
416. Neglecting the risks of a project (this post)
418. Hedging to reduce crop price risk
419. Risk premium
420. Systematic decision making under risk