Economics, Policy

354. BCA criticisms 1: “any result you want”

Over the years, I’ve had a number of conversations with people who made negative comments about Benefit: Cost Analysis (BCA) or certain aspects of it. In addition, there are various published critiques of BCA. In my view, some of the criticisms offered are not unreasonable, but some are off the beam. If you are doing BCAs, it is worth understanding the criticisms you are likely to encounter, so that you are prepared for conversations about them and know what to do (if anything) to address them. (There is a video version of the post below, if you prefer.)

I thought it would be useful to do a series of occasional posts about criticisms of BCA. If confronted with a criticism, how would I respond?

The criticism for today is the comment that BCA can generate any result you want, so no BCA result can be trusted.

There is some truth in the first half of this comment. There clearly is potential for a BCA to be manipulated to get a preferred result, and it does happen sometimes. If the process is not done with integrity, BCA can be a mechanism for turning preconceptions into foregone conclusions.

On the other hand, this situation is not specific to BCA. Any type of analysis of a decision can be manipulated if that’s what the people involved want: Multi-Criteria Analysis, Decision Analysis, Risk Analysis and Stakeholder Analysis are all just as susceptible to this criticism.

So it is not really a criticism of BCA itself. It’s up to the analysts and the decision makers to take steps to protect the integrity of the analysis. Any useful tool can be misused. If that happens, criticise the people rather than the tool.

Here are some strategies and techniques that can contribute to producing a good BCA that can be defended and is clearly not generating a pre-determined result.

The fundamental issue is that the people involved need to have a commitment to producing an honest, balanced and informative BCA. Ideally, the decision makers would signal to the analysts that they want this. That way, the people involved are more likely to exercise the required due diligence. Clearly, this doesn’t always happen. Sometimes, decision makers have preconceptions or preferences and they are not particularly interested in testing them against an honest analysis. If they are interested in getting a BCA done, it might just be because they expect it to confirm their preconceptions, providing support for a decision they want to make or have already made. This could be the fundamental source of the criticism that this post is about.

Where people want a particular result from a BCA, most commonly what they want is a strongly positive result. This, combined with the well-known tendency for people to be overly optimistic about projects they are responsible for (the “planning fallacy”), means we need strategies to reduce the chances that benefits will be exaggerated or costs and risks understated.

One key strategy is transparency. The analyst needs to be transparent about what numbers have been used in the analysis, the sources of the numbers, what judgements and adjustments were made to the numbers and why, and the quality of the information that has been used. Then, if the analyst has been too optimistic, this can be detected and feedback provided. Transparency is directly useful to the decision makers who genuinely want to make a good decision, but it is also essential to facilitate an independent review of the analysis.

In my view, independent review is essential for any BCA that is being used to inform an important decision. The reviewer(s) should look at the various items in my list of things that need to be transparent, compare the methods used with established best practice, and provide feedback. The analyst should then modify the BCA accordingly, or provide strong reasons for not doing so. The decision makers should look at the reviews and the analysts’ responses to help them form a judgement about the quality and reliability of the work.

The analyst should aspire to use the best available information to feed into the BCA. Some effort should be made to find the best information, rather than quickly being satisfied with poorly informed estimates. The best information may not be all that strong, and if so, we may still use it, but we need to explain why it is the best available.

One solution to over-optimism in project planning suggested by Daniel Kahneman (who researched the phenomenon and named it the planning fallacy) is to take the “outside view”. This means basing estimates on the observed benefits and costs of similar projects that have already been completed, rather than relying solely on projections built up from within the current project. See Flyvbjerg (2013) for more on this.
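As a rough illustration of how the outside view might be operationalised in a BCA spreadsheet or script, here is a minimal sketch in Python (not from the original post). It adjusts an inside-view cost estimate using the distribution of cost overruns observed in a reference class of completed, similar projects. The overrun ratios, costs and percentile choice are hypothetical assumptions for illustration only.

```python
# Illustrative sketch only: applying the "outside view" (reference class
# forecasting) to a cost estimate. All numbers are hypothetical.

import statistics

# Ratios of actual cost to estimated cost for a reference class of
# similar, completed projects (hypothetical values).
reference_overruns = [1.05, 1.20, 1.35, 1.10, 1.50, 1.25, 0.95, 1.40]

inside_view_cost = 10.0  # our own (inside-view) cost estimate, $ million

# Adjust the inside-view estimate by the median overrun of the reference class.
median_uplift = statistics.median(reference_overruns)
outside_view_cost = inside_view_cost * median_uplift

# A more cautious decision maker might use a higher percentile of the
# reference-class distribution instead of the median.
p80_uplift = statistics.quantiles(reference_overruns, n=10)[7]  # ~80th percentile
cautious_cost = inside_view_cost * p80_uplift

print(f"Inside-view cost estimate: ${inside_view_cost:.1f}m")
print(f"Outside-view (median):     ${outside_view_cost:.1f}m")
print(f"Outside-view (~80th pct):  ${cautious_cost:.1f}m")
```

The same approach can be applied to benefit estimates, using observed ratios of realised to predicted benefits for comparable completed projects.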

If the lack of information is too extreme and the decision is not clear cut, we could recommend investing in the collection of better information before committing to a decision.

Another key to avoiding manipulation of results is good-quality sensitivity analysis. It should be done in a way that helps the decision makers get a feel for how robust the conclusions drawn from the BCA are to changes in the key assumptions and parameters.
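To make this concrete, here is a minimal sketch (in Python, with invented numbers, not from the original post) of a simple one-way sensitivity analysis: recompute the benefit:cost ratio while varying one key parameter at a time across a plausible range, and see whether the ratio stays above 1. The cash flows and parameter ranges are assumptions for illustration only.

```python
# Minimal one-way sensitivity analysis sketch for a BCA. All values are
# invented for illustration.

def npv(flows, rate):
    """Net present value of a list of annual flows, year 0 first."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

def bcr(annual_benefit, upfront_cost, rate, years=20):
    """Benefit:cost ratio: PV of a constant annual benefit (years 1..years)
    divided by an upfront cost incurred in year 0."""
    benefits = [0.0] + [annual_benefit] * years
    return npv(benefits, rate) / upfront_cost

# Base-case assumptions (hypothetical, $ million and discount rate)
base = dict(annual_benefit=1.5, upfront_cost=10.0, rate=0.05)

# Plausible low/high values for each key parameter (hypothetical ranges)
ranges = {
    "annual_benefit": (1.0, 2.0),
    "upfront_cost": (8.0, 14.0),
    "rate": (0.03, 0.08),
}

print(f"Base-case BCR: {bcr(**base):.2f}")
for param, (low, high) in ranges.items():
    for value in (low, high):
        scenario = dict(base, **{param: value})
        print(f"{param} = {value:<5}  BCR = {bcr(**scenario):.2f}")
```

Presenting results like these (often as a table or a tornado-style chart) lets decision makers see which assumptions the conclusion actually hinges on, and which ones barely matter.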

It seems obvious that delivering these strategies will require sufficient time and resources. Without these, the chances of the analyst producing a high-quality BCA will be greatly diminished. It is up to the decision makers who commission a BCA to be realistic about the time and resources required to do a good job.

Other BCA Criticisms

  1. Any result you want
  2. Too much uncertainty
  3. Discounting is bad
  4. Not fair
  5. Money isn’t everything

Video version of this blog post

Playlist of videos for all five BCA Criticisms

Brief introductory course on BCA

If you are new to BCA, here is a very introductory course, in 10 videos totalling 1 hour 14 minutes.

Comprehensive set of courses on Applied BCA

I invite you to consider enrolling in my three 4-week online courses on Applied Benefit: Cost Analysis, covering The Essentials, Measuring Benefits and Practical Issues. Build your BCA expertise and gain the practical skills you need to undertake a complex Benefit: Cost Analysis. Online: high-quality video lectures and interviews, live workshops. No existing economics background required.

Discounts are available for bulk enrolments and for enrolments from selected countries. Bursaries (with a discount of 90%) are also available for enrolments from selected countries.

For information on content, pricing, and how to apply for discounts or a bursary, download this flier: Applied BCA Flier v8. For details of the content of each course, see PD385.

The courses run starting in late February and late July each year. To enrol, click here and go to the Business and Commerce section.

“A fantastic program and course. It is by far the best course I have been involved in. I learned so much and there is still lots to learn which you have shared with links so I can return to refresh/learn as needed.”

“Essential for new BCA users.”

“Taught in an engaging way with many real-world examples.”

Further reading

Flyvbjerg, B. (2013). Quality control and due diligence in project management: Getting decisions right by taking the outside view. International Journal of Project Management 31(5): 760-774. Full paper

Pannell, D.J. (1997). Sensitivity analysis of normative economic models: Theoretical framework and practical strategies. Agricultural Economics 16: 139-152. Full paper (100 K)