One of your critical projects is behind schedule and over budget. In a review meeting, your boss asks if it’s worth replacing the project manager or moving to a new technology platform to minimize the cost overrun and increase the likelihood of completing on time. After quick deliberation, you make a recommendation to leave the project manager and platform in place.
Our brains are constantly required to make complex decisions like this one rapidly. Despite the massive amounts of computing power, data and analytics available, many organizational leaders continue to base critical business decisions on intuition and speed rather than robust analysis. Pressed to decide quickly to grow their business, they become overwhelmed, reach the limits of their ability to process information and trade-offs, and fall back on their "gut feeling."
The problem with averages
According to Jim Hare, research director at Gartner, even business executives who embrace data-driven decision making often rely on simple measurements like point estimates and averages. The problem with using averages is that uncertain outcomes cannot be represented by a single number. Instead, they require a probability distribution with a certain shape and range of possibilities — some outcomes being more likely than others, and some estimates commanding more confidence than others.
Mr. Hare points to the 1997 Red River Flood in North Dakota and Minnesota as an example of how advanced analytics and robust analysis could have helped prevent disaster. The U.S. National Weather Service (NWS) forecast that the river level would peak at 49 feet (the same height as the dike built using a 100-year average). This single number lured people living in nearby towns into a false sense of security, but the level rose to 54 feet and caused a major disaster.
Advanced analytics, such as probability simulation, would have shown that the flood crest was not an absolute number but a range of numbers — a distribution of the likelihood of occurrence (frequency) rather than a single value. These advanced analytics programs simulate uncertainty, generating thousands of possible values for a given scenario, replacing the low-resolution "snapshot" of a single average number with a high-resolution "movie" showing a whole range of possible outcomes.
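The kind of probability simulation described above can be sketched in a few lines of Python. This is a minimal, purely illustrative Monte Carlo example: the 49-foot forecast and dike height come from the article, while the normal (bell-curve) shape and the 3-foot spread are assumptions made only for demonstration.

```python
import random

random.seed(42)  # fixed seed so the illustration is repeatable

# Assumed for illustration: crest uncertainty modeled as a normal
# distribution around the 49 ft point forecast with a 3 ft spread.
FORECAST_MEAN_FT = 49.0
ASSUMED_SPREAD_FT = 3.0
DIKE_HEIGHT_FT = 49.0  # the dike was built to the 100-year average
TRIALS = 100_000

# Monte Carlo simulation: draw thousands of possible crest values,
# the "movie" of outcomes rather than a single "snapshot" number.
crests = [random.gauss(FORECAST_MEAN_FT, ASSUMED_SPREAD_FT)
          for _ in range(TRIALS)]

# How often does the simulated crest top the dike?
p_exceed = sum(c > DIKE_HEIGHT_FT for c in crests) / TRIALS
print(f"Probability crest exceeds {DIKE_HEIGHT_FT} ft dike: {p_exceed:.1%}")
```

Under these assumptions the simulation makes the article's point starkly: a dike built exactly to the point forecast is overtopped in roughly half of the simulated outcomes, a risk the single average number completely hides.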
"Just as important is that leaders understand how to use probability distributions to reduce risk in making their decisions and avoid the temptation to use a single number — the dreaded 'average,'" said Mr. Hare. "Just because a possible outcome has a lower probability doesn't mean it can't occur. In the case of the Red River Flood, NWS should have communicated a probability distribution with the range of possible river levels (rather than an average) and flood management officials should have built their preparation plans for the lower probability, but very possible, higher crest."
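Planning for the "lower probability, but very possible" outcome that Mr. Hare describes amounts to reading a high percentile off the simulated distribution instead of its average. Below is a minimal sketch, again assuming (purely for illustration) a normal distribution around the 49-foot forecast with a 3-foot spread.

```python
import random
from statistics import quantiles

random.seed(7)  # fixed seed so the illustration is repeatable

# Assumed for illustration: simulate 100,000 possible flood crests
# around the 49 ft point forecast with a 3 ft spread.
crests = [random.gauss(49.0, 3.0) for _ in range(100_000)]

avg_crest = sum(crests) / len(crests)

# Plan to the 99th percentile: the crest level that only 1% of
# simulated outcomes exceed, rather than to the average.
p99_crest = quantiles(crests, n=100)[98]

print(f"Average crest:        {avg_crest:.1f} ft")
print(f"99th-percentile crest: {p99_crest:.1f} ft")
```

The gap between the two numbers is the margin of safety that planning to the average silently gives up; a preparation plan built to the 99th percentile explicitly budgets for the unlikely-but-possible high crest.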
The practice of advanced analytics is less about data and more about reducing the risk of decisions. While many organizations are still moving from basic reporting to predictive analytics, the next wave of investments will be in prescriptive analytics to improve decision making. However, analytics and simulation alone will not be enough — the decision process will also have to change, with people learning new skills and new ways to make decisions. The transformation must be organizational, as well as technological, and the change will have to come from the top.