Deciding How to Decide
Like so much in business, the conventional wisdom about decision-making is not driven by science, but by anecdote. The result? In a survey of 2,207 executives, McKinsey found few believed their company’s decision-making was good:
"Only 28% said that the quality of strategic decisions in their companies was generally good, 60% thought that bad decisions were about as frequent as good ones, and the remaining 12% thought good decisions were altogether infrequent. Our candid conversations with senior executives behind closed doors reveal a similar unease with the quality of decision-making and confirm the significant body of research indicating that cognitive biases affect the most important strategic decisions made by the smartest managers in the best companies. Mergers routinely fail to deliver the expected synergies. Strategic plans often ignore competitive responses. And large investment projects are over budget and over time — over and over again."
If those sentiments are valid, the cost of bad decisions is unimaginably large.
Let's look at some of the myths about decision-making, and explore what behavioral science says.
You May Think You're Making Rational Decisions
At a 2016 conference, Daniel Kahneman, the ‘father’ of behavioral economics and Nobel Laureate, summed up the state of decision-making in organizations:
"You look at large organizations that are supposed to be optimal, rational. And the amount of folly in the way these places are run, the stupid procedures that they have, the really, really poor thinking you see all around you, is actually fairly troubling."
Contrast the rational process people believe they follow with the actual mess of routine decision-making, and it becomes clear why so many decisions turn out poorly. Again, Kahneman:
"We’re fundamentally over-confident in the sense that we jump to conclusions — and to complete, coherent stories — to create interpretations. So we misunderstand situations, spontaneously and automatically. And that’s very difficult to control."
Some of that messiness is due to cognitive biases, such as the bias toward action. In a famous example, a group of economics researchers studied goalkeepers' actions during penalty kicks, when they must decide whether to leap left or right, or stay in the middle of the goal.
"Given the probability distribution of kick direction, the optimal strategy for goalkeepers is to stay in the goal’s center. Goalkeepers, however, almost always jump right or left."
The researchers concluded that goalkeepers have collectively fallen prey to a bias toward action: because jumping is the norm, they feel worse when a goal is scored while they stand still than when one is scored while they jump.
This is, as the researchers noted, "particularly striking since the goalkeepers have huge incentives to make correct decisions, and it is a decision they encounter frequently."
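To see why staying put can be the better strategy even though goalkeepers rarely choose it, here is a minimal expected-value sketch in Python. The kick-direction distribution and save probabilities below are illustrative assumptions for the sake of the arithmetic, not the study's actual figures:

```python
# Toy expected-value check of the goalkeeper "action bias" example.
# All probabilities here are hypothetical, chosen only to illustrate
# how staying in the center can maximize expected saves.

# Assumed P(kick direction)
kick_dist = {"left": 0.4, "center": 0.3, "right": 0.3}

# Assumed P(save | goalkeeper action, kick direction): staying put
# saves most center kicks; a dive saves only some same-side kicks.
save_prob = {
    "stay":       {"left": 0.0, "center": 0.6, "right": 0.0},
    "dive_left":  {"left": 0.3, "center": 0.1, "right": 0.0},
    "dive_right": {"left": 0.0, "center": 0.1, "right": 0.3},
}

def expected_save(action):
    """Expected probability of a save for a given goalkeeper action."""
    return sum(kick_dist[d] * save_prob[action][d] for d in kick_dist)

for action in save_prob:
    print(action, round(expected_save(action), 3))
```

Under these assumed numbers, staying in the center yields the highest expected save rate, yet the bias toward action pushes goalkeepers to dive anyway.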
Note that this research has made no difference to what professional goalkeepers do. And action bias is only one of a long list of biases that make decision-making a minefield.
Some of the messiness is attributable to what Kahneman calls ‘noise’:
"A lot of the errors that people make is simply noise, in the sense that it’s random, unpredictable, it cannot be explained."
In a Harvard Business Review article Kahneman and colleagues describe noise in this way:
"Professionals in many organizations are assigned arbitrarily to cases: appraisers in credit-rating agencies, physicians in emergency rooms, underwriters of loans and insurance, and others. Organizations expect consistency from these professionals: Identical cases should be treated similarly, if not identically. The problem is that humans are unreliable decision makers; their judgments are strongly influenced by irrelevant factors, such as their current mood, the time since their last meal, and the weather. We call the chance variability of judgments noise. It is an invisible tax on the bottom line of many companies."
Noise is what leads professional insurance underwriters' assessments to vary by as much as 40%, and why we are advised to seek a second opinion on a medical diagnosis. Results vary based on who is making the decision and their state of mind at the time.
But bias and noise aren’t the only barriers to effective decision-making. A number of myths are built into business culture that act as pitfalls, making it harder to find good outcomes to complex problems. Here are three of the most common myths.
Myth #1: Fast Beats Slow
Perhaps the most common myth is that it is better to make decisions quickly.
This quote from Paul Rogers and Marcia Blenko exemplifies the conventional wisdom that conflates quick decision-making with optimal decision-making:
"Making good decisions and making them happen quickly are the hallmarks of high-performing organizations. When we surveyed executives at 350 global companies about their organizational effectiveness, only 15% said that they have an organization that helps the business outperform competitors. What sets those top performers apart is the quality, speed, and execution of their decision making."
This myth is so deeply wired into business culture, and so hard to counter, that I've dedicated the third post in this series, "You Have To Go Slow To Go Fast," to examining it in detail.
The key idea to take away is that we are wired to make decisions emotionally, on the fly, and then to erect a story to justify the decision. We believe our stories are rational when science shows we aren’t. The fundamental solution is to create ‘choice structures’ that slow decision-making. As Pronita Mehrotra, Anu Arora and Sandeep Krishnamurthy describe it,
"While mantras like 'move fast and break things' can help push people towards action, they can backfire when the underlying problem is complex. In such situations, resisting the temptation to find a solution quickly (and often less creatively), and instead urging the team (much to their frustration) to keep searching for more ideas can lead to more innovative and far-reaching solutions .... To avoid premature closure, teams should arrive at an 'almost final' decision and then intentionally delay action in favor of additional incubation time."
To outsmart our foolish brains, we need to keep the process of deciding open as long as more information, more incubation of ideas, and more examination of alternatives are possible. Only then should we make a final decision. Even when decisions must be made quickly, as in emergencies, at the very least run through a checklist of minimal steps. For example, a quick sanity check: is this decision reversible? If so, it's relatively low risk. If instead the decision is a one-way door into an unknown future, proceed cautiously and use whatever time is available to slow the decision down.
Myth #2: Brainstorming Helps
Brainstorming is one of the most overhyped activities in business, and is not a great answer to the questions surrounding decision-making. As Tomas Chamorro-Premuzic summarizes:
"After six decades of independent scientific research, there is very little evidence for the idea that brainstorming produces more or better ideas than the same number of individuals would produce working independently. In fact, a great deal of evidence indicates that brainstorming actually harms creative performance, resulting in a collective performance loss that is the very opposite of synergy."
'Minimal brainstorming,' in which people work independently before the group brainstorms together, does modestly improve outcomes over purely individual work. Otherwise, brainstorming is a poor onramp to decision-making.
Myth #3: Adding More People Helps, Especially Smart People
One of the most counterintuitive findings, in my experience, is this: adding more intelligent people to a group does not lead to better problem-solving, but adding 'socially sensitive' people does:
"Collective intelligence, the researchers believe, stems from how well the group works together. For instance, groups whose members had higher levels of 'social sensitivity' were more collectively intelligent. 'Social sensitivity has to do with how well group members perceive each other's emotions,' says Christopher Chabris, a co-author and assistant professor of psychology at Union College in New York. 'Also, in groups where one person dominated, the group was less collectively intelligent than in groups where the conversational turns were more evenly distributed,' adds Woolley. And teams containing more women demonstrated greater social sensitivity and in turn greater collective intelligence compared to teams containing fewer women."
Simply adding more people doesn't help, either:
"If your group is trying to answer a relatively simple question that has one definitive answer, having a large number of people around the table can be helpful. But when teams are asked to make more complicated decisions, simply adding more members won’t help, [psychologist Mirta] Galesic says. To figure out the optimal size of a team when trying to tackle a mix of simple and complex questions, it’s important to know the expertise of the people in the room. If they are very good on most tasks, but occasionally make errors, then — statistically — the right number of people is often between three and fifteen, she says. After that, the returns will diminish. 'The accuracy of the group’s majority will increase quickly on easy tasks and decrease slowly on difficult tasks,' Galesic says. After a certain point, 'the larger the group size, the accuracy can only fall. You’ll reach a peak at some point and there’s no more return to adding experts.'"
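Galesic's statistical point can be illustrated with a small Condorcet-style calculation: if each member independently gets the right answer with probability p, a majority vote becomes more accurate as the group grows when p is above 0.5 (easy tasks) and less accurate when p is below 0.5 (hard tasks). This is a simplified model, not Galesic's actual analysis, and the probabilities are assumed for illustration:

```python
from math import comb

def majority_accuracy(p, n):
    """P(the majority of n independent voters is correct), where each
    voter is correct with probability p. Assumes odd n, so no ties."""
    k_min = n // 2 + 1  # smallest number of correct voters forming a majority
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_min, n + 1))

for n in (1, 3, 15, 51):
    easy = majority_accuracy(0.7, n)  # easy task: individuals usually right
    hard = majority_accuracy(0.4, n)  # hard task: individuals usually wrong
    print(n, round(easy, 3), round(hard, 3))
```

Under this model, accuracy on the easy task climbs quickly with group size while accuracy on the hard task steadily falls, which matches the intuition that piling more people onto a difficult decision yields diminishing or even negative returns.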
Put The Myths Aside
So we have learned the lessons about noise and bias. We have internalized the latent danger in various myths — deeply ingrained preconceptions — about how to go about decision-making. So what is the synthesis of these factors: how should we make decisions?
Well, the short version is we have to slow down decision-making, to avoid the human impulse to let our monkey brains jump to conclusions and then justify those decisions with made-up stories. We need to follow structured, systematic and preconfigured patterns that counter our impulse to oversimplify and sidestep research and self-reflection. I will discuss that in more depth in "You Have To Go Slow To Go Fast," the third article in this series.
Another inescapable factor is the political dimension. Social groups — teams, departments, the C-suite — need to confer on challenges and opportunities, and then decide on solutions. Who gets to decide, and how? That topic is explored in "Who Makes the Decision?" which looks at how decision-making at work is becoming increasingly democratic.
About the Author
Stowe Boyd's calling is the ecology of work and the anthropology of the future. He also writes extensively about work technologies.