Avoiding the Expert Trap

  • Jonah Sachs
  • Founder, Free Range

The new Trump administration presents us with endless questions about what our future will look like. In the rush to get our bearings, however, we shouldn’t lose sight of one remaining puzzle about the past: Why couldn’t the experts see this coming?

That the experts were wrong about the primaries, the general election and even Trump’s transition from candidate to president has been pointed out again and again. But we shouldn’t be surprised. This type of blindness to sudden change is common, and it constantly undermines the ability of otherwise very good thinkers to make good decisions.

A large body of evidence shows that experts not only have a propensity for being wrong but, once they make a wrong prediction, tend to stick with it despite important counter-evidence. From 2016’s demise of the political expert, the rest of us can draw some important lessons.

There’s probably no one better to learn from than the world-renowned “expert on experts,” UC Berkeley psychologist Philip Tetlock. Tetlock began his work in 1983 by choosing 284 people who make their living “commenting or offering advice on political and economic trends.” He asked them, year after year, to predict the likelihood of various large changes happening in the near future — revolutions, wars, political movements. By the time he wrapped up his study in 2003 and the predictions had proven to be right or wrong, he had collected 82,361 forecasts. Along the way he amassed a lot of data about how the experts made their predictions and how confident they were in them.

So how did Tetlock’s experts do? Terribly. If he had simply blindfolded these people and had them throw darts at various predictions, they would have done better. In the aggregate, all their studying, reasoning and analysis not only didn’t help, it actually decreased the likelihood that they would be right.

But not all of Tetlock’s experts performed equally. For example, take the “super experts,” those who were frequently quoted, highly paid and well-known to the public. They stood apart from the crowd, but only by being far worse in their predictions. Tetlock determined that accuracy is inversely related to an expert’s renown, self-confidence and, beyond a threshold, depth of knowledge. Think about that before you make your next prediction about the future of your business.

The more confident and informed you think you are, the more likely you are to be wrong.

And once you make a wrong prediction, you’re likely in for the long haul, as your brain is programmed to stick to that path, ignoring the evidence that you may be off-base. In fact, neuroscientists have shown that when we find evidence to confirm an already held belief, we get a hit of dopamine, the same chemical that is released when we drink alcohol, have sex or fall in love. This is why confirmation bias is one of the strongest and most dangerous quirks of human psychology. And it’s why so many continued to see Trump’s candidacy as a fluke, destined to flame out, even when the evidence was piling up to the contrary.

The experts who dismissed Trump were relying on what researcher Gary Klein calls “expert intuition.” The better we get at our jobs, Klein explains, the faster we can make good decisions in familiar situations. Like a chess master, we glance at a board and can, without conscious thought, dismiss thousands of possible moves. Expert intuition makes us enormously efficient but it’s notoriously bad at recognizing that the game has changed. As we enter unfamiliar territory, expert intuition tries to fit our situation into old boxes so that we can get back to making good snap judgements. But here, Klein says, we need to intentionally turn off our expert intuition. “You must disconnect old dots to let new ones connect on their own.”

Tetlock found that there were indeed some experts who could do this. He called them “foxes” (as opposed to the hedgehogs, who put their heads down and stuck with a single line of thinking). Foxes tended to be more humble, more modest about their ability to predict the future and more curious about following up when they had been wrong. In other words, there was a lot less ego involvement in their thinking. This may explain why super experts so often stink. Money, prestige, a reputation to defend — they all cause our egos to invest in a point of view. Leaders of companies face the same problem. They’ve gained their positions, paychecks and authority from being right. It’s harder to be humble and flexible from such a perch.

After studying expertise and intuition for decades, Klein offers this advice for confident experts: “What you need instead is the Zen discipline of beginner’s mind in every situation.… It does not mean that you fail to master your craft. It is a step beyond mastery, when you clear your mind of all you know the moment you step onto the field of battle.” Randy Komisar of the Silicon Valley VC firm Kleiner Perkins Caufield & Byers offers the same advice about picking companies: “[It’s] about stripping away one’s biases, prejudices, blindness. It is about realizing the essence of things.”

So how do we avoid these biases and traps? How can we benefit from our own expertise without being fooled by it? After reading an awful lot of the science on this, I recommend three important steps:

  1. Take your time taking a position: A common strategy for leaders is to quickly analyze a situation and then take a provisional position on it. Our teams expect that of us, and decisiveness is often seen as a sign of competence. The idea of a provisional opinion, of course, is that one’s position can and will be updated as evidence comes in. But this rarely happens. The minute we ego-identify with A or B, our thinking becomes less flexible, confirmation bias kicks in and we engage in motivated reasoning even when we think we are seeking out contradictory evidence. Expertise can be a powerful tool to analyze and observe all the evidence available to us, making judgements on each fact we take in. But avoid having these judgements add up to a prediction or position until you are absolutely required to have one. And when you do take one, make sure you call it provisional and work hard to poke holes in it (knowing you can do so only to a point).
  2. Track your logic: Researchers from Cornell recently studied a large and unique Reddit community called “Change My View.” In this forum, people share a view they hold and then challenge the community to change it. The discussion is lively and intense but almost always civil and productive. One of the most interesting findings in the research was that individuals who carefully explained the logic behind their starting view were much more likely to update it based on good evidence. This is likely because breaking a belief down into its component parts means we need not greet new evidence with an all-or-nothing mindset. Had Trump detractors clearly mapped out why they thought he couldn’t win, they would likely have seen one tenet of their belief after another break down and could more easily have changed their opinion. Instead, they held onto a broad dogma that he simply wasn’t a viable candidate, and in that they were unmovable. When one reason for that belief broke down, they’d simply abandon it and find another to replace it, all without much conscious thought. We can all be better experts by recording the components that support a particular belief and then being open to seeing those tenets disproven; the sketch after this list shows one simple way to keep that kind of ledger. The brain seems to handle confirmation bias better when a belief is taken apart piece by piece.
  3. Breathe: If confirmation bias makes us seek out what we already know, sunk cost bias makes us feel a sense of loss for changing course. It’s another key reason that experts are so slow to update wrongheaded positions. “I’ve already put so much into this,” the thinking goes. “If I change my belief, all that I’ve invested will be lost.” So we hold on until the bitter end. Sunk cost bias is universal, but a 2014 study offers hope. Researchers had participants meditate for 15 minutes, just bringing mindfulness to their breathing. Then they put these participants into a situation known to stimulate sunk cost bias. The 15-minute meditators were an amazing 77% less likely to fall prey to the bias than their non-meditating counterparts. When considering an important viewpoint, we seem to be enormously advantaged if we first calm our emotions around it, center ourselves and simply remember to breathe. Though it sounds like magic, it’s strongly supported by science.
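
To make the second step concrete, here’s a minimal sketch of what tracking your logic might look like in practice. It’s written in Python purely for illustration, and the tenets and confidence numbers below are entirely hypothetical: the point is that a position recorded as separate, weighted tenets can weaken one piece at a time, rather than standing or falling as a single dogma.

```python
# A minimal, purely illustrative sketch of "tracking your logic":
# record a position as named tenets, each with a confidence, so new
# evidence can weaken one piece at a time instead of confronting an
# all-or-nothing belief. The tenets and numbers are hypothetical.

position = {
    "He can't win the nomination without party support": 0.8,
    "His poll numbers will collapse after the first gaffe": 0.7,
    "High-turnout elections favor the opponent": 0.6,
}

def overall_confidence(tenets: dict[str, float]) -> float:
    """Confidence in the position if it depends on every tenet holding."""
    p = 1.0
    for confidence in tenets.values():
        p *= confidence
    return p

print(f"Starting confidence: {overall_confidence(position):.0%}")   # 34%

# Evidence arrives against one tenet: update that piece, not the dogma.
position["His poll numbers will collapse after the first gaffe"] = 0.2

print(f"After one tenet fails: {overall_confidence(position):.0%}")  # 10%
```

The arithmetic here is crude (it treats the position as requiring every tenet to hold), but the bookkeeping is the point: once a belief is written down in parts, you can watch an individual tenet fail instead of silently swapping in a new justification.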

Taking these three simple actions can make us all more effective experts, somewhat counterintuitively, by resisting our own expertise. Political pundits certainly could have used such thinking over the last twelve months.

I’ll admit, it’s hard to blame the experts for being wrong about Trump. I was too. He is the product of rapidly changing and hard-to-predict times. But the real folly was to stay wrong about him for so long. Each time, they estimated his power to be smaller than it actually was.

If we want to save our country, we need to do more than question Trump’s legitimacy. We need to question ourselves. We all do in times of change.

Want tips on avoiding the expert trap? Download exercises for your team:

Help Your Team Avoid the Expert Trap