Scott Vejdani

Thinking in Bets: Making Smarter Decisions When You Don't Have All the Facts - by Annie Duke

Date read: 2019-12-25
How strongly I recommend it: 9/10
(See my list of 150+ books for more.)

Go to the Amazon page for details and reviews.

Fantastic book on how to make better decisions. Some key takeaways include: separating the quality of a decision from the quality of its outcome (one does not guarantee the other), estimating the probability that something will happen instead of settling for a simple yes or no, the premortem technique, and many more.


My Notes

Over time, those world-class poker players taught me to understand what a bet really is: a decision about an uncertain future. The implications of treating decisions as bets made it possible for me to find learning opportunities in uncertain environments. Treating decisions as bets, I discovered, helped me avoid common decision traps, learn from results in a more rational way, and keep emotions out of the process as much as possible.

Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck. Learning to recognize the difference between the two is what thinking in bets is all about.

Pete Carroll, head coach of the Seattle Seahawks, was a victim of our tendency to equate the quality of a decision with the quality of its outcome. Poker players have a word for this: “resulting.” When I started playing poker, more experienced players warned me about the dangers of resulting, cautioning me to resist the temptation to change my strategy just because a few hands didn’t turn out well in the short run.

Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable. When we say, “I should have known that would happen,” or, “I should have seen it coming,” we are succumbing to hindsight bias.

When we work backward from results to figure out why those things happened, we are susceptible to a variety of cognitive traps, like assuming causation when there is only a correlation, or cherry-picking data to confirm the narrative we prefer. We will pound a lot of square pegs into round holes to maintain the illusion of a tight relationship between our outcomes and our decisions.

Poker, in contrast, is a game of incomplete information. It is a game of decision-making under conditions of uncertainty over time. (Not coincidentally, that is close to the definition of game theory.) Valuable information remains hidden. There is also an element of luck in any outcome. You could make the best possible decision at every point and still lose the hand, because you don’t know what new cards will be dealt and revealed. Once the game is finished and you try to learn from the results, separating the quality of your decisions from the influence of luck is difficult.

The quality of our lives is the sum of decision quality plus luck.

What makes a decision great is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge. That state of knowledge, in turn, is some variation of “I’m not sure.”

When we think in advance about the chances of alternative outcomes and make a decision based on those chances, it doesn’t automatically make us wrong when things don’t work out. It just means that one event in a set of possible futures occurred.

Any prediction that is not 0% or 100% can’t be wrong solely because the most likely future doesn’t unfold. When the 24% result happened at the final table of the charity tournament, that didn’t reflect inaccuracy about the probabilities as determined before that single outcome.
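
A quick way to internalize this (my sketch, not from the book; the repetition count is arbitrary): if a 24% estimate is well calibrated, the event should occur about one time in four over many trials, so a single occurrence tells you almost nothing about whether the estimate was good.

```python
import random

random.seed(42)

# Repeatedly assign a 24% chance to some event. If the estimate is
# well calibrated, the event occurs about 24% of the time over many
# trials -- even though on any single trial the "unlikely" outcome
# can and will happen.
trials = 10_000
hits = sum(random.random() < 0.24 for _ in range(trials))

print(f"Event occurred in {hits / trials:.1%} of {trials} trials")
# One occurrence in one trial says nothing about whether 24% was accurate.
```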

In most of our decisions, we are not betting against another person. Rather, we are betting against all the future versions of ourselves that we are not choosing.

Our beliefs drive the bets we make. Part of the skill in life comes from learning to be a better belief calibrator, using experience and information to more objectively update our beliefs to more accurately represent the world. The more accurate our beliefs, the better the foundation of the bets we make.

Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information. We might think of ourselves as open-minded and capable of updating our beliefs based on new information, but the research conclusively shows otherwise. Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.

The smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalizing and framing the data to fit your argument or point of view. After all, people in the “spin room” in a political setting are generally pretty smart for a reason.

It turns out the better you are with numbers, the better you are at spinning those numbers to conform to and support your beliefs.

“Wanna bet?” triggers us to engage in that third step (vetting what we believe) that we only sometimes get to. Being asked if we are willing to bet money on it makes it much more likely that we will examine our information in a less biased way, be more honest with ourselves about how sure we are of our beliefs, and be more open to updating and calibrating our beliefs.

What if, in addition to expressing what we believe, we also rated our level of confidence about the accuracy of our belief on a scale of zero to ten? So instead of saying to ourselves, “Citizen Kane won the Oscar for best picture,” we would say, “I think Citizen Kane won the Oscar for best picture but I’m only a six on that.” Or “I’m 60% confident that Citizen Kane won the Oscar for best picture.”
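
One way to make this habit concrete (my sketch; the Brier score is a standard calibration measure, not something the book prescribes): log each belief with a stated probability, then score yourself once the truth is known.

```python
# Illustrative belief log: (claim, stated probability it's true, actually true?)
beliefs = [
    ("Citizen Kane won Best Picture", 0.60, False),  # it lost to How Green Was My Valley
    ("It will rain tomorrow", 0.30, True),
]

# Brier score: mean squared gap between stated probability and what happened.
# 0.0 is perfect; always saying 50% scores 0.25. Lower is better.
brier = sum((p - float(actual)) ** 2 for _, p, actual in beliefs) / len(beliefs)
print(f"Brier score: {brier:.3f}")
```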

Experience can be an effective teacher. But, clearly, only some students listen to their teachers. The people who learn from experience improve, advance, and (with a little bit of luck) become experts and leaders in their fields.

This initial fielding of outcomes, if done well, allows us to focus on experiences that have something to teach us (skill) and ignore those that don’t (luck). Get this right and, with experience, we get closer to whatever “-ER” we are striving for: better, smarter, healthier, happier, wealthier, and so on.

Blaming the bulk of our bad outcomes on luck means we miss opportunities to examine our decisions to see where we can do better. Taking credit for the good stuff means we will often reinforce decisions that shouldn’t be reinforced and miss opportunities to see where we could have done better.

A lot of the way we feel about ourselves comes from how we think we compare with others. This robust and pervasive habit of mind impedes learning.

Instead of feeling bad when we have to admit a mistake, what if the bad feeling came from the thought that we might be missing a learning opportunity just to avoid blame?

Keep the reward of feeling like we are doing well compared to our peers, but change the features by which we compare ourselves: be a better credit-giver than your peers, more willing than others to admit mistakes, more willing to explore possible reasons for an outcome with an open mind, even, and especially, if that might cast you in a bad light or shine a good light on someone else. In this way we can feel that we are doing well by comparison because we are doing something unusual and hard that most people don’t do. That makes us feel exceptional.

In explicitly recognizing that the way we field an outcome is a bet, we consider a greater number of alternative causes more seriously than we otherwise would have.

Because I agreed to the group’s rules of engagement, I had to learn to focus on the things I could control (my own decisions), let go of the things I couldn’t (luck), and work to be able to accurately tell the difference between the two.

When we think in bets, we run through a series of questions to examine the accuracy of our beliefs. For example: Why might my belief not be true? What other evidence might be out there bearing on my belief? Are there similar areas I can look toward to gauge whether similar beliefs to mine are true? What sources of information could I have missed or minimized on the way to reaching my belief? What are the reasons someone else could have a different belief, what’s their support, and why might they be right instead of me? What other perspectives are there as to why things turned out the way they did?

Be a data sharer. That’s what experts do. In fact, that’s one of the reasons experts become experts. They understand that sharing data is the best way to move toward accuracy because it draws the highest-fidelity insight out of your listeners.

Don’t disparage or ignore an idea just because you don’t like who or where it came from. Don’t shoot the messenger.

When I had the impulse to dismiss someone as a bad player, I made myself find something that they did well. It was an exercise I could do for myself, and I could get help from my group in analyzing the strategies I thought those players might be executing well.

Express uncertainty. Uncertainty not only improves truthseeking within groups but also invites everyone around us to share helpful information and dissenting opinions.

Lead with assent. For example, listen for the things you agree with, state those and be specific, and then follow with “and” instead of “but.”

Ask for a temporary agreement to engage in truthseeking. If someone is off-loading emotion to us, we can ask them if they are just looking to vent or if they are looking for advice.

Rather than rehashing what has already happened, try instead to engage about what the person might do so that things will turn out better going forward.

Improving decision quality is about increasing our chances of good outcomes, not guaranteeing them.

We can do this by imagining how future-us is likely to feel about the decision or by imagining how we might feel about the decision today if past-us had made it. The approaches are complementary; whether you choose to travel to the past or travel to the future depends solely on what approach you find most effective.

The 10-10-10 rule: “How would I feel today if I had made this decision ten minutes ago? Ten months ago? Ten years ago?”

For us to make better decisions, we need to perform reconnaissance on the future. If a decision is a bet on a particular future based on our beliefs, then before we place a bet we should consider in detail what those possible futures might look like. Any decision can result in a set of possible outcomes.

Figure out the possibilities, then take a stab at the probabilities. To start, we imagine the range of potential futures. This is also known as scenario planning.
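
As a concrete sketch (scenario names, probabilities, and payoffs are all made up for illustration), scenario planning reduces to enumerating the possible futures, assigning probabilities that sum to 1, and weighing the payoffs:

```python
# Hypothetical scenario plan for a product launch.
scenarios = {
    "strong adoption": (0.25, 500_000),   # (probability, payoff in $)
    "modest adoption": (0.50, 100_000),
    "flop":            (0.25, -200_000),
}

# All possible futures must be covered: probabilities sum to 1.
assert abs(sum(p for p, _ in scenarios.values()) - 1.0) < 1e-9

# Expected value weighs each future by its probability.
ev = sum(p * payoff for p, payoff in scenarios.values())
print(f"Expected value: ${ev:,.0f}")  # 0.25*500k + 0.50*100k + 0.25*(-200k) = $125,000
```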

Our decision-making improves when we can more vividly imagine the future, free of the distortions of the present. By working backward from the goal, we plan our decision tree in more depth, because we start at the end.

The most common form of working backward from our goal to map out the future is known as backcasting. In backcasting, we imagine we’ve already achieved a positive outcome, holding up a newspaper with the headline “We Achieved Our Goal!” Then we think about how we got there.

We start a premortem by imagining why we failed to reach our goal: our company hasn’t increased its market share; we didn’t lose weight; the jury verdict came back for the other side; we didn’t hit our sales target. Then we imagine why. All those reasons why we didn’t achieve our goal help us anticipate potential obstacles and improve our likelihood of succeeding.