
“Don’t Have a Crystal Ball?” — Annie’s Newsletter, January 18, 2019

DON’T HAVE A CRYSTAL BALL? - Base rates are the next best thing.
“WANNA BET?” ON THE STOCK MARKET - An illustration of thinking in bets when you have little knowledge or expertise.
THE PARADOX OF UNCERTAINTY - Counterintuitively, being more uncertain leads to greater accuracy.
CIVIL DISAGREEMENT? ON THE INTERNET? - Reddit’s Change My View forum.

DON’T HAVE A CRYSTAL BALL?

Base rates are the next best thing

Walter Frick offers “3 Ways to Improve Your Decision Making” in a piece from last January in HBR. These are great concepts, succinctly explained and illustrated by Frick.

    • Rule #1: Be less certain. Obviously, a great first principle. He presents the problem of overconfidence and a solution: “practice aligning your level of confidence to the chance that you’re correct.” I would recommend this slight, but significant, difference in frame: align your confidence to the chance that you’re incorrect. (One rough way to practice this is sketched just after this list.)
    • Rule #2: Ask “How often does that typically happen?” Asking this question drives us to consider base rates in our decision making. It moves us from the inside view to the outside view, disengaging ourselves from overweighting our personal experience and opinions to arrive at a more objective perspective.
    • Rule #3: Think probabilistically – and learn some basic probability. Benefitting from uncertainty and open-mindedness takes practice, and becoming comfortable with probabilistic thinking helps us get there.
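Here’s a minimal sketch of one way to practice Rule #1 (my own illustration with made-up numbers, not something from Frick’s piece): log each prediction along with your stated confidence, then periodically check how often you were actually right at each confidence level.

```python
# A rough calibration check (hypothetical data): group predictions by the
# confidence you stated when you made them, then compare that confidence
# with how often those predictions actually came true.
from collections import defaultdict

# Each entry: (stated confidence, did the prediction come true?)
predictions = [
    (0.9, True), (0.9, True), (0.9, False),
    (0.7, True), (0.7, False), (0.7, True),
    (0.6, False), (0.6, True),
]

by_confidence = defaultdict(list)
for confidence, was_correct in predictions:
    by_confidence[confidence].append(was_correct)

for confidence in sorted(by_confidence):
    outcomes = by_confidence[confidence]
    hit_rate = sum(outcomes) / len(outcomes)
    # Well-calibrated means hit_rate lands close to the stated confidence.
    print(f"Said {confidence:.0%}, right {hit_rate:.0%} of the time (n={len(outcomes)})")
```

If you keep saying 90% and keep being right only 60% of the time, that gap is the overconfidence Frick is talking about.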

What a great approach to decision making!

I’ve written so much about Rules #1 and #3 that the piece gave me a chance to focus a little on the concepts in Rule #2.

When we’re making decisions in a world with so much information that’s hidden or unknown, asking “How often does that typically happen?” can be the best starting point in making a prediction. (Or the best ending point, if you are severely limited in how much you can find out.)

Obviously, it would be great if we were all psychic and could know how the future would turn out. But we aren’t.

Knowing the base rate is the next best thing to psychic ability. How often something has occurred in the past is usually a good indicator of how often it will occur in the future.

Base rates are like having a crystal ball.

Seeking out the base rate in our decision process reminds us not to put so much weight on our own experience and opinions.

Always asking, “how often does this typically happen?” helps us view the decision from the outside in, rather than the inside out.

Bob Seawright wrote a terrific piece on the subject of base rates last June, “Proof Negative.” I wrote about this piece in the July 16 newsletter, and tweeted, “I want to quote every paragraph.”

He starts with an illustration of the disparity between the failure rate of relationships and people’s opinions on the probability that their own relationship could fail.

“We may recognize that divorce is commonplace. But whatever version of the statistical landscape brides and grooms might dutifully be able to recite, we simply don’t think those probabilities are personally applicable. For example, recent research found that study participants thought the average member of the opposite sex has about a 40 percent chance of cheating on his or her partner. But those same participants said their own partner had only a de minimus chance of cheating. When you fall in love, all bets are off. When I fall in love, it will be forever.”

If we were naturally good at taking the outside view, we’d recognize that the probability of something happening to everybody else in the world applies to us as well.

A lot more couples would have pre-nups then, right?

It’s a good example of how we overvalue our personal beliefs to the point of ignoring how often something typically occurs.

Knowing how often something has happened in the past is usually the best indicator of how often it will happen in the future.

If a mysterious stranger offers to sell you a crystal ball, it’s caveat emptor.

Finding out the base rate, however, is a cheap, reliable alternative.

This leads to my reaction to a prediction I recently read ….


“WANNA BET?” ON THE STOCK MARKET

An illustration of thinking in bets when you have little knowledge or expertise

At the end of December, I saw a CNBC article (written earlier in the month) reporting on a Citi analyst’s prediction that there was “a 90% probability of [stock market] gains in the next 12 months”.

So the analyst is saying the odds are 9-to-1 against the market going down.

I tweeted that I’d take that bet:

When I posted that, I imagine some people thought I was claiming some expertise on the stock market, that I actually had an opinion on whether the market would go up or down this year.

But that is not at all why I was willing to take the bet.

I actually don’t have especially well-informed beliefs on issues like the direction of the economy or the international trade situation.

But what I do have is an understanding of the importance of base rates.

While reading the CNBC article, I happened to be sitting with someone who does know quite a bit about the stock market.

But I didn’t ask him his opinion on the article or the prediction. I didn’t ask about trade or earnings or interest rates.

Instead, I asked just one question:

“Historically, how often is the stock market up year-over-year?”

I wanted to assess the soundness of the forecast in the simplest way I could think of: by asking how far off the base rate the prediction was.

He said, “I believe it’s around 70%.”

That was enough for me to be willing to take the bet at 9-to-1, because the prediction was so far off the base rate.
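To make the arithmetic concrete, here’s a minimal sketch (my own numbers, assuming the roughly 70% base rate I was quoted and a simple one-unit stake at 9-to-1):

```python
# Expected value of taking 9-to-1 against the market being up, assuming the
# historical base rate of an up year is about 70% (a hypothetical, simplified bet).
p_up = 0.70          # rough historical base rate of a year-over-year gain
p_down = 1 - p_up    # ~30% chance of a down year, per the base rate

win_amount = 9       # collected if the market is down (9-to-1 odds)
loss_amount = 1      # risked if the market is up

expected_value = p_down * win_amount - p_up * loss_amount
print(f"Expected value per unit risked: {expected_value:+.2f}")  # ~ +2.00
```

At those odds the bet only loses money if the true chance of a down year is below 10%, which is exactly what the analyst’s 90% forecast implies, and which is a long way from the roughly 30% the base rate suggests.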

One commenter on the tweet, Doug Buchan, nicely got the point:


THE PARADOX OF UNCERTAINTY

Counterintuitively, being more uncertain leads to greater accuracy

Jay Van Bavel recently brought to my attention a counterintuitive result from a 2008 study by Katherine Phillips and colleagues, “Is the Pain Worth the Gain? The Advantages and Liabilities of Agreeing with Socially Distinct Newcomers.”

The study found that bringing uncertainty into a group increased the group’s ability to correctly identify the perpetrator in a “Murder Mystery” scenario.

Diverse groups were more likely to be right, and they expressed greater uncertainty in their answers. 

Van Bavel brought up Phillips because Steven Johnson had cited her work in his recent book, Farsighted: How We Make the Decisions That Matter the Most. Johnson nicely summarizes the findings:

“They were both more likely to be right and, at the same time, more open to the idea that they might be wrong. That might seem to be a paradox, but there turns out to be a strong correlation between astute decision-making and a willingness to recognize—and even embrace—uncertainty. Phillips’s findings echo the famous Dunning-Kruger effect from cognitive psychology, in which low-ability individuals have a tendency to overestimate their skills. Sometimes the easiest way to be wrong is to be certain you are right.”

Counterintuitively, uncertainty actually increases accuracy.

When we’re aware of our own uncertainty, we’re looking for why we’re wrong. We’re more likely to consider other opinions. We’re less likely to be overconfident in whatever answer we find.

When we’re more uncertain and less confident, we’re more likely to be open-minded, leading us to take the outside view.

Daniel Kahneman tells a great story of an “embarrassing episode” in textbook publishing which nicely illustrates why Phillips and colleagues might have gotten their result.

The story is in chapter 23 of Thinking, Fast and Slow, an excerpt of which appeared on McKinsey.com as “Beware the ‘inside view.’” It’s such a good example that Walter Frick also summarized it as part of Rule #2 in his HBR piece.

After spending a year in meetings on the outline of a textbook and accompanying curriculum with several colleagues, Kahneman asked the group to write down estimates of how long it would take to submit a finished draft. (This method was actually one of the points of the curriculum on how to elicit information.)

Estimates varied between 1-1/2 and 2-1/2 years.

Kahneman then asked Seymour Fox, a collaborator who knew how such projects typically went, how long the typical textbook takes to finish.

In other words, Kahneman asked Fox for the base rate.

“He fell silent. When he finally spoke, it seemed to me that he was blushing, embarrassed by his own answer: ‘You know, I never realized this before, but in fact not all the teams at a stage comparable to ours ever did complete their task. A substantial fraction of the teams ended up failing to finish the job.'”

Forty percent of such projects were never completed, and the remaining 60% took 7-10 years to complete.

In the end, the project took eight years to complete, and Kahneman had long since ceased to be part of it. The Israeli ministry that had encouraged the textbook and curriculum development ended up never using it.
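Put the inside and outside views side by side and the gap is stark. Here’s a rough sketch using the numbers from Kahneman’s account (the comparison is mine, not his):

```python
# Inside view vs. outside view for the textbook project, using the figures above.
inside_view_years = (1.5, 2.5)   # the team's own estimates

p_never_finished = 0.40          # base rate: fraction of similar teams that never finished
finisher_years = (7, 10)         # completion time among the teams that did finish

# Even ignoring the 40% failure rate, the optimistic end of the reference class
# is nearly triple the team's most pessimistic estimate.
print(finisher_years[0] / inside_view_years[1])  # ~2.8
```

The actual outcome, eight years, sits squarely inside the reference class and nowhere near the team’s estimates.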

Their unrealistic optimism demonstrates the importance of seeking out base rates as part of your decision process.

And if Daniel Kahneman can get trapped in the inside view, it’s no wonder newly married couples don’t think they’ll ever get divorced.

There’s a nice summary about the “inside view vs. outside view” on PersuasionReadingList.com, in a piece with that exact title.

We know that our own experience is valuable, but it also has the potential to distort the overall picture. The outside view provides a check on overvaluing the inside view:

“The Outside View is how we view other peoples’ situations. From the outside, these are common story lines. We can expand the frame of reference and pick up on patterns. We focus on the common threads and common outcomes. We remove emotion. We make better informed decisions.”

That’s why acknowledging uncertainty makes us more open-minded.

We’re open to the idea that we can be wrong.

We’re open to the idea that our initial instinct or our personal experience may not provide a definitive answer.

We’re open to diverse viewpoints (including base rates, a potential shortcut to “What’s the rest of the world’s experience?”).

In the Phillips study that captured the attention of Steven Johnson, then Jay Van Bavel, and then me, introducing social diversity into the group reminds members that other people might look at things differently.

Even if socially diverse members of the group don’t bring differing beliefs, having a group composed of people with different backgrounds and experiences makes us less certain and more open.

Uncertainty is the secret sauce.

Once we embrace it, we become more open to why we might be wrong.

And that’s more likely to get us to the truth.


CIVIL DISAGREEMENT? ON THE INTERNET?

What’s the catch?
Reddit’s Change My View forum

Truthseeking definitely works better as a group endeavor. After all, how can you really tell if you’re being open-minded if you’re not exposing yourself and your beliefs to other points of view?

There’s at least one place online where you can invite a large audience “to explain why they disagree with you in pursuit of greater understanding.”

It’s the subreddit, Change My View (CMV).

Founded by Kal Turnbull in 2013, it has over 680,000 subscribers. (H/t Barry Ritholtz for pointing me to CMV, its information website, and a recent Atlantic article by Kiley Bense, “Civil Discourse Exists in This Small Corner of the Internet.”)

This is an in-progress experiment: a community coming together to practice organized skepticism, forming a decision pod that approaches the world from the perspective of “why am I wrong?” and is grateful when a view gets changed, calibrated, or made more accurate.

The irony is not lost on me that this was, essentially, the promise of the Internet Age: unprecedented access to information and ability to communicate with the whole, diverse world. 680k on CMV isn’t nothing, but Twitter has over 300 million monthly active users and Facebook has over 2 billion.

But, as we know now, the internet amplifies our natural affinity for confirmatory thought. We fall into echo chambers, partly because it is easier to find those who agree with even the most extreme and outlandish opinions.

We can shut out dissenting voices or, when confronted with those who disagree, band together to pile on and shout down the offending opinion.

But unlike how discussions on controversial topics frequently devolve on social media, people on CMV aren’t rude or hostile, and discussions are constructive.

CMV has five rules for submissions and five for comments. They are all short principles, like Comment Rule 2:

“Rule 2 – Rude/Hostile Comment – Don’t be rude or hostile to other users. Your comment will be removed even if the rest of it is solid. ‘They started it’ is not an excuse. You should report it, not respond to it.”

CMV has 22 active moderators enforcing the rules, and subscribers follow them.

Bense summarized the environment in her article:

“While most users’ opinions aren’t turned 180 degrees, shifts in thinking and perspective regularly occur …. Change My View is the opposite of an echo chamber, where users reinforce the ideas that the group already holds and police anyone who tries to dissent. Instead, dissent is the point.”

Turnbull, quoted in the article, explained how CMV shifts the “debating on the Internet” game from being right (affirming what we already believe) to being accurate (learning the truth of the matter): “People feel that changing their view is somehow losing … that it’s this embarrassing thing. We are trying to change that perspective.”

One of the moderators, Brett Johnson, also recognized the forum’s potential to encourage organized skepticism: “Most places on the internet, most places in the world, they reward you for being right. But this was a community that celebrates being wrong.”

The end of the Atlantic article suggests that subscribers may be gaming the forum. Amy Bruckman, part of one of several groups conducting academic research on CMV’s rules and reward system, said, “It’s not clear to me that many people on Change My View really change their views.”

This is a challenge I’ve thought about when it comes to organizing decision pods: How do you make sure the debate system doesn’t get stale or become a rhetorical game, more about scoring points than actually changing or calibrating beliefs?

I’d be interested in hearing people’s thoughts on how to keep the “game” aspect from becoming more important than the “organized skepticism” aspect.

It’s worth noting that Bruckman added that, despite her reservations on minds actually changing, “our data suggests that everyone walks away with a broadened perspective, and that’s absolutely of value.”

That makes sense. After all, John Stuart Mill provides several arguments for the value of exchanging opinions, even when there is zero chance a particular view is correct.

Even if you’re sure that the earth is round, you don’t really KNOW it’s round unless you can defend that belief against someone presenting the strongest flat-earth arguments.

This forum offers a way to know your own truth better in that sense.

(I’m stating the obvious, but read Chapter 2 of On Liberty, or the student-accessible All Minus One, edited by Richard Reeves and Jon Haidt, on “Mill’s second argument: ‘He who knows only his side of the case ….'”)


THIS WEEK’S VISUAL ILLUSION

Another perspective on the “rotating disks” illusion

From Akiyoshi Kitaoka: