Why Quitting is Underrated

And grit is not always a virtue

This piece appeared in The Atlantic in September. I love it, so I thought I would share it here.

Thinking in Bets is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.

Siobhan O’Keeffe, one of tens of thousands of runners in the 2019 London Marathon, noticed that her ankle started hurting four miles into the race. Despite the worsening pain, she continued running. Four miles later, her fibula snapped. Medics bandaged her leg and advised her to stop running, but she refused. She actually finished the marathon, running the last eighteen miles in nearly unbearable pain.

Her story sounds fantastical. Running eighteen miles on a broken leg stretches the limits of believability. Even her orthopedic surgeon remarked as much. But what might be more unbelievable is that this story is not uncommon. In fact, that same day, at the same distance into the race, another runner, Steven Quayle, broke his foot. He kept running, through pain so bad that, during the final ten miles, he had to make four or five stops for medical assistance. But he, too, finished the race.

There is no central registry of such incidents, but they happen a lot. A quick Google search turns up many other stories of distance runners around the world suffering horrifying injuries and finishing their races.

These stories are so hard for us to swallow because I think everyone shares the intuition that if you were going to break your leg at the beginning of a marathon, you wouldn’t start in the first place. Dealing with pain and discomfort is expected in distance running, but I think we all agree that if we snapped our fibula at mile eight, we would stop.

Likewise, if we received equally clear signs we were in a losing situation in a job, or a relationship, or an investment, we would quit and do something else.

On the other hand, I’m guessing that part of you has admiration for those distance runners sticking it out. We look at these types of stories and think, “I wish I had that kind of grit.”

While grit can be an admirable quality, it isn’t an unqualified virtue. Apart from the pain these runners were in, they were risking permanent physical damage, or at least making their condition sufficiently worse that it could cost them opportunities to run again.

This is the issue with grit. It can get you to stick to hard things that are worthwhile, but it can also get you to stick to hard things that are no longer worthwhile – like after your fibula snaps at mile eight. Throughout our lives, it turns out we are not very good listeners when the world tells us that we should stop.

There are a host of reasons why we ignore obvious signals to quit, not the least of which is that quitting has a nearly universal negative connotation. If someone called you a quitter, would you ever consider it a compliment?

We view grit and quitting as opposing forces and, in the battle between the two, grit has clearly won. “Quitters never win, and winners never quit.” “If at first you don’t succeed, try, try again.” Or even what we all learn as children by reading The Little Engine That Could: “I think I can. I think I can. I think I can.” The popular aphorisms boil down to grit as a virtue and quitting as a vice.

This is even reflected in the way the English language itself favors grit. If you’re gritty, you’re steadfast, determined, resolute, or unwavering. You might even be considered heroic or courageous. Meanwhile, quitting means failing, capitulating, or giving up. Quitters are losers. Quitting shows a lack of character.

It’s high time we rehabilitated quitting. Quittiness is at least as important a decision skill to develop as grittiness, because quitting gives us the ability to act as new information reveals itself. The world changes. Our state of knowledge changes. We change.

Imagine if the first thing you ever did, you had to stick to. It would be almost impossible to start anything because you wouldn’t have the option to stop it. It’s the option to quit later that allows you to make the decision to go on a first date, even though you don’t know how it’s going to turn out. It’s the option to quit that allows you to take a job even though you don’t know if it’ll be the right fit. It’s quitting that allows you to start a marathon, even though you might injure yourself in the middle.

We are able to make up our minds amid constant uncertainty because, once we learn any of these things aren’t right for us, we have the option to quit. That’s why, if I had to skill somebody up to get them to be a better decision-maker, quitting is the primary skill I would choose, because the ability to walk away is what allows you to react to the changing landscape.

The problem is that, too often, we persist too long. In fact, when we get bad news, we frequently do the opposite of what our intuition tells us we would do. Instead of quitting, we double down, escalating our commitment, further entrapping ourselves in a losing cause. We see it everywhere: people staying too long in bad relationships, bad jobs, and bad careers; businesses continuing development and support of products that are clearly failing or long after conditions have changed; a nation stuck for decades in an unwinnable war.


Nearly half a century of science has identified a host of cognitive forces that contribute to our inability to quit after getting bad news. The most well-known is the sunk cost fallacy, first identified as a general phenomenon by Nobel Laureate Richard Thaler in 1980. It’s a systematic cognitive error in which people take into account money, time, effort, or any other resources they have previously sunk into an endeavor when making decisions about whether to continue and spend more. In other words, we throw good money after bad.

The fear that you’ll have wasted what you’ve already put into something if you don’t continue causes you to invest more in a cause that’s no longer worthwhile. That’s widespread behavior, whether it’s part of the continued commitment to a public works project that has gone off the rails, or why people won’t quit jobs they no longer like or leave relationships that have turned toxic.

Another commonly known error that keeps people from quitting is status quo bias, introduced in 1988 by economists Richard Zeckhauser and William Samuelson. In comparing options, individuals and enterprises overwhelmingly stick with the one representing the status quo, even when it is demonstrably inferior to the option representing change.

There are also cognitive and motivational issues surrounding our need to have our beliefs (and actions corresponding to them) validated as correct and consistent, and to be seen in that way by others. We don’t like it when our beliefs clash with new information that suggests that a belief that we hold is inaccurate. The uncomfortable feeling that we get is known as cognitive dissonance, the concept pioneered over sixty years ago by Leon Festinger. We can resolve that dissonance by changing our minds or rationalizing away the new information. Too often, as established in hundreds of studies, we choose the latter.

Our beliefs are really hard to quit.


In 2013, economist Steven Levitt, coauthor of the bestseller Freakonomics, put up a website inviting people who visited to flip a virtual coin to help them make a close decision about whether to quit or stick. You might be skeptical that people would go to a website to flip a coin to help them make a major decision. But 20,000 people over the course of a year actually did this, including nearly 6,000 considering a life-changing decision like whether to quit their job, or go back to school, or retire, or end a relationship.

Obviously, these people must have felt that the choice of whether to quit or to persevere was so close, so 50-50, that flipping a coin to help them decide seemed like a reasonable option. It stands to reason that if these decisions were, in reality, as close as the coin flippers felt they were, they would be equally likely to be happier if the coin landed heads or if it landed tails, whether they ended up sticking or quitting.

But this isn’t what Levitt found. When he followed up with the coin flippers two and six months later, he discovered that the people who quit were happier, on average, than those who stuck. While the decisions may have felt close to the people making them, they were not actually close at all. As judged by the participants’ happiness, quitting was the clear winner. That meant that they were getting to the decision too late, long after it was actually a close call. Given all the cognitive biases that work to prevent us from quitting on time, this shouldn’t be surprising.

That being said, life is murky, and uncertain environments exacerbate these types of errors. Maybe when we’re in an environment where the motivation to get the decision right is clear, along with data objectively telling us when it’s the right time to quit, we would do it. But we don’t. Even in environments rich in motivation and information, decision-makers tend to persist too long.


The NBA and other professional sports leagues offer a unique environment in which to study quitting behavior. Decision-makers in pro sports get a lot of continuous, quick, clear feedback on player productivity. Pro basketball is a data-rich environment, with many objective measures of player performance, constantly being updated. The coach and team management are highly motivated to use the best players in the right situations and, obviously, to win.

In 1995, social psychologist Barry Staw and colleague Ha Hoang looked at escalation of commitment and quitting following one of the common, high-stakes decisions made by expert sports executives: the NBA draft. Did a basketball player’s draft order affect later decisions – independent of their on-court performance – on their playing time, likelihood of being traded, and career length?

They analyzed players and their draft order in the 1980–1986 NBA drafts, nine measures of player performance, minutes of playing time for five seasons, career length, and whether a player was traded.

It turned out that draft order did have an independent effect on future playing time and roster decisions. “Results showed that teams granted more playing time to their most highly drafted players and retained them longer, even after controlling for players’ on-court performance, injuries, trade status, and position played.”

This is where you can clearly see the effect of cognitive errors like the sunk cost fallacy. Spending a high draft pick to acquire a player burns a valuable, limited resource. Benching or trading or releasing such a player, despite performance data justifying it, feels tantamount to wasting that resource, so those players get a lot more chances than players drafted lower who are playing as well or better.

These findings can’t be dismissed as a relic of the pre-Moneyball era. Economist Quinn Keefer has conducted several field studies since the mid-2010s on the effects of draft order and player compensation on playing time in the NFL and the NBA. Although the effect sizes were diminished, they were still significant, replicating the original findings from the 1980s and 1990s.

This raises the question: If pro sports teams, with their armies of analysts and constant pressure to win, are making this error, what’s happening in our everyday lives? What relationships are we staying in too long? What employees are we holding onto that we should be letting go? What jobs are we staying in that we should be walking away from? What beliefs do we cling to that we should have updated long ago? What marathons are we continuing to run with a broken leg?


What makes quitting so hard is that, if we quit, we fear that we will have failed and wasted our time, effort, or money. When we worry that quitting means failing, what exactly are we failing at? If we quit something that’s no longer worth pursuing, that’s not a failure. That’s a success. We need to start thinking about waste as a forward-looking problem, not a backward-looking one. That means realizing that spending another minute or another dollar or another bit of effort on something that is no longer worthwhile is the real waste.

We need to redefine failure. We need to redefine waste. But ultimately, we need to rehabilitate the very idea of quitting.

Contrary to popular belief, winners quit a lot. That’s how they win.
