The Gambler Who Broke Mathematics
In 1956, a physicist at Bell Labs named John Larry Kelly Jr. published a paper with one of the most boring titles in the history of mathematics: "A New Interpretation of Information Rate." [1] It was eight pages long. It contained no stories about bullet-riddled airplanes, no witty anecdotes about baseball, no digressions about the nature of Swedish governance. It was, by the standards of popular science writing, a catastrophe.
It also contained the most important formula that most people will never learn.
Kelly had been thinking about a problem that sounds trivial: if you have an edge — some advantage, some reason to believe you'll win more often than you'll lose — how much of your money should you bet? Not whether to bet. Not what to bet on. How much.
This question sounds like it has an obvious answer: as much as possible. If you have an edge, exploit it. Bet the farm. Go big or go home.
And that answer is completely, provably, mathematically wrong.
Imagine a coin that lands heads 60% of the time. You know this. The house doesn't. You start with $1,000, and you can bet any amount on each flip. Heads, you win your bet. Tails, you lose it.
You have a 20% edge. This is enormous — bigger than any casino has, bigger than almost any edge in financial markets. The expected value of every dollar you bet is $1.20.
So what do you do?
The expected-value maximizer — the person who read Ellenberg's chapter on the lottery and decided to maximize EV in all things — bets everything on every flip. $1,000 on the first flip. Why not? The EV is $1,200. Then $1,200 on the second flip. Then $1,440. Exponential growth, baby.
Except here's what actually happens: on any given flip, there's a 40% chance you lose everything. Not 40% of your money. Everything. Your entire bankroll. In the long run, the probability that an all-in bettor goes broke doesn't approach some small number — it approaches 1. It's not a risk. It's a certainty.
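The arithmetic behind that certainty is one line: to survive n all-in flips, you must win all n of them. A quick check in Python:

```python
# Probability an all-in bettor is still solvent after n flips of a 60% coin:
# every single flip must come up heads.
def survival_probability(n_flips: int, p_heads: float = 0.6) -> float:
    return p_heads ** n_flips

print(survival_probability(10))   # ≈ 0.006: fewer than 1 in 100 survive ten flips
print(survival_probability(100))  # ≈ 6.5e-23: ruin is, for all practical purposes, certain
```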
This is the gap that Ellenberg dances around but never quite closes. How Not to Be Wrong spends beautiful chapters explaining why expected value matters — why the lottery is a tax on people who don't understand it, why survivorship bias misleads us, why regression to the mean is your friend. All true. All important. But it leaves a gaping hole right in the center of the argument: expected value tells you what to do, but it can't tell you how much to do it.
Kelly filled that hole.
The Formula That Connects Gambling to the Universe
Kelly's insight began with a connection that feels, at first, like it shouldn't exist. He was working in information theory — Claude Shannon's domain, the mathematics of communication and signal processing. Shannon had shown that every communication channel has a maximum rate at which you can transmit information. Kelly realized that a gambler with inside information has the same structure: a "channel" (the tip, the edge, the signal) with a certain reliability, and the question is how to transmit that information advantage into wealth at the maximum possible rate.
The formula he derived is almost obscenely simple:

f* = (bp − q) / b

where:

- f* is the optimal fraction of your bankroll to wager
- p is the probability you win
- q is the probability you lose (which is just 1 − p)
- b is the net odds you receive on the bet (profit per dollar wagered)

For our 60/40 coin flip with even odds (b = 1):

f* = (1 × 0.6 − 0.4) / 1 = 0.20
That number — 20% — is doing something profound. It's not maximizing your expected value on any single bet. (Betting 100% does that.) It's maximizing the expected growth rate of your wealth over time. And those are not the same thing. They're not even close.
Expected value is additive: the EV of 100 bets is 100 times the EV of one bet. But wealth is multiplicative: your bankroll after 100 bets is the product of 100 growth factors. In multiplicative systems, volatility destroys wealth even when the expected value is positive. Kelly tells you exactly how much volatility to accept.
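The gap is easy to see in a few lines of Python. This sketch uses the 60/40 coin and a deliberately oversized 50% bet (my choice of fraction, for illustration):

```python
p, q, f = 0.6, 0.4, 0.5  # 60/40 coin, betting half the bankroll each flip

# Additive view: the expected value of each flip is positive...
arithmetic = p * (1 + f) + q * (1 - f)   # 1.10: +10% expected value per flip

# ...but wealth compounds, so the geometric mean is what survives:
geometric = (1 + f) ** p * (1 - f) ** q  # ≈ 0.967: wealth shrinks ~3.3% per flip
print(arithmetic, round(geometric, 3))
```

A bet with a 10% expected gain per flip that nonetheless grinds your bankroll toward zero: that is the multiplicative trap in miniature.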
This is, I think, the single most underappreciated mathematical insight in the world. We live in a multiplicative universe. Your career compounds. Your health compounds. Your relationships compound. But we think in additive terms — "what's the expected outcome of this one decision?" — and it leads us catastrophically astray.
Try It Yourself
Abstraction is seductive. Let's get concrete. Simulate the 60/40 coin with four betting strategies: full Kelly (20% per flip), half Kelly (10%), double Kelly (40%), and all-in (100%).

Run the simulation a few times. Notice the patterns:
- Kelly almost always ends up on top over 500 bets. Not because it wins the most on any single bet, but because it never blows up.
- Half Kelly is smoother, more conservative. You sacrifice some growth for a much less terrifying ride. Most professional gamblers and investors actually use this.
- Double Kelly looks great sometimes and catastrophic other times. It's the mathematical equivalent of driving 140 mph — thrilling until it isn't.
- All-In dies. Every time. Eventually. The 60% coin can't save you from the law of large numbers working in reverse.
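The patterns above can be reproduced with a minimal Monte Carlo sketch in Python (the trial count and seed are arbitrary choices, not from the original):

```python
import random

random.seed(42)

def simulate(fraction: float, n_bets: int = 500, trials: int = 1000) -> float:
    """Median final bankroll (starting at $1,000) when betting a fixed
    fraction of the current bankroll on each 60/40 even-odds coin flip."""
    finals = []
    for _ in range(trials):
        bankroll = 1000.0
        for _ in range(n_bets):
            if random.random() < 0.6:
                bankroll *= 1 + fraction
            else:
                bankroll *= 1 - fraction
        finals.append(bankroll)
    finals.sort()
    return finals[trials // 2]

# Typical ordering: Kelly > Half Kelly >> Double Kelly > All-in (which hits $0).
for name, f in [("Half Kelly", 0.10), ("Kelly", 0.20),
                ("Double Kelly", 0.40), ("All-in", 1.00)]:
    print(f"{name:12s} median ${simulate(f):,.2f}")
```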
Ed Thorp and the Real-World Test
Mathematics is seductive precisely because it works on paper. The question is always: does it survive contact with reality?
In 1962, a mathematics professor named Edward O. Thorp published Beat the Dealer, the book that invented card counting. Thorp had proven — mathematically, rigorously — that blackjack could be beaten. The casinos, predictably, were annoyed. But Thorp's real secret wasn't card counting. It was Kelly.
Thorp used the Kelly Criterion to size every bet. When the deck was rich in face cards (high count), the player had an edge. Kelly told him exactly how much to bet. When the deck was neutral or unfavorable, Kelly said: bet the minimum. Not zero — the minimum, because you need to stay at the table to see the next hand.
The result: Thorp's bankroll grew at almost exactly the rate Kelly predicted. Not faster, not slower. The formula wasn't just theoretically optimal — it was practically optimal.
Thorp went on to found Princeton Newport Partners, one of the most successful hedge funds of the 1970s and 1980s. His annualized returns averaged over 20% for nearly two decades. His maximum drawdown was remarkably small. He never blew up. He never had a year that threatened to wipe him out.
The reason wasn't that Thorp was smarter than other investors (though he was). It was that he understood something they didn't: the size of your bet is a more important decision than the direction of your bet.
Long-Term Capital Management, by contrast, had several Nobel laureates on staff and some of the most sophisticated models on Wall Street. They knew what to bet on. They just didn't know how much. They used 25:1 leverage — the equivalent of betting 2,500% of your bankroll. Kelly would have told them this was suicide. Kelly was right. [2]
The Virtues of Cowardice
There's a persistent macho mythology around risk-taking. Fortune favors the bold. Go big or go home. Scared money don't make money.
Kelly says the opposite. Kelly says: be a coward. Be precise about exactly how much of a coward to be, and then be that coward with mathematical discipline.
Here's why half-Kelly is so popular among people who actually risk real money:
| Strategy | Growth Rate | Volatility | Max Drawdown | Survival |
|---|---|---|---|---|
| Quarter Kelly | ~44% of max | Very low | Small | Nearly certain |
| Half Kelly | ~75% of max | Moderate | Manageable | Very high |
| Full Kelly | 100% (optimal) | High | Severe | Theoretically certain |
| Double Kelly | 0% (same as not betting) | Extreme | Catastrophic | Very low |
Read that table again. Double Kelly has the same long-term growth rate as not betting at all. This is the mathematical punchline that nobody sees coming. Bet twice the Kelly amount and your expected geometric growth rate drops to zero. Bet more than twice Kelly and your expected growth rate goes negative — you're expected to lose money even though every individual bet has positive expected value.
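The collapse is easy to verify numerically. For the 60/40 even-odds coin, the expected log-growth per flip at bet fraction f is p·ln(1 + f) + q·ln(1 − f); a quick sketch:

```python
import math

def growth_rate(f: float, p: float = 0.6, q: float = 0.4) -> float:
    """Expected log-growth per even-odds flip when betting fraction f."""
    return p * math.log(1 + f) + q * math.log(1 - f)

print(round(growth_rate(0.20), 4))  # Kelly:        ≈ +0.0201 per flip
print(round(growth_rate(0.40), 4))  # double Kelly: ≈ -0.0025, essentially zero
print(round(growth_rate(0.50), 4))  # beyond that:  ≈ -0.0340, firmly negative
```

(For discrete bets the double-Kelly growth rate is approximately rather than exactly zero; the exact equality holds in the continuous approximation.)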
Half Kelly gives you 75% of the maximum possible growth rate while roughly halving your volatility (per-bet variance falls to a quarter). That's not a compromise — that's the best deal in mathematics. You give up 25% of your upside to sleep at night. Professional gamblers, elite poker players, and the best quantitative investors all converge on this independently.
The reason is that Kelly assumes you know your edge perfectly. You never do. In the real world, you think your win probability is 60%, but it might be 55% or 52% or, God forbid, 48%. Full Kelly with a misestimated edge is overbet Kelly — which, as we just saw, is where the dragons live. Half Kelly gives you a buffer for being wrong about how right you are.
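Here is the same growth-rate arithmetic with a misestimated edge (the numbers are illustrative, not from the original):

```python
import math

def growth_rate(f: float, p: float) -> float:
    """Expected log-growth per even-odds flip betting fraction f
    when the TRUE win probability is p."""
    return p * math.log(1 + f) + (1 - p) * math.log(1 - f)

# You believe p = 0.60, so full Kelly says bet f = 0.20. If the true p
# is only 0.55, the true Kelly fraction is 0.10, which makes your
# "full Kelly" actually double Kelly, and growth collapses:
print(round(growth_rate(0.20, 0.55), 4))  # ≈ -0.0001: years of betting for nothing
print(round(growth_rate(0.10, 0.55), 4))  # ≈ +0.0050: the true full Kelly
print(round(growth_rate(0.05, 0.55), 4))  # ≈ +0.0038: half of true Kelly still earns
```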
Beyond the Casino
Kelly's insight extends far beyond gambling, because the underlying math — multiplicative growth, the tyranny of variance, the difference between expected value and expected growth — operates everywhere.
Investing
Warren Buffett has said he concentrates his portfolio in his best ideas. Charlie Munger has explicitly endorsed Kelly-like reasoning. When Buffett put 40% of his partnership's assets into American Express during the 1963 salad oil scandal, he was making a Kelly bet: high conviction, large position, but not so large that being wrong would be fatal.
Career Decisions
Your career is a series of multiplicative bets. Joining a startup is a bet. Choosing a field is a bet. Moving to a new city is a bet. Kelly says: size these bets in proportion to your edge (your genuine advantage) and your odds (the payoff structure). Don't go all-in on a single startup unless you literally can't lose. And you always can.
Relationships
This one's trickier, but the structure is there. The people who blow up their lives — affair, fraud, betrayal — are almost always people who overbet: they risked everything for a marginal gain. The Kelly framework says: the amount you should risk on any opportunity is proportional to its edge. Most temptations have small edges and catastrophic loss potential. Kelly says: don't.
Health
Your body is your ultimate bankroll. You can't re-buy. Kelly says: never take risks with your health that have no proportional upside. Extreme sports for fun? Sure — the utility is real. Skipping sleep to grind? The edge (a few more hours of mediocre work) doesn't justify the bet (long-term cognitive decline).
The Deepest Lesson
Ellenberg ends How Not to Be Wrong with a plea for mathematical thinking as a way of being in the world. I want to end this extension with something more specific.
The Kelly Criterion teaches you that the relationship between risk and reward is not linear. This is Ellenberg's "linearity is local" insight applied to the most important domain of all: your own life.
Twice the risk does not give you twice the reward. It gives you the same reward as zero risk. Three times the risk gives you negative reward. The curve doesn't just flatten — it inverts. And it inverts at exactly the point where hubris tells you to push harder.
This is, perhaps, the deepest mathematical argument against recklessness ever formulated. Not a moral argument — Kelly doesn't care about morals. Not an emotional argument — Kelly doesn't care about fear. A mathematical argument: recklessness is not just dangerous, it's suboptimal. The reckless person doesn't just risk more. They actually earn less.
John Kelly died in 1965, at 41, of a brain hemorrhage on a Manhattan sidewalk. He never saw his formula become the foundation of professional gambling strategy, hedge fund portfolio theory, or the mathematical philosophy of risk. He never saw Thorp validate it, or LTCM violate it.
But the formula survived, because mathematics doesn't need its authors to keep being true. And this particular truth — that the optimal strategy is never the maximum strategy, that the best way to win is to precisely calibrate how much you're willing to lose — is as close to a universal life lesson as mathematics has ever produced.
Notes
1. Kelly, J.L. Jr. "A New Interpretation of Information Rate." Bell System Technical Journal, 35(4):917–926, July 1956. The paper frames gambling as a communication channel problem — the gambler receives information through a "noisy channel" (imperfect tips) and must determine the optimal rate of "transmission" (betting) to maximize long-term growth.
2. LTCM's collapse in 1998 is often attributed to the Russian financial crisis, but the root cause was leverage. Their models were sophisticated; their position sizing was not. They knew what to bet on — they just bet too much. Kelly would have capped their leverage at a fraction of what they used. See: Roger Lowenstein, When Genius Failed (2000).