There is a famous mathematical paradox called the St. Petersburg Paradox. It involves a game where you flip coins. In the version of the game I’ll use below, the pot starts at two dollars the first time you get heads, and every heads you flip after that doubles the pot. As soon as you get tails, the game ends, and you get whatever is in the pot. Now imagine yourself at a carnival, and some barker is selling this game. What would you pay? A dollar a game? Two? Five? Twenty? Keep in mind that you can play more than once, assuming you have the cash. What’s a fair price?
A quick analysis will tell you that you should win half the time, and that a win always pays at least two dollars. So a one dollar investment should always be covered. But what if the carnival is selling the game for more than a dollar? Half of all the wins will be exactly two dollars, but the other half will be four dollars or more, so your winning rounds should average above three dollars. Since half your rounds are winners, you’ll average at least $1.50 in winnings per round if you keep playing, so a buck-fifty is a fair price, right?
You can continue that logic and come up with ever higher fair prices. The claim from some mathematicians is that this game justifies any stakes, because its expected payoff is infinite, and so no matter what you pay, you will make money on this game. And yet most people are only willing to pay $10 or $20 to play. Some people would only pay two dollars. This is often cited as a prime example of how people’s intuition about statistics is wrong and irrational, and how our pathetic little ape brains are not up to the task of guiding us through the simplest of decisions.
I’ll cut to the chase right now and tell you that, despite the vast number of books in print and articles on the Internet on this subject, the most popular description of this paradox is about 90% bogus. It turns out that people are spot on in their hesitation to play this game, and if anything, most people are willing to pay too much, not too little.
First, a bit of clarification. I’ve described the problem in one particular way, where getting tails on the very first flip pays nothing. Other descriptions of this game begin differently. Some start with a two dollar pot, and some with a one dollar pot (for easy mathematical modeling: the value of the pot is two to the power of the number of heads flipped, even if you flip zero heads). In these versions, you literally win something no matter what. It turns out that none of this matters, and I’ll show you why in a bit.
So why do armchair mathematicians think it can be played at any price? Mathematicians want to know how this game will go over time, in theory. So they compute an expected value that sums up ALL the possible outcomes, all the way out to infinity. In other words, how does the game trend in the long run? And what they find is that if you play an infinite number of rounds, you will make an infinite amount of money, guaranteed.
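To see where that infinite answer comes from, here is a small sketch (the function name is mine) that computes partial sums of the expected value. In the version of the game described above, a round with exactly k heads before the first tails happens with probability (1/2)^(k+1) and pays 2^k dollars, so every term in the sum contributes fifty cents, and the total never converges:

```python
# Partial sums of the expected payoff per round. Each possible outcome
# (k heads, then tails) contributes (1/2)^(k+1) * 2^k = $0.50, so the
# total grows without bound as more outcomes are included.
def partial_expected_value(max_heads):
    return sum((0.5 ** (k + 1)) * (2 ** k) for k in range(1, max_heads + 1))

for n in (10, 100, 1000):
    print(n, partial_expected_value(n))  # 5.0, 50.0, 500.0: fifty cents per term
```

Truncate the sum anywhere and the expected value is modest; only by keeping every term out to infinity does the “worth any price” answer appear.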
And this is the part they got right. It is the only part they got right, and it is in fact a very important part if you’re immortal and want to spend the rest of eternity playing the most boring game ever devised.
But past that, the popular telling of this tale gets pretty much everything about the paradox wrong. Because apparently, most of us are not immortal, and would probably never play this game more than a few hundred thousand times in our insane quest to get rich and rule the world. And herein lies the problem.
At first glance, a game that never stops paying money sounds great. But the problem is, by running the game to infinity, mathematicians are essentially hiding a divide-by-zero mistake in their calculations. Or in this case, a multiply-by-infinity mistake, which turns out to be about the same thing.
One problem is with English: “playing to infinity pays infinitely” is not the same as “never stops paying money”. You might assume that if it pays an infinite amount, then playing only for a little while should pay a large, finite amount. But it doesn’t, because what the mathematicians fail to tell you (and some of them fail to think about entirely) is when this game will pay off. How soon do I get rich? It turns out that as the stakes increase, this game takes longer and longer to become profitable. Not just linearly longer, but exponentially longer.
I started my investigation by writing a program to simulate the game. I chose two dollars for the stakes, had the game running millions of rounds per second, and it made money hand over fist. I was rich (virtually). But then I changed the stakes to twenty five dollars, and this is when I knew something was wrong. After millions, even billions of rounds, I was still losing money. This made no sense if the paradox claims were correct. “Profitable at any price” should mean profitable even at a thousand dollars a game, or a million. But twenty five bucks was a bust?
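My program isn’t reproduced here, but a minimal sketch of such a simulation (function names and the seed are my own choices) might look like this:

```python
import random

def play_round():
    """One round: the pot starts at $2 on the first heads, doubles on each
    further heads, and the first tails ends the round and pays out the pot."""
    pot = 0
    while random.random() < 0.5:  # this flip came up heads
        pot = 2 if pot == 0 else pot * 2
    return pot

def simulate(stake, rounds, seed=1):
    """Net money after playing a number of rounds at a fixed price per round."""
    random.seed(seed)
    return sum(play_round() - stake for _ in range(rounds))

# A $2 stake tends to show a profit quickly; at $25 the same simulation
# typically shows a loss even after a million rounds.
print(simulate(stake=2, rounds=1_000_000))
print(simulate(stake=25, rounds=1_000_000))
```

Because the stake is just a flat per-round subtraction, running the same seed at two different stakes differs by exactly the stake gap times the number of rounds.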
So what’s happening here? Basically, as the stakes go up, you remove winning scenarios. At four dollars a play, you don’t make any profit in a round unless you get three or more heads in a row, which only happens 12.5% of the time. But it’s worse than that, because on the way to that winning round you had to sit through losing rounds, so you have to play even longer to make up for the money you lost while waiting, holding out for a really big win. It starts to sound like a losing proposition, but don’t worry — the mathematicians are right — that really big win is coming. They just forgot to tell you how long you’d have to wait if the stakes were too high.
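That 12.5% figure is easy to check exactly, since a round with k heads pays 2^k dollars with probability (1/2)^(k+1). A short sketch (the function name is mine):

```python
# Chance that one round pays strictly more than the stake: sum the
# probabilities of every run of heads whose pot exceeds that stake.
def single_round_profit_probability(stake, max_heads=60):
    return sum(0.5 ** (k + 1) for k in range(1, max_heads + 1) if 2 ** k > stake)

# At a $4 stake you need the pot to reach $8, i.e. three heads in a row:
print(single_round_profit_probability(4))  # about 0.125
```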
That’s the street view of the game, but it will also help to see an aerial view.
This is a graph of a typical game, played with no stakes at all. Of course, you make money no matter what. The graph either stays flat or goes up, and by the end of the 1800 rounds played here, we have made $14,370. We can use this same dataset of coin tosses and change the stakes. It’s easy: for each round played, we just subtract the stakes. This simply bends the graph downwards. You can see this in the graph below — red represents rounds where you have less money than you started with. Eventually the stakes are large enough that we do not make money.
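The replay trick is simple to sketch in code: generate one fixed sequence of payouts, then recompute the running total under different stakes (names and the seed are my own choices):

```python
import random

def play_round():
    """First heads starts the pot at $2, each further heads doubles it,
    and the first tails pays out the pot."""
    pot = 0
    while random.random() < 0.5:
        pot = 2 if pot == 0 else pot * 2
    return pot

random.seed(42)
payouts = [play_round() for _ in range(1800)]  # one fixed dataset of coin tosses

def trajectory(payouts, stake):
    """Cumulative winnings after each round, net of the stakes paid."""
    total, points = 0, []
    for p in payouts:
        total += p - stake
        points.append(total)
    return points

# Replaying the identical tosses at higher stakes just bends the curve down.
for stake in (0, 2, 5, 10):
    print(stake, trajectory(payouts, stake)[-1])
```

With no stakes the curve can never drop, and raising the stakes shifts every point down by a fixed amount per round played.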
We can do this for any finite set of rounds. No matter what great luck you might have in some set of coin tosses, we can merely set the stakes to higher than the value of your winnings. Even if you imagine a set of coin tosses in which you win a million dollars every single round, we can just set the stakes to two million dollars and you lose.
Incidentally, this is why I say that winning or losing money on that very first coin toss is irrelevant. Whether we win on that first toss or lose it, we can erase it either way if the stakes are high enough. It’s just a fixed extra benefit of one or two dollars, depending on the rule system in place.
This clearly disproves the most popular myth about this game. It is not profitable at any price, unless of course you can actually play forever (and actually want to). If you can set the stakes to make a ridiculously unlikely game unprofitable, clearly you can also set the stakes such that a typical game is unprofitable.
So the game is only infinitely profitable if you play for infinity. It is theoretically profitable, but it will quite literally take forever to guarantee a profit. For any time less than infinity, it will only produce finite winnings which are easily offset by some level of stakes.
At this point, you might be wondering, is the paradox even true at all? By these graphs, it looks like you’d never make money. Yes it looks that way, but it’s because the graphs don’t go all the way to never. Look at it this way — if you have forever, you are guaranteed at some point to have a giant win that offsets whatever you have paid. Because, FOREVER. That’s what infinite means. Infinity offsets any bad odds, no matter how bad. But it’s also why I said that it’s a cheat for mathematicians to take this game to infinity. Multiplying by infinity really is the same as dividing by zero. You get an answer that is meaningless for us. Undefined. We’ll never live to see the payoff that makes this game profitable.
So how much should you pay to play this game then? Well, there is always the chance of lightning striking. You could pay whatever price, and get lucky. That first round you play could pay you a billion dollars. But the odds are, basically, a billion to one against that. So mostly, you would not win a billion dollars.
This is important because it turns out that how long you are going to play affects your odds of winning. This should be obvious to anyone who’s played the lottery. Your odds of winning with 100 lottery cards are better than with only one. And if you could put down as many lottery cards as you wanted, there is obviously a point at which winning becomes likely. Just cover half the possible numbers and there’s a 50/50 chance of winning. Cover all the numbers and you are guaranteed to win.
In the St. Petersburg paradox, though, there’s no way to play long enough to cover all the numbers (except, you know, infinity), because you never run out of ever decreasing odds for ever greater winnings. But it turns out that it is possible to cover half of the chances, just like in the lottery. In other words, if you play long enough you eventually reach a 50% chance of making a profit.
Now I’m pretty good with math concepts, but honestly, I was never good at that symbolic math stuff. I could probably figure it out if I had to, but since I’m a programmer I just wrote a program to find the answer empirically. It tries different stakes, and different numbers of rounds, millions of times, until it approximates where the 50% winning point is reached for any given stakes.
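My program isn’t reproduced here, but a stripped-down sketch of the same idea might look like this (function names, trial counts, and the doubling search are my own simplifications):

```python
import random

def play_round():
    pot = 0
    while random.random() < 0.5:  # heads
        pot = 2 if pot == 0 else pot * 2
    return pot  # first tails pays out the pot

def profit_probability(stake, rounds, trials=2000, seed=7):
    """Estimate the chance that a session of `rounds` rounds ends in profit."""
    random.seed(seed)
    ahead = 0
    for _ in range(trials):
        if sum(play_round() - stake for _ in range(rounds)) > 0:
            ahead += 1
    return ahead / trials

def rounds_for_even_odds(stake, limit=100_000):
    """Double the session length until the odds of profit reach 50%."""
    n = 1
    while n <= limit:
        if profit_probability(stake, n) >= 0.5:
            return n
        n *= 2
    return None
```

For small stakes this search finishes in a handful of rounds; the session lengths it needs explode as the stake climbs.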
If you pay a dollar a game, you’ll have a better than 50/50 chance of making a profit after only three rounds. If you pay two dollars per game, you’ll have to play seven rounds for that same 50/50 chance. $3 requires 23 rounds, $4 takes 88 rounds, and a stake of $5 needs 350 rounds. For a ten dollar stake, you’d have to play about 360,000 rounds before you have a 50/50 chance of making money. Based on this empirical data, I came up with an equation that approximates this expansion: to have a 50/50 chance of profit, you have to play (4^s)/2.9 rounds, where s is the stakes. At twenty dollars a game, nobody could live long enough to have a good chance of making a profit. And if I extrapolate these values out, a stake of $100 would be profitable only after about 554116566985858715704124859427987104318001032338894081138405 rounds.
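As a sanity check on that fit, plugging stakes into (4^s)/2.9 reproduces the measured figures at the larger stakes (the function name is mine; the fit is rough at very small stakes):

```python
def rounds_to_even_odds(stakes):
    """The empirical fit above: rounds needed for a 50/50 chance of profit."""
    return (4 ** stakes) / 2.9

print(round(rounds_to_even_odds(5)))      # about 350 rounds at a $5 stake
print(round(rounds_to_even_odds(10)))     # about 360,000 rounds at $10
print(f"{rounds_to_even_odds(100):.2e}")  # about 5.5e59 rounds at $100
```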
Whoa. So this claim that you should pay any price for this game because it pays infinitely well? It’s not just wrong, as I showed before, it’s horribly wrong. It’s the WORST. ADVICE. EVER. You should pay almost nothing for this game. The people willing to pay twenty dollars per round for this game are being irrational because that is too much, not too little, to pay. It turns out our human intuition on this game is better than your average armchair mathematician’s insight.
Now we are prepared to address how much to pay for this game. If we are talking about actually playing the game with real cash and real coins, each round might be completed in about 15 seconds. If so, then a seven dollar stake would need about 5,600 rounds or 24 hours to reach a 50/50 chance of being ahead. Even if you are ahead at that point, you may not have made minimum wage for your time invested. At eleven dollars a game, you’d spend an entire year (with regular sleep) just to reach the 50/50 point. And at fifteen dollars a game, you couldn’t live long enough.
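Those time estimates follow directly from the (4^s)/2.9 fit at 15 seconds per round; a quick back-of-the-envelope sketch (the function name is mine):

```python
def hours_to_even_odds(stakes, seconds_per_round=15):
    """Nonstop play time to reach the 50/50 point, using the (4^s)/2.9 fit."""
    rounds = (4 ** stakes) / 2.9
    return rounds * seconds_per_round / 3600

print(f"$7 stake: {hours_to_even_odds(7):.0f} hours")                # about a day
print(f"$11 stake: {hours_to_even_odds(11) / 24:.0f} days nonstop")  # a year with sleep
print(f"$15 stake: {hours_to_even_odds(15) / (24 * 365):.0f} years") # past a lifetime
```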
The only reasonable conclusion at this point is that were such a game offered to you, it probably makes sense to spend five bucks or less per round on it as an afternoon diversion. Someone who doesn’t care about money may make a case for spending more, and someone who only has a couple bucks to their name shouldn’t even gamble it, but these situations go beyond mathematical choice and on to more philosophical questions related to the value of money and how you want to live your life.
I should make it clear that I’m not claiming this as my discovery. Informed mathematicians are perfectly aware of the real payout of this game. In 1949, Hugo Steinhaus came up with an explanation which is vaguely similar to what I’m saying here, but one that is way more mathy. And there’s this web article with a pretty good explanation, using a fancy name of “ergodicity”, which as far as I can tell means that it’s a bad idea to multiply by infinity. Or to be more accurate, it means that in this particular problem calculating over a finite set of rounds is reasonable, while attempting to calculate a complete set of all possible outcomes is not reasonable.
Heck, even Wikipedia offers up several explanations of why this paradox doesn’t quite pan out. Despite this, many amateur and even professional mathematicians still recite this paradox as the height of theoretical perfection, or a basis for maligning human intuition. Why does this alleged paradox persist? The truth is out there, but the explanations tend to be rather complicated. Hopefully I’ve managed to demonstrate the reality of this so-called paradox without the need for higher math and words like “ergodicity”. Now if anyone tries to pull this trick on you, you can calmly and rationally explain to them that your ape intuition is better than their mathematical skills.