A forum for good logic/math puzzles.


Superisis
Posts: 48
Joined: Fri Sep 17, 2010 8:48 am UTC

Don't know if this is supposed to go into the Mathematics section since it's more of a paradox than a puzzle, but I think it applies to this area.

This is a version of the double-down betting strategy (i.e. in a fair double-or-nothing game you bet double the amount you previously bet if you lost that previous game. This way you always end up winning your original bet unless, of course, you run out of money first).

A gambler is offered the chance to play a game. In order to play he must bet all of his money. The chance to win is 50%, the chance to lose 50% (no ties). If he loses, his bet is forfeit. If he wins, he gains five times the bet. Thus, if the money he has is A, the expected gain from one play is (1/2)*5A - (1/2)*A = 2A. He can play this game an unlimited number of times. Should he play the game?

If yes, then how many times?
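The arithmetic in the puzzle is easy to sanity-check. A minimal Python sketch, assuming "gains five times the bet" means you end with six times your stake after a win (which is what makes the net gain 5A):

```python
import random

def play_once(stake):
    """One round: 50/50 coin; a win pays five times the stake on top of the stake."""
    return stake * 6 if random.random() < 0.5 else 0

# Analytic expected *gain* for a stake A: 0.5 * 5A - 0.5 * A = 2A.
A = 1.0
expected_gain = 0.5 * 5 * A - 0.5 * A
print(expected_gain)  # 2.0

# Monte Carlo check of the same quantity.
random.seed(1)
trials = 100_000
avg_gain = sum(play_once(A) - A for _ in range(trials)) / trials
print(round(avg_gain, 1))  # close to 2.0
```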

t1mm01994
Posts: 299
Joined: Mon Feb 15, 2010 7:16 pm UTC
Location: San Francisco.. Wait up, I'll tell you some tales!

Not so much a paradox as a preference problem. I myself would probably play it once and be done with it.

For maximum expected gain, play until you die / lose, whichever comes first (probably lose)
For any other gain, do however you please. Really, could you explain the paradox in this case?

Snark
Posts: 425
Joined: Mon Feb 27, 2012 3:22 pm UTC

Math tells you to keep playing indefinitely. Common sense tells you not to play, or to stop after a reasonable number of plays (since you'll almost surely lose all your money at some point, and very likely within a dozen or fewer plays). Herein lies the paradox.

In the real world, it depends on the value of A dollars. If A were small, I'd probably test my luck 4 times or so. If A were large, I personally would hold onto my money.
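The "dozen or fewer plays" figure checks out: surviving n all-in rounds takes n straight wins, so the ruin probability is 1 − 0.5^n. A quick check:

```python
# Chance of still being solvent after n all-in plays is 0.5**n,
# so the chance of having gone broke at least once is 1 - 0.5**n.
for n in (1, 5, 12, 20):
    print(n, 1 - 0.5 ** n)
```

At n = 12 the ruin probability is already above 99.97%.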
Dashboard Confessional wrote:I want to give you whatever you need. What is it you need? Is it within me?

Avatar by Matt

mike-l
Posts: 2758
Joined: Tue Sep 04, 2007 2:16 am UTC

Essentially you get to play a single game where you win 1/2^n of the time and get 5^n dollars for it. Obviously the expectation increases in n, but most people's utility function is concave, so there is some point where the expected utility is maximized. E.g., I would prefer a 1/2 chance at a billion dollars over a 1/4 chance at 5 billion, because any amount of money over a few (probably tens of) million dollars has roughly equal utility to me.

Personally, I'd play until I had around 50 grand, that feels like roughly the point where quintupling my money only roughly doubles my utility.
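One way to make the concavity argument concrete is a power utility u(w) = w^α, a hypothetical stand-in (smaller α means more risk-averse), not anyone's actual preferences. Under "sextuple or bust" dynamics, expected utility after n rounds starting from wealth A is A^α · (6^α / 2)^n, so the whole decision reduces to whether 6^α exceeds 2:

```python
import math

# For power utility u(w) = w**alpha (0 < alpha <= 1), expected utility after
# n all-in rounds from wealth A is (1/2**n) * (A * 6**n)**alpha
#   = A**alpha * (6**alpha / 2)**n,
# which grows with n exactly when 6**alpha > 2.
threshold = math.log(2) / math.log(6)
print(round(threshold, 3))  # ~0.387

def expected_utility(A, alpha, n):
    return (A * 6 ** n) ** alpha / 2 ** n

# A mildly concave player (alpha = 0.5) wants to keep playing...
print(expected_utility(1, 0.5, 2) > expected_utility(1, 0.5, 1))  # True
# ...while a strongly concave one (alpha = 0.3) prefers not to play at all.
print(expected_utility(1, 0.3, 1) < expected_utility(1, 0.3, 0))  # True
```

Note that a constant-elasticity utility never produces an interior stopping point like "quit around 50 grand"; capturing that needs a utility that flattens out past some wealth level.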
addams wrote:This forum has some very well educated people typing away in loops with Sourmilk. He is a lucky Sourmilk.

Adam H
Posts: 1267
Joined: Thu Jun 16, 2011 6:36 pm UTC

I wouldn't play. I would assign values of something like 0 to A, -100 to 0A, and 10 to 5A. So my expected value is <0. (I read the question as A is literally all of my assets.)

Math doesn't tell you to play indefinitely, faulty assumptions tell you to play indefinitely - specifically, that dollars are valued the same no matter how many of them you have.

Superisis
Posts: 48
Joined: Fri Sep 17, 2010 8:48 am UTC

Well, personal assumptions/values can allow you to do anything (thus beating any action paradox). But if dollars are valued equally at all times (i.e. no decreasing marginal utility), then I'd assume that one would always play the game, thus playing an unlimited number of times. But the expected value of playing unlimited times is 0. Hence it's a paradox, no? Or am I missing something?

Adam H
Posts: 1267
Joined: Thu Jun 16, 2011 6:36 pm UTC

Sure, you can think of it as a paradox. I think it's quite similar to the Arrow Paradox, actually. Infinity messes simple math up.

The expected value of playing the game N times is always >0, no matter how big N is. The limit as N goes to infinity doesn't converge, it just gets bigger and bigger. However, if you calculate it at "infinity", I think you basically get EV=0*infinity, which is just nonsense.

So no, I don't think you can say that the EV of playing infinite times is 0. I'm pretty sure it's either undefined or infinite. Someone who knows calculus could tell us: limit as x goes to infinity of (0.5^x * 5^x), please!

Snark
Posts: 425
Joined: Mon Feb 27, 2012 3:22 pm UTC

Should equal lim(2.5^x) as x goes to infinity, which is infinite.
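Numerically the simplification is just (0.5 · 5)^x = 2.5^x; a two-line check (floating point makes the two sides agree only approximately):

```python
# 0.5**x * 5**x equals 2.5**x, which blows up as x grows.
for x in (1, 10, 100):
    print(x, 0.5 ** x * 5 ** x, 2.5 ** x)
```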

Adam H
Posts: 1267
Joined: Thu Jun 16, 2011 6:36 pm UTC

So what I needed was algebra, not calc.

OK then, the Expected Value (of playing the game forever) is infinite. The only "paradox" is why playing an infinite EV game is obviously irrational, but that's already been explained.

Superisis
Posts: 48
Joined: Fri Sep 17, 2010 8:48 am UTC

While that is true, probability theory also says (unless I've missed something) that any event with a probability greater than 0, if repeated infinite times, will happen. Since not losing a game = 50%, winning x times in a row = (1/2)^x. As x goes to infinity the probability of not losing goes to zero, hence at some point he will lose. If he loses, he'll lose everything and cannot play anymore. Ergo he expects to lose at some point. If he loses just once the game is over (since he's got nothing more to bet) and he's at 0. Hence expected value = zero, making it a paradox. Unless I've done something wrong.

Playing an infinite EV game isn't irrational, though at some point one might want to spend some of that money (but the rest one would continue to invest, otherwise we wouldn't have people like Warren Buffett who has way more than he can spend on consumption).

Snark
Posts: 425
Joined: Mon Feb 27, 2012 3:22 pm UTC

Superisis wrote:While that is true, probability theory also says (unless I've missed something) that any event with a probability greater than 0, if repeated infinite times, will almost surely happen.


Superisis
Posts: 48
Joined: Fri Sep 17, 2010 8:48 am UTC

Ah. Thanks for the link. So I guess that resolves it.

mward
Posts: 123
Joined: Wed Jun 22, 2011 12:48 pm UTC

As Superisis points out: if you ever lose a bet in this game, you end up broke and cannot play any more. Given that the chance of losing on each bet is 50%, the more times you bet, the greater your chance of ending up broke. After 10 bets, you are broke over 99.9% of the time. After 20 bets you are broke over 99.9999% of the time. The "paradox" is resolved by noting that "probability of losing" is something different from "expected value of playing". In an infinite sequence, the probability of losing can be (almost) 1, even while the expected value is infinite.
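Both halves of that resolution are easy to tabulate; a sketch assuming a win multiplies wealth by 6 (so expected wealth triples per round):

```python
# After n all-in rounds: P(broke) = 1 - 0.5**n, yet expected final
# wealth (starting from $1, 6x on a win, 0 on a loss) is still 3**n.
for n in (10, 20):
    print(n, 1 - 0.5 ** n, 3.0 ** n)
```

The ruin probabilities match the 99.9% and 99.9999% figures above, while the expectation keeps exploding.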

Before taking a bet, I would take both factors into consideration: doing so suggests that this bet is not such a good deal.

On the other hand, the opposite kind of bet, where the expected value is negative but the risk of losing everything is reduced, is attractive to most people (myself included). This kind of bet is called "insurance". It is also attractive to the insurance company: they are not likely to go broke paying out claims, so the positive (for them) expected value is an attraction.

If you think that any bet with a positive expectation is worth making, how about this one:

The pot starts at $1 and we toss a fair coin. As soon as a tail appears, you win what is in the pot. After every head, the pot is doubled. If you work out the maths, then the expected value is infinite. So you should be willing to pay any amount to play this game.

Anyone willing to play for $1,000 a go? I am willing to bankroll the game at this level, because even though my expected gain is minus infinity, my probability of actually making a loss is so small as to be not worth considering.

If you still want to play, you might want to check out http://en.wikipedia.org/wiki/St._Petersburg_paradox first.
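A quick simulation of the coin game, plus the banker-side arithmetic: at $1,000 a go, the banker only loses money on a single play if the pot reaches $1,024, which takes ten consecutive heads (probability 2^-10 ≈ 0.001). A sketch assuming a fair coin and a $1 starting pot:

```python
import random

def st_petersburg():
    """Pot starts at $1 and doubles on each head; the first tail pays out the pot."""
    pot = 1
    while random.random() < 0.5:
        pot *= 2
    return pot

# Banker's risk at a $1,000 entry fee: losing needs a pot of $1,024 or more,
# i.e. at least 10 heads in a row.
print(0.5 ** 10)  # 0.0009765625

# Despite the infinite EV, typical payouts are tiny.
random.seed(0)
payouts = sorted(st_petersburg() for _ in range(100_000))
print(payouts[len(payouts) // 2])  # median payout: $1 or $2
```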

douglasm
Posts: 630
Joined: Mon Apr 21, 2008 4:53 am UTC

mward wrote:Anyone willing to play for $1,000 a go? I am willing to bankroll the game at this level, because even though my expected gain is minus infinity, my probability of actually making a loss is so small as to be not worth considering.

If you still want to play, you might want to check out http://en.wikipedia.org/wiki/St._Petersburg_paradox first.

I'd be up for that, with two conditions: No payout is made by either side until I decide to stop playing, and we use computers and probability calculations for however many times I decide to play. I'd be willing to settle for the lowest total amount I have a 99% or higher chance of winning.

I imagine the xkcd number of iterations of the game should be enough to bankrupt you. If not, I could start adding some of Conway's arrows.

If such attempts to force the expected gain to actually happen are forbidden, then I'll join you on the bankrolling side.
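A back-of-the-envelope estimate (not a rigorous bound) of what that strategy requires: in n plays, each payout level $2^k contributes about $0.50 to the sample mean, and levels rarer than one-in-n essentially never show up, so the average of n plays is roughly log2(n)/2 dollars. Getting that average up to a $1,000 entry fee therefore needs on the order of 2^2000 plays:

```python
import math

# Sample mean of n St. Petersburg plays is roughly log2(n) / 2 dollars,
# so matching a $1,000 entry fee needs log2(n) / 2 ~ 1000, i.e. n ~ 2**2000.
digits = int(2000 * math.log10(2)) + 1
print(digits)  # 603: the required number of plays has about 603 digits
```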

mward
Posts: 123
Joined: Wed Jun 22, 2011 12:48 pm UTC

douglasm wrote:I'd be up for that, with two conditions: No payout is made by either side until I decide to stop playing, and we use computers and probability calculations for however many times I decide to play. I'd be willing to settle for the lowest total amount I have a 99% or higher chance of winning.

No problem with settling up at the end: just as long as you don't bet more than you can afford. You'll have to settle up immediately if your debt ever gets within $1,000 of your total assets.

mward
Posts: 123
Joined: Wed Jun 22, 2011 12:48 pm UTC

Given a casino with total assets A and a gambler with total assets B, where the gambler plays the St Petersburg game against the casino until one or the other is bankrupt: what is the entry price such that the probability of the gambler "breaking the bank" equals the probability of the gambler losing?
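One way to attack this numerically is brute force: simulate the duel for a given entry price, estimate the break-the-bank probability, then search over prices for where it crosses 0.5 (since one side always goes broke eventually, the two probabilities are equal exactly there). A rough Monte Carlo sketch; all parameters are placeholders:

```python
import random

def st_petersburg_payout():
    """Pot starts at $1, doubles on each head, is paid out on the first tail."""
    pot = 1
    while random.random() < 0.5:
        pot *= 2
    return pot

def gambler_breaks_bank(casino, gambler, price):
    """Play until someone is broke; True if the casino goes bust first."""
    while gambler >= price and casino > 0:
        gambler -= price
        casino += price
        payout = min(st_petersburg_payout(), casino)  # casino can't pay more than it holds
        casino -= payout
        gambler += payout
    return casino <= 0

def break_bank_probability(casino, gambler, price, trials=1000):
    wins = sum(gambler_breaks_bank(casino, gambler, price) for _ in range(trials))
    return wins / trials

random.seed(0)
p = break_bank_probability(casino=10_000, gambler=100, price=2)
print(p)
```

Bisecting on `price` until the estimate settles near 0.5 gives the fair entry fee for a given pair of bankrolls.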

Turtlewing
Posts: 236
Joined: Tue Nov 03, 2009 5:22 pm UTC

The expected value function given in the OP is misleading because it only calculates the expected value of one round.

In the iterated game the expected value quickly goes to "you'll loose everything" (remember a single loss at any point means you are bankrupt). When deciding how many rounds to play you need to use the iterated version. Most people's intuition is good enough to realize this even if they can't concisely explain it.

douglasm
Posts: 630
Joined: Mon Apr 21, 2008 4:53 am UTC

By the usual definition of "expected value" - the weighted arithmetic mean of outcomes - even the iterated game has a high and quickly increasing expected value. Yes, you are almost guaranteed to lose everything, but if you get lucky your winnings will be truly ridiculously huge.

It's very similar to a major real world lottery, except with a jackpot in the trillions of dollars. If you had enough money to buy millions of tickets, you could actually reliably make a lot of money by playing such a lottery because a single win would more than pay for every single ticket you bought. The OP game limits you to a single ticket and sets the price of that ticket at "everything you own", but neither of these factors changes the fact that average winnings are positive. Human intuition just tends to discount the jackpot by regarding the probability of winning it as equivalent to 0.

BlueSoxSWJ
Posts: 25
Joined: Tue Apr 17, 2012 4:09 am UTC

Turtlewing wrote:The expected value function given in the OP is misleading because it only calculates the expected value of one round.

In the iterated game the expected value quickly goes to "you'll loose everything" (remember a single loss at any point means you are bankrupt). When deciding how many rounds to play you need to use the iterated version. Most people's intuition is good enough to realize this even if they can't concisely explain it.

First, unrelated: Lose not loose.

Second, related:
There are two meanings of utility in the economic sense, and both apply here. The first, as mentioned, is the declining utility of money. Even if a bet has a strictly positive expected value measured in cash, it may not have a positive expected value measured in value to the player. Second, the utility of an investment is a function of that investment's expected return, the investment's variance, and the investor's level of risk-aversion. In other words, even if we don't scale to the point where the declining utility of money is relevant, we typically still require a positive expected value in order to take on risk (the existence and profitability of casinos notwithstanding). Even if our original game starts out with a one dollar bet, the variance grows much faster than the expected payoff.
1 iteration: EV = +2, σ² = 9
2 iterations: EV = +8, σ² = 243
3 iterations: EV = +26, σ² = 5103 (values measured from start of game, not each round)
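The scaling can be checked directly; a sketch under the same reading as the EV numbers (a win multiplies wealth by 6, a single loss busts you), starting from $1:

```python
# Exact mean and variance of the net gain after n all-in rounds,
# starting from $1: win all n (prob 0.5**n) and net 6**n - 1, else net -1.
def stats(n):
    p_win = 0.5 ** n
    gain_win = 6 ** n - 1
    mean = p_win * gain_win + (1 - p_win) * (-1)
    var = p_win * (gain_win - mean) ** 2 + (1 - p_win) * (-1 - mean) ** 2
    return mean, var

for n in (1, 2, 3):
    print(n, stats(n))
```

The variance comes out as 18^n − 9^n, so the standard deviation grows like 18^(n/2) ≈ 4.24^n against a mean of 3^n − 1: the risk term swamps the return term.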

A gambler may take a big risk for the thrill or for the excitement of the game. But the "typical investor," even at starting stakes of \$1, will quickly reach the point where the utility of the "investment" in the game is negative*, and thus won't play.

* (In a truly theoretical economics world, the investor won't wait for utility to turn negative before declining to play; he'll stop as soon as the utility of the game falls below the utility of his best alternative investment opportunity.)

Edit: whimsical question - the question just says "bet all his money," but we're interpreting this to mean all the gambler's assets, not just the cash on him. The "value" of this game changes drastically if we assume different meanings to "bet all his money." If it means just the money in his pocket, he should almost certainly play. If it means the total value of all his cash in bank accounts, he probably shouldn't play, but it depends on how his total assets are distributed between various investments. If he has low cash in his bank accounts but a lot of equity in real estate, he probably plays, and can replenish the bank accounts with a home loan if he loses. If it means all his assets, this game is probably not only a bad bet (from a risk-reward utility perspective), but how bad depends heavily on the gambler's total debt outstanding, since that doesn't go away. Unless of course, it means his total net worth (assets minus debt outstanding). So, if that's the meaning, and the gambler is in net debt, does he "win" by losing the game?
Last edited by BlueSoxSWJ on Sat May 19, 2012 9:22 pm UTC, edited 1 time in total.

Tass
Posts: 1909
Joined: Tue Nov 11, 2008 2:21 pm UTC
Location: Niels Bohr Institute, Copenhagen.

Adam H wrote:I wouldn't play. I would assign values of something like 0 to A, -100 to 0A, and 10 to 5A. So my expected value is <0. (I read the question as A is literally all of my assets.)

Actually, even if the only amount I was allowed to bet were all my financial assets (not health and education, mind), being young, I think I would play once.

If I lost it would be a setback, but most of my productive years are ahead of me, and most people my age do not have the savings that I have anyway. If I won, I'd almost be set for life.

mfb
Posts: 950
Joined: Thu Jan 08, 2009 7:48 pm UTC

douglasm wrote:It's very similar to a major real world lottery, except with a jackpot in the trillions of dollars. If you had enough money to buy millions of tickets, you could actually reliably make a lot of money by playing such a lottery because a single win would more than pay for every single ticket you bought.

A lottery with positive expectation value? Where?
And where does it get the money from?

elasto
Posts: 3760
Joined: Mon May 10, 2010 1:53 am UTC

mfb wrote:A lottery with positive expectation value? Where?
And where does it get the money from?

It's when lotteries have rollovers: no one wins one particular week, so the prize money rolls over into the following week.

(When calculating the expectation you have to take into account the probability of splitting the win with others, however, which sometimes people fail to do.)

mike-l
Posts: 2758
Joined: Tue Sep 04, 2007 2:16 am UTC

elasto wrote:(When calculating the expectation you have to take into account the probability of splitting the win with others, however, which sometimes people fail to do.)

Not really: as long as you are picking your tickets randomly, each ticket has the same EV. So you just have to compare total payout to total receipts. Lotteries in North America, at least, are required to post their payout information. Lotto Max, for example, pays out 48% of receipts plus any carryover. So as soon as the carryover exceeds 52% of ticket sales, the EV is positive. Now, every time a carryover occurs, ticket sales go up, so you do need to take that into account. In an average Lotto Max drawing, about $1.08 of every $5 goes to the largest prize pool, or 21.6%, which is what's generally carried forward. (Lower amounts technically can be, but it's very rare for that to occur.) So after 3 carry-forwards with constant ticket sales, the EV would be positive.

In general, the total number of tickets bought in the lotteries carried forward to this one must be at least 2.4x the number of tickets that will be sold to make for positive EV.
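The break-even arithmetic above is easy to tabulate (using the quoted Lotto Max figures, 48% payout plus carryover and 21.6% of sales feeding the carried-forward pool, as given, not independently verified):

```python
# EV is positive once the carryover exceeds 52% of this drawing's ticket sales.
def ev_positive(carryover, ticket_sales):
    return carryover > 0.52 * ticket_sales

sales = 1.0  # constant sales per drawing, in arbitrary units
for n_rollovers in (1, 2, 3):
    carryover = 0.216 * sales * n_rollovers
    print(n_rollovers, ev_positive(carryover, sales))  # positive only at 3
```

Note 0.52 / 0.216 ≈ 2.4, which is where the 2.4x figure comes from.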

mward
Posts: 123
Joined: Wed Jun 22, 2011 12:48 pm UTC