Jeremy Zukerman
Posted on Thursday, 20 May, 2004 - 04:09 pm:

My question is about the St. Petersburg coin-flip paradox: what happens if you play the game twice, compared to playing the game once? Will you win twice as much money as if you play the game once? If so, how can that be, given that the expected winnings are infinite in both cases?
In normal games of chance, where the expected winnings are finite, if you play a game twice you normally win twice as much money as if you play the game once. What am I missing?
Basically it boils down to this question: let's say I play the St. Petersburg game once. I count how much money I make.
Then I play the game twice, and I count how much money in total I make from the two games.
Then I compare the winnings from the first game to the total winnings of the other two games I played. Which should be higher? Or should they be equal?
My first possible answer: 2 * infinity = infinity, so they should be equal?
My second possible answer: the expected winnings are the same (infinite), but you should win twice as much money in total from the twice-played game as from the once-played game?
What is the correct answer, or are both my possible answers incorrect?
Please explain.
Thank you very much
Arun Iyer
Posted on Thursday, 20 May, 2004 - 05:30 pm:

What a lovely paradox!!
I just read it here.

Now after reading through that rather long article, I come back to your question. As I understand it, maybe you are taking the question from the wrong perspective (or is it that the article I just quoted is wrong?). As I understand the article, the question is: "A game where you just have to flip a coin. The game ends as soon as you get a tail (or head). If you got the tail (or head) after n trials, then you get 2^n dollars. Given this, how much money are you willing to place as an initial bet?"

Now it could be that I am misunderstanding your question, in which case can you clarify it, please?

love arun
Gale Greenlee
Posted on Thursday, 20 May, 2004 - 06:20 pm:

Jeremy: I believe the answer to your question is simply "yes". If you play the game twice, instead of just once, I believe you can "expect" to win twice as much money.

But how much is that?

GALE
Jeremy Zukerman
Posted on Thursday, 20 May, 2004 - 07:18 pm:

From Jeremy, the original poster of this message.
In response to Arun: thank you for responding. In response to Gale: thanks for responding. Yes, the hyperlink you mentioned is the paradox, as is what you described in your response to my question. But my question is a bit simpler. Forget the part of the paradox where the question arises of "how much are you willing to place as an initial bet"; cut that out of my St. Petersburg question. I just want a comparison of how much money is made: the expected winnings of playing the game once, as opposed to the total expected winnings of playing the game twice. Are the total expected winnings double when you play the game twice, or are they the same?
I am going to put your explanation of the game (but without the part about the initial bet) here so others can understand the paradox.
St. Petersburg paradox: a game where you just have to flip a coin. The game ends as soon as you get a tail (or head). If you got the tail (or head) after n trials, then you get 2 to the power of n dollars ($2^n). The game's expected winnings are infinite.

n    P(n)    Prize    Expected payoff
1    1/2     $2       $1
2    1/4     $4       $1
3    1/8     $8       $1
4    1/16    $16      $1
5    1/32    $32      $1
6    1/64    $64      $1
7    1/128   $128     $1
etc.

n stands for the number of the flip on which the first tail (or head) appears
P(n) stands for the probability of n happening
Prize stands for how much money you would make if this event happened
Expected payoff refers to how much you would expect to make on average, given the probability of this event happening
Note the expected total payoff is the sum $1 + $1 + $1 + $1 + ... forever, which equals infinity
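To see that concretely, here is a minimal Python sketch (my illustration, not part of the original problem statement) that sums the expected payoff truncated at N flips. Every term contributes exactly $1, so the truncated expectation is simply $N, and it grows without bound:

```python
def truncated_expectation(max_flips: int) -> float:
    """Expected payoff counting only games that end within max_flips tosses."""
    return sum((0.5 ** n) * (2 ** n) for n in range(1, max_flips + 1))

for n in (10, 100, 1000):
    print(n, truncated_expectation(n))  # 10.0, 100.0, 1000.0: one dollar per term
```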
Yifan Huang
Posted on Thursday, 20 May, 2004 - 07:41 pm:

Jeremy: It is like the problem of 2 * (1/0). No matter how small c is, (2a/c)/(a/c) is still 2. I guess it is still true when c is 0.

But I am confused by Hacking's estimate that "few of us would pay even $25 to enter such a game." What does that mean?

Yifan
Graeme McRae
Posted on Thursday, 20 May, 2004 - 07:55 pm:

To Gale:
Every now and then I play the California state lottery. The probability of winning a small prize, about $5, is maybe 1/20, so this contributes $0.25 to my expected winnings. The probability of winning a medium prize, about $5000, is maybe 1/20,000, so this contributes another $0.25 to my expected winnings. The probability of winning the big prize of $5 million is 1/20 million, contributing another $0.25.

(These numbers are approximate made-up numbers just to illustrate a point)

Continuing, it's possible that a really, really big prize, say $5 billion could be added, with a correspondingly small probability, which would contribute another $0.25 to the expected payout of one lottery ticket.

And I can imagine a prize after that.

I think I can imagine any finite number of prizes.

But I absolutely cannot imagine an infinite number of prizes, each adding a fixed amount to the expected value of my lottery ticket. There is no paradox. The failure is in me. Even though I think I can imagine an infinite set of integers (at least I can say things about it, such as "if you pick one and show it to me, then I can pick a bigger one and show that to you"), I just can't imagine an infinite set of outcomes, each with a non-zero probability, and each with a fixed expected value.

If my brain were not so limited, then I would "feel" the infinite nature of the St. Petersburg wager, and I would not just know but believe that playing it twice would yield no greater expected value than playing it once.

The article cited by Arun suggests that a person's intuition that the expected value of the St. Petersburg Wager is some finite number comes from various kinds of limitations in the "utility" of winning some of the larger prizes, or from the game itself being limited by the amount of money in the world, etc. Each of these arguments has counter-arguments, all made very cogently in the article. But (unless I missed it) I didn't see the argument I'm making which explains the paradox: that while any single prize and its associated probability can be imagined, it is not possible (at least not in my brain) to imagine *all* the prizes -- an infinite number -- at the same time. (Thinking back to the set of integers, which I kid myself that I can imagine, I now realize I only imagine it a few at a time, like drops of water in an infinite ocean -- I never find myself needing to make decisions based on *all* the integers simultaneously, because if I had to do that, as I think one has to do to understand the St. Petersburg Wager, I would fail.)

To Arun: thanks for reminding me about the encyclopedia of philosophy.
Stephen Martin
Posted on Friday, 21 May, 2004 - 09:56 am:

Looking at this paradox itself, from the bookie's point of view - assuming it is a lottery situation with millions of ready bettors - isn't the bookie going to want a little over $4 from each punter?

Or have I missed the point?

http://plato.stanford.edu/entries/paradox-stpetersburg/

Stephen
George Barwood
Posted on Friday, 21 May, 2004 - 10:26 am:

My answer to the "paradox" would be that before I would play such a game, I would need to determine the credit-worthiness of the other party. How much are they worth?

This is bound to be a finite number, and that will determine how much I would pay.

Or to put it another way, the game is not possible in practice because no bank has an infinite amount of money to offer as a prize.

Here is another simpler game:

I have an infinite amount of money.
You always win the game (and all the money).
How much will you pay me?

Answer: You are lying, you don't have an infinite amount of money!
Michael McLoughlin
Posted on Friday, 21 May, 2004 - 10:47 am:

Can't we consider the expected number of throws, rather than the expected prize? If X is the number of throws, we have

E(X) = \sum_{n=1}^{\infty} n \left(\frac{1}{2}\right)^n = 2

So, if you were running the game, wouldn't you put the price for entry at around $4?
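A quick empirical check of that expectation (a hedged sketch; the simulation is mine, not Michael's):

```python
import random

def tosses_until_tail() -> int:
    """Flip a fair coin until the first tail; return how many flips it took."""
    n = 1
    while random.random() < 0.5:  # heads with probability 1/2
        n += 1
    return n

trials = 100_000
print(sum(tosses_until_tail() for _ in range(trials)) / trials)  # close to E(X) = 2
```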
Tristan Marshall
Posted on Friday, 21 May, 2004 - 02:27 pm:

There is no 'good' price for entry if you're running this game. Whatever you charge, you'll lose in the long run.

Suppose you charge £k for entry. To each person who plays, you will independently pay an amount X_i, where the X_i are random variables with P(X = 2^x) = 2^{-x}. Hence your profit after n people have played is \sum_{i=1}^{n} (k - X_i), or nk - \sum_{i=1}^{n} X_i.

Since the X_i are independent, and E(X_i) = \infty, the strong law of large numbers states that:

\frac{1}{n} \sum_{i=1}^{n} X_i \to \infty almost surely as n \to \infty.


This in turn implies that our profit tends to minus infinity with probability 1, no matter how big we make k.

The same argument applies if we are playing the game, rather than running it. If we play for long enough, we are guaranteed to eventually win as much money as we choose.

However, the strong law only tells us that we will eventually win; it doesn't tell us how long that will take. To carry on an earlier example, suppose the cost of entry is £25, and I have £50. Suppose also that I choose to keep on playing the game until I either lose all my money, or win £10,000,000. In this case it is extremely likely that I will lose all my money rather than win my target.

This is why few of us would pay very much to play this game. Yes, if you play for long enough you are guaranteed to win eventually, but only if you are able to keep on playing forever. If you only have a finite amount of money to start with, you can't keep on playing.
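A rough Monte Carlo sketch of that last scenario (my code; the £50 bankroll, £25 entry and £10,000,000 target are the numbers from the post):

```python
import random

def play_once() -> int:
    """One St. Petersburg game: first tail on flip n pays 2^n."""
    n = 1
    while random.random() < 0.5:  # heads: keep flipping
        n += 1
    return 2 ** n

def goes_broke(bankroll: int = 50, entry: int = 25, target: int = 10_000_000) -> bool:
    """Play until the bankroll can't cover the entry fee, or the target is reached."""
    while entry <= bankroll < target:
        bankroll += play_once() - entry
    return bankroll < target

trials = 10_000
broke = sum(goes_broke() for _ in range(trials))
print(f"lost the bankroll in {broke / trials:.1%} of runs")  # typically the vast majority
```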
Arun Iyer
Posted on Friday, 21 May, 2004 - 06:12 pm:

I think it's a bit different from the way the last few posts seem to interpret it. (Possibly I am misinterpreting it, in which case I would be glad to have it clarified.)

Assume that there is as much money in this world as one can win in this game. Now there are a few people who are ready to play the game (big money at stake here, you see!). Now the person who is running the game will ask only that person to play who can place the highest initial bet. Given that you are one of those who are ready to play that game, what would be your initial bet? (Also assume that the initial bet of each contestant is kept secret from the others, i.e. no one knows what the other person is placing as his initial bet.)

love arun
P.S. This is somewhat similar to the Hotel Infinity riddle, though of course the conditions there had some limitations and a definite answer.
Jeremy Zukerman
Posted on Friday, 21 May, 2004 - 09:32 pm:

From Jeremy Zukerman, original poster of the problem. I am still a bit confused. It seems part of these posts went off on a tangent a bit; I don't quite see my original question discussed. But maybe let me rephrase it. Let's say there is an initial dollar amount I must pay you to play my game. Obviously any finite dollar amount that you pick would not suffice, because the game has infinite expected winnings. However, let's say that instead of you picking that initial finite dollar amount to play my game, you choose another method of picking that dollar amount. Instead of picking a finite number, you pick a number by the following method: you flip a coin until it turns up tails. You count how many heads in a row you got, and then your value is 2^(number of heads) dollars. Then you do this again and get another number. You add the two numbers. This total is your initial dollar amount, which you want me to pay you so that I can play my game. My game is the same as your game, except that you flipped twice (two sets of flips) and took the dollar sum, while I flipped one set of flips and took the dollar sum. Who is going to be better off? Are you going to get twice as much as me?

Here is a concrete example of what I am talking about (although i just chose the numbers at random)


I do my set of flips: I happen to get 3 heads in a row. I have 2^3 = 8 dollars.

You do two sets of flips:
On the first set of flips, you happen to get 2 heads in a row giving you 2^2=4 dollars

On your second set of flips, you happen to get 3 heads in row. Giving you 2^3=8 dollars

Now you add the sum of your two sets of flips:
Your total is 4 + 8 = 12 dollars.

Your total of $12 happens to be bigger than my total winnings of $8.

The question is, how do you compare my winnings to yours? Will your total winnings on average be twice my amount, or will they be the same (given that 2 * infinity = infinity)?
Hope that explains my question better. Thanks
Graeme McRae
Posted on Friday, 21 May, 2004 - 09:54 pm:

Jeremy, it doesn't matter whether the game is played once or twice -- the expected winnings are infinite in either case.

Where I think some people went off on a tangent was the discussion of real-world practical matters. Once those are brought into the picture, the game is no longer the St. Petersburg game, but some variant of it in which the time allotted to play it is limited, or the amount of money it can pay out is limited. Either of these limitations results in finite expected winnings, and in that case playing the (modified) game twice will result in twice the expected winnings.

I go back to what I said earlier. My brain is so puny that it can't really get its neurons around the *actual* St. Petersburg game. Whenever I try to think of it, my little brain conjures up an image of a real-world implementation of this game, with all the limitations that implies. That's the reason, I think, for the apparent paradox.

--Graeme
George Barwood
Posted on Friday, 21 May, 2004 - 10:53 pm:

> I don't quite see my original question discussed.

> My question is what happens if you play the game twice compared to playing the game once?

My answer is that you cannot play the game once, let alone twice!
Graeme McRae
Posted on Friday, 21 May, 2004 - 11:23 pm:

George, I disagree with your statement, "you cannot play the game once, let alone twice".

It's true that you can't be guaranteed that a single game of St. Petersburg will end within a pre-arranged period of time. However, there is no game of St. Petersburg that will last forever. So, given enough time, it is possible to play St. Petersburg twice in succession. Or, as an alternative, there's no reason you can't play two separate games of St. Petersburg simultaneously.

Oh! Perhaps you meant that the St. Petersburg game is an idealized game, which, due to real-world constraints couldn't be played as a practical matter. If that was it, then I see your point.

--Graeme
Andrew Fisher
Posted on Saturday, 22 May, 2004 - 10:57 am:

The issue here is how to reconcile the mathematical prediction that we should be willing to pay an infinite amount with the observation that there is an upper limit on what anyone would be willing to pay. To examine the problem in a sensible way, we must consider utility functions. A utility function is a measure of how much we value money. The general assumption is that utility functions are increasing and concave (we prefer more money to less, and would choose £x for certain over a 50:50 chance of £2x or nothing).

Consider for a moment a separate game. With probability ε, I will pay you £N(ε). Is it true that for all ε > 0, there is an N(ε) for which you would be willing to pay £1 to play the game? If you think the answer to this question is yes, then it is simple to show that you should be willing to pay an infinite amount to play the St. Petersburg game. However, I think it is clear that the answer to that question is no. This suggests that there is in fact an upper bound to your utility.

The article linked above says that suggesting utility might hit a maximum is absurd. I would agree with this. However, consider the utility U(x) = K(1 - e^{-x}). This clearly has no maximum (it is strictly increasing for all x), but it is bounded. I suspect nearly everybody has a bounded utility function, and that therefore solves the paradox. We have that:

E(U(X)) = \sum_{n=1}^{\infty} K(1 - e^{-2^n}) \left(\frac{1}{2}\right)^n < K \sum_{n=1}^{\infty} \left(\frac{1}{2}\right)^n = K, and therefore we would never be willing to pay more than K.
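For anyone who wants to check the bounded-utility claim numerically, a small sketch (K = 1 is an assumed scale; truncating at 60 terms is safe because the tail of the sum is geometrically small):

```python
import math

K = 1.0  # assumed utility scale
expected_utility = sum(
    K * (1 - math.exp(-(2 ** n))) * 0.5 ** n  # utility of the 2^n prize, weighted
    for n in range(1, 60)
)
print(expected_utility)  # about 0.928, strictly below K = 1
```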

To return to Jeremy's original question, the problem is that infinity is a very hard concept. (If anybody is worried that it might take infinitely long to play the game, then it can be rephrased as simply randomly generating a real number between 0 and 1 and having the payoffs with the same probabilities as before.) Now it is true that if you play this game once and I play it twice, I will have, on average, twice as much as you. However, on average your winnings will be greater than any finite number you might think of. That is what it means for your expected winnings to be infinite.

Andrew Fisher
Posted on Saturday, 22 May, 2004 - 11:15 am:

The end of my first paragraph above is not correct. However, using relatively simple analytic techniques, we can bound E(U(X)) by something below K, and therefore find a Y such that U(Y) = E(U(X)). Therefore, we would not be willing to pay more than Y.
George Barwood
Posted on Saturday, 22 May, 2004 - 12:32 pm:

> The issue here is how to reconcile the mathematical prediction that we should be willing to pay an infinite amount with the observation that there is an upper limit on what anyone would be willing to pay.

Surely the reconciliation is that no real bank has an infinite amount of money. And the rules clearly require this. So the model does not reflect reality in an essential way, and hence is useless for prediction.

You don't need utility.
Gale Greenlee
Posted on Monday, 24 May, 2004 - 04:15 pm:

May we revisit this a little? Do I understand the game correctly? To play the game one simply places a "bet" and then flips a coin over and over until it comes up tails? So, you always "win" the amount bet raised to the power of the number of flips required? The only question is how much you win? Or, is there some way to lose? Do I have the rules right?
Gale Greenlee
Posted on Monday, 24 May, 2004 - 05:29 pm:

Scratch my last entry. I re-read the rules. I don't bet anything. I simply pay an entry fee to play the game and will win one of the prizes as follows (but my entry fee is not returned):
1 flip $2
2 flips $4
3 flips $8
etc.
etc.

So, if I pay an entry fee of $4 and get a tail on the second flip, I get $4 which is a break even?

Or, is it that if I pay an entry fee of $4 and get a tail on the second flip, I get $4 plus my entry fee of $4, for a total of $8, which is a profit of $4?

I guess my basic question is, do I get my entry fee back plus $2 to some power? Or do I just get the $2 to some power?
Arun Iyer
Posted on Monday, 24 May, 2004 - 06:00 pm:

Gale,
You just get the $2 to some power.

In response to the paradox:
==================
John and Jim were at a bar. Jim thought of a plan to get John to pay for his drink. Jim claimed that he could count the number of stars in the sky. John said that was impossible. John bet Jim a drink on this. Then Jim said, "I say that every measurement made from a large distance is highly unreliable, and I also say that any measurement inside a finitely bounded system is consistent within that system. Do you agree?" John, after some thought, said, "I think that's quite obvious." Jim's eyes sparkled; he went out, came back after half an hour, and said there are 93 stars. John bought him a drink.
======================
Of course I am supporting Graeme's post.

How much will I bet? I will bet $2. (Not much to lose.)

In response to Jeremy's question:
Your expectation won't change. As to why, I again support Graeme's comments on this.

love arun
Gale Greenlee
Posted on Monday, 24 May, 2004 - 08:05 pm:

OK, I'm taking Arun at her word. I just get $2 to some power. So if I pay an entry fee of $20 and get tails on the 1st, 2nd, 3rd or 4th flip, I'll have a loss of $18, $16, $12 or $4. If I get the tail on the 5th flip or later, I'll make a profit of $12, $44, $108, $236, etc.

In this instance, if tails happens on the 8th flip my profit is $236; if I had only paid $6 instead of $20, my profit would have been $250.

I think you should never pay more than $4. Why pay more when you can win more by paying less?

What am I missing?

GALE
Graeme McRae
Posted on Monday, 24 May, 2004 - 09:19 pm:

Gale, one of the many questions that surround the St. Petersburg game is why many people feel that they would be paying too much to enter such a game if they paid more than about $25.

George Barwood summed it up well "The issue here is how to reconcile the mathematical prediction that we should be willing to pay an [unlimited] amount with the observation that there is an upper limit on what anyone would be willing to pay."

So, Gale, would you be willing to pay, say, $100,000 to play this game?
Andrew Fisher
Posted on Monday, 24 May, 2004 - 09:25 pm:

Can I claim the credit for that quotation...
George Barwood
Posted on Monday, 24 May, 2004 - 09:39 pm:

Gale,

The idea is to ask how much would you be willing to pay, say in an auction.

Some people consider it a paradox that the "expectation" - that is how much you win "on average" - is infinite/undefined, but in practice a typical "well-informed" person would not be willing to pay all that much to play.

My explanation is that such a person would know that the prize fund is finite (say, less than 1000 billion dollars) and so the true expectation is not all that great.

For a billion dollar prize fund the average payout would be about $30 (log base 2 of the prize fund ).
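A sketch of that estimate in code (mine, not George's): capping every prize at the size of the fund collapses the expected payout to roughly log base 2 of the fund.

```python
import math

def capped_value(fund: float, terms: int = 200) -> float:
    """Expected payout when no prize can exceed the prize fund."""
    return sum(0.5 ** n * min(2 ** n, fund) for n in range(1, terms + 1))

for fund in (10 ** 6, 10 ** 9, 10 ** 12):
    print(f"fund ${fund:,}: value ${capped_value(fund):.2f},"
          f" log2 of fund = {math.log2(fund):.1f}")
# a $1,000,000,000 fund gives about $31, in line with "about $30"
```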

Most rational people would not want to pay even this much, and I would accept that utility functions can be used to give an explanation of this "second order" effect.

Mind you, many people play the lottery...

George
Graeme McRae
Posted on Tuesday, 25 May, 2004 - 05:51 am:

Andrew, yes, I see now that George was quoting you.
Gale Greenlee
Posted on Tuesday, 25 May, 2004 - 03:28 pm:

Thanks everyone. I'm not being deliberately dense, but, I really don't understand this game. I'm supposed to make a rational judgement on "how much I would pay to play the game". I need to understand the rules first so I can make this judgement. OK, if I pay $20 to play the game and get a tail on the second flip I get back 2^2=$4, the game is over and I have lost $16, right?

GALE
Graeme McRae
Posted on Tuesday, 25 May, 2004 - 03:37 pm:

That's right.
Gale Greenlee
Posted on Tuesday, 25 May, 2004 - 03:47 pm:

Thanks Graeme; then to give a logical answer to the question, "How much should one be willing to pay?", wouldn't you simply need to know the expected number of tosses required to achieve tails? I think that number is 2 (I could be wrong). Now, 2^2 = $4, so that's the absolute upper limit. So a logical person would never "bid" more than $4 to play the game. If you pay more than $4 you are going to lose money.
GALE
Graeme McRae
Posted on Tuesday, 25 May, 2004 - 03:56 pm:

The expected winnings from the game is the sum across all outcomes of the product of the probability of a given outcome and the amount you would win.

In this game, the sum has infinitely many terms (because there are infinitely many possible outcomes). The sum is

(1/2)(2) + (1/4)(4) + (1/8)(8) + ...

So the expected winnings from this game is infinite. I recommend that you follow the link given in the second message of this thread, which is a very well-written article that discusses this apparent paradox at length.
Gale Greenlee
Posted on Tuesday, 25 May, 2004 - 04:24 pm:

Graeme: That's simply not true. You did not reduce the expected winnings from the game by the price of the game. We agreed above that you do not get the fee back, only the 2^n.

GALE
James Cranch
Posted on Tuesday, 25 May, 2004 - 04:32 pm:

Yes, Gale, but I think he was assuming the price of the game was finite (and thus negligible compared to the expected winnings).
Gale Greenlee
Posted on Tuesday, 25 May, 2004 - 04:36 pm:

Yes, James, and that's the flaw in the analysis. The price is not negligible.

GALE
Gale Greenlee
Posted on Tuesday, 25 May, 2004 - 04:43 pm:

The emperor has no clothes.
Graeme McRae
Posted on Tuesday, 25 May, 2004 - 05:04 pm:

Gale,

If p is the price of entering the game, then the expected winnings are

-p + (1/2)(2) + (1/4)(4) + (1/8)(8) + ...

which is still an infinite number.

--Graeme
Gale Greenlee
Posted on Tuesday, 25 May, 2004 - 05:37 pm:

Graeme: If p is the price of entering the game, then the expected winnings are -p + 2^n, where n is the expected number of tosses required to achieve "tails". n is not infinite, therefore the expected winnings are not infinite, in fact they may not even be positive. If you can tell me what n equals I think I can tell you what the expected winnings are. Does n=2? Or is it 3?

GALE
David Loeffler
Posted on Tuesday, 25 May, 2004 - 05:51 pm:

Gale, it is not true that if X is a random variable and f is a function, the expectation of f(X) is f(expectation of X). The expectation of the number N of throws to achieve a 'tails' is 2. However, the expectation of 2^N is infinite.
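To make the distinction concrete, a tiny numerical sketch (my illustration): the partial sums of E(2^N) keep growing, while 2^E(N) stays at 4.

```python
e_n = sum(n * 0.5 ** n for n in range(1, 200))  # E(N) = 2
print(2 ** e_n)                                 # 2^E(N) = 4.0 (up to float precision)

for cutoff in (10, 20, 40):
    partial = sum(2 ** n * 0.5 ** n for n in range(1, cutoff + 1))
    print(cutoff, partial)  # each extra term adds 1, so E(2^N) grows without bound
```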
Gale Greenlee
Posted on Tuesday, 25 May, 2004 - 05:57 pm:

David, I'll take your word for the statement, "it is not true that if X is a random variable and f is a function, the expectation of f(X) is f(expectation of X)". But that's not the point. If n is 2, as you say, 2^n is 4, not infinity.
GALE
David Loeffler
Posted on Tuesday, 25 May, 2004 - 06:03 pm:

Yes, if n is constant and equal to 2. But we are dealing with the expected payoff of the game; and the payoff is 2^N, so the expected payoff is E(2^N). What I am saying is that this is not 2^{E(N)}. The first is infinite, the second is 4. So the expected payoff is infinite.

David
David Loeffler
Posted on Tuesday, 25 May, 2004 - 06:09 pm:

Just to illustrate that point: suppose that I propose another game. You give me 1 cent to enter. If 'tails' comes up on the first or second throw, then you win $1. If it takes longer than that to come up, you have to give me $1,000,000,000.

Then your expected gain is -0.01 + (3/4)(1) + (1/4)(-1,000,000,000) = -$249,999,999.26, which makes it rather clear that you wouldn't want to take me up on this.

The fact that the expected number of throws taken to come up tails is 2 is a red herring; it doesn't mean that your expected gain should be $1.

David
Gale Greenlee
Posted on Tuesday, 25 May, 2004 - 06:18 pm:

No, the payoff is the variable, and is anything from $2 - p to infinity. The expected payoff is $4 - p.

You guys are the mathematicians, so it's foolish for me to quarrel. I'll simply stand by my statement that a reasonable person would not pay more than $4 to play the game. For if he does, it's more likely that he'll lose than win.

GALE
Andrew Fisher
Posted on Tuesday, 25 May, 2004 - 08:21 pm:

Gale,

Let me first explain that all of us posting in blue are trying to help you understand. We get no reward for doing this other than hopefully helping some people to improve their mathematical skills.

Those of us posting in blue are acting as teachers. Therefore, you should try to understand what we tell you rather than simply disagreeing with it. It is true that teachers are sometimes wrong, but that is not the case here. Everything we have said can be proven. That means it is certainly true.

Now we want to try and help you understand, but it is not clear to me where you are having difficulty. Do you understand the notion of expected value? David has explained above that E(2^N) is not the same as 2^{E(N)}. It is true that if these were equal then you would be right that nobody should pay more than $4. However, they are not. Also, you have said "the payoff" is the variable. I'm afraid this is also untrue; you must have misunderstood the game. Again, we can try and help you, but please don't just tell us we are wrong. I can assure you we are not: we can prove that we are right.

If you explain where in our reasoning you fail to understand then we can try and help. Otherwise, there is little we can do to help.

James Cranch
Posted on Tuesday, 25 May, 2004 - 08:24 pm:

Gale: The problem is that "expected" is a technical term in mathematics. It doesn't just mean what you hope it means.

In particular, it is perfectly possible that the expected value of a game can, and frequently does, differ from the value that you're "more likely to win than not", to paraphrase what you said.

Ultimately, I don't see what you can gain from this debate if you can't use the words we use with the same meaning as when we use them.
Andre Rzym
Posted on Tuesday, 25 May, 2004 - 09:15 pm:

Hello Gale,

I think there are a couple of issues that are muddying the water here. If you are interested, maybe we could digress onto a simpler game:

Suppose I roll a dice (the usual unbiased six-sided variety). If the outcome is x (i.e. in 1..6), I pay you $x. So I pay you $4 if I roll a 4, etc.

A couple of questions:
a) x is in the range 1..6. What is the average value of x?
b) what would you pay me upfront (which I keep) to play such a game?

Now consider a variant of the game. I pay you the square of the dice outcome. So, for example, I pay you $16 if I roll a 4. Again a couple of questions:
a) x is still in the range 1..6. What is the average value of x?
b) what would you pay me upfront (which I keep) to play such a game?

Andre
Gale Greenlee
Posted on Wednesday, 26 May, 2004 - 04:11 pm:

Thanks for the posting Andre. I guess my answers would be:
a) $3.5
b) $1.5

a)$15 + 1/6$
b) $10.5

Thanks for helping me out here.

GALE
Andre Rzym
Posted on Wednesday, 26 May, 2004 - 05:51 pm:

The first (a) is correct (although it's just a number, not a USD amount). I assumed you computed
1*1/6 + 2*1/6 + 3*1/6 + 4*1/6 + 5*1/6 + 6*1/6

For the first (b), how did you arrive at $1.5? Statistically, you will receive $1 with probability 1/6, $2 with probability 1/6, etc. Therefore, probabilistically, you will receive $1*1/6 + $2*1/6 + $3*1/6 + $4*1/6 + $5*1/6 + $6*1/6 = $3.5. Wouldn't you therefore pay $3.5 to play?

Andre
Gale Greenlee
Posted on Wednesday, 26 May, 2004 - 06:03 pm:

Andre: Not much logic in my $1.50 answer. I just noted that at a $3.50 buy-in, there are 3 amounts I could win or lose: 2.5, 1.5, or 0.5, which average 1.5.

For the squares, and with a $15 1/6 buy-in, I could win 5/6, 9 5/6, or 20 5/6, for an average of 10.5. The possible losses are 14 1/6, 11 1/6, or 6 1/6, which also average 10.5.

I guess $3.50 and $15 1/6 would be more logical.
Andre Rzym
Posted on Wednesday, 26 May, 2004 - 08:48 pm:

$3.5 and $15 1/6 sound good to me.

As further justification for the first result ($3.5), you may note that 1 and 6 come up equally likely, with payouts $1 and $6, an average of $3.5. The same average payout holds for 2 and 5, and similarly for 3 and 4.

Another way of justifying both results is to imagine 6000 rolls. Each number comes up equally often (1000 times) on average so you would expect the payout to you to be (for the first game)

1000*$1+1000*$2+...+1000*$6=$21,000

That's what you would expect to earn on 6000 rolls, so per roll you would expect to earn $21,000/6000 = $3.5.

The same computation for the second would be

1000*$1 + 1000*$4 + ... + 1000*$36 = $91,000, and $91,000/6000 = $15 1/6.

We have just looked at two games here, but there are clearly many more I could propose (e.g. I could cube the number on the dice; check that the game is now worth $73.5). The computations would be the same. I would like to propose the following 'recipe' for computing the value of the game:

(i) List all the possible outcomes (e.g. a dice can take the values 1..6).
(ii) For each outcome in (i), write down the probability of the outcome.
(iii) For each outcome in (i), write down the payoff (i.e. what I pay you)
(iv) For each outcome in (i), multiply the probability in (ii) by the payoff in (iii) and add them all together.

The result (i.e. the sum) is the value of the game.

Do you agree this sounds correct, and is consistent with the 3 games we proposed above (worth $3.5, $15 1/6, $73.5)?
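Here is the recipe transcribed into code (my sketch; the helper name game_value is mine), checked against the three dice games:

```python
from fractions import Fraction

def game_value(outcomes):
    """Steps (i)-(iv): outcomes is a list of (probability, payoff) pairs."""
    assert sum(p for p, _ in outcomes) == 1  # the outcomes must cover every case
    return sum(p * payoff for p, payoff in outcomes)

die = [(Fraction(1, 6), x) for x in range(1, 7)]
print(game_value(die))                            # 7/2,   i.e. $3.50
print(game_value([(p, x ** 2) for p, x in die]))  # 91/6,  i.e. $15 1/6
print(game_value([(p, x ** 3) for p, x in die]))  # 147/2, i.e. $73.50
```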

Andre
Gale Greenlee
Posted on Thursday, 27 May, 2004 - 04:15 pm:

Andre: I agree that it sounds correct, but I must be cautious. I am convinced this probability business is a slippery slope, and we have to be particularly careful when considering an infinite number of possible outcomes. We cannot do what you suggest: we can't list the possible outcomes in the St Petersburg puzzle. We can think about them and devise symbols to represent them, but we can't list them. You are aware of the many probability puzzles which involve infinity; they get really messy. For example, the random chord puzzle has at least three different solutions. I digress.

I think the problem lies in your step (iv), where we add them all together. Let me give my little analysis and then discuss it some more.
Here goes.

Let's play the $2^n payoff game and pay $8 to play it. We are either going to win money or not win money. The probability of winning plus the probability of not winning must equal 1.
First, the ways to not win (three ways). The probability of losing $6 is 1/2. The probability of losing $4 is 1/4. The probability of a push is 1/8. Add them up: 1/2 + 1/4 + 1/8 = 7/8. So the probability of winning is 1 - 7/8 = 1/8. Let's list the outcomes which make up the 1/8.

p         $      extension
1/16      16     $1
1/32      32     $1
1/64      64     $1
1/128     128    $1
etc.      etc.   etc.
sum 1/8   inf.   infinity

What is the value of this unending list of $1 possibilities? They are mutually exclusive events. Since only one of them can happen, I believe the value is $1. I know we'll part company right here, because you will say the value is infinite.

Assume for a moment that I'm correct. You can at least imagine such a thing. If I pay $8 for a 1/8 chance to win a prize that's only worth $1, I paid too much, because the ticket is only worth 12.5 cents. How about paying $4? That's too much also. A 1/4 chance to win a prize that's only worth $1 is worth 25 cents. We are getting closer. Another chart:

$8 ..... 12.5 cents
$4 ..... 25 cents
$2 ..... 50 cents
$1 ..... $1

There we are. It's $1. But that's not gambling; we were guaranteed a prize of at least $2, so let's round it off at $4.

What a mess!
James Cranch
Posted on Thursday, 27 May, 2004 - 04:35 pm:

Modern probability theory was invented to deal with questions where the number of different possible outcomes is infinite. It needn't be messy at all.

Furthermore, I suggest that your example of the random chord puzzle is spurious. The reason one can get three different solutions is that one can choose chords randomly in several different ways.

In this situation, the method for choosing the outcome of the game is forced: the coin tosses happen independently at random with probability 1/2 each way, and that's a complete and unique description of the probabilistic input.
James Cranch
Posted on Thursday, 27 May, 2004 - 04:38 pm:

Furthermore, real life probability questions happen everywhere: for example, there are infinitely many places a dart can land on a dartboard.

Again, we have an advanced and rigorous probability theory to deal with this.
Gale Greenlee
Posted on Thursday, 27 May, 2004 - 04:40 pm:

More from Gale: notice that if we pay $8 to play, we paid $3 too much by this analysis.

Non-wins:   -$6  w/p 1/2 = -$3
            -$4  w/p 1/4 = -$1
             $0  w/p 1/8 =  $0
Total:                     -$4

Wins:        value          $1

How much too much?          $3

Correct price:     $8 - $3 = $5

But no one would pay amounts other than the powers of two ($2, $4, $8, $16, etc.). You get just as much value paying $4 as paying $5, so the correct price is not $5, it's $4.

GALE
Andre Rzym
Posted on Thursday, 27 May, 2004 - 04:43 pm:

Clearly you have an eye to where I am leading you!

But for the time being, could we just stick to finite outcomes? Under those circumstances would you agree that the 'recipe' that I presented makes sense?

(Incidentally I will be away for a few days soon, so in the meantime anyone please feel free to chip in or I'll continue on my return).

Andre
Graeme McRae
Posted on Thursday, 27 May, 2004 - 04:54 pm:

Imagine a new game, called the London game, which is just like the St. Petersburg game, except for one thing: before the game starts, there is an initial coin flip. If the coin comes up tails, you get $2, and you don't get to play any more. But if the coin comes up heads, then you play the St. Petersburg game, except all the payouts are doubled.

The London game seems a bit better than the St. Petersburg game, doesn't it? After all, you have a 50% chance of winning $2, and a 50% chance of winning twice as much as whatever you would win in the St. Petersburg game.

The expected winnings of the London game is clearly $1 more than the expected winnings of the St. Petersburg game, because if the initial coin-toss comes up tails (a 50% probability) the payoff is $2, so the expected value of this leg of the probability tree is $1; and if the initial coin-toss comes up heads (a 50% probability) then the payoff is twice that of the St. Petersburg game, so the expected value of this leg of the probability tree is equal to that of the St. Petersburg game.
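In symbols (my notation, not Graeme's: S for the St. Petersburg payout, L for the London payout), the probability-tree argument reads:

```latex
E[L] = \underbrace{\tfrac{1}{2} \cdot 2}_{\text{initial tail}}
     + \underbrace{\tfrac{1}{2} \cdot E[2S]}_{\text{initial head}}
     = 1 + E[S]
```

So, formally, the London game's expectation exceeds the St. Petersburg game's by $1, even though both are infinite.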
Gale Greenlee
Posted on Thursday, 27 May, 2004 - 05:08 pm:

Yes Andre it makes very good sense. And thank you for your responses. It's great fun to think about the infinity questions. To James: I agree, but you simply need to be careful that you are using an advanced and rigorous approach. Consider the dart board you mentioned. There are an infinite number of points on the board, but we all know they are not equally likely to be hit by a dart. Points are more likely to be hit if they are near the bull. Likewise consider covering a dartboard with a checkered cloth constructed as follows: 3x3 inch white squares each with a 2x2 inch green square centered within. The portion of the board which is green may or may not be 4/9. My point is simply that you must be careful.
gale
Gale Greenlee
Posted on Thursday, 27 May, 2004 - 05:23 pm:

I like Graeme's London game even better. How about my Paris game? We play it just like the St Petersburg game, except we limit the number of possible tosses to 16. Here is how we impose the limit: we have 16 coins, and one of them is tails on both sides. We mix the coins and use a new coin at random (without examining it) for each flip. So we can't possibly have a run of 16 heads.
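A quick simulation of the Paris game (my sketch; I read "a new coin at random for each flip" as drawing the 16 coins without replacement). Conditioning on where the double-tailed coin sits suggests the game is worth about $9.50, and the simulated average should land close to that:

```python
import random

def paris_game() -> int:
    """16 coins, one tails-on-both-sides, each used at most once."""
    coins = ["fair"] * 15 + ["double_tail"]
    random.shuffle(coins)  # assumed: coins drawn without replacement
    for flip, coin in enumerate(coins, start=1):
        if coin == "double_tail" or random.random() < 0.5:
            return 2 ** flip  # first tail came up on this flip
    raise AssertionError("unreachable: the double-tailed coin forces a tail")

trials = 200_000
print(sum(paris_game() for _ in range(trials)) / trials)  # about 9.5
```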

GALE
James Cranch
Posted on Thursday, 27 May, 2004 - 05:41 pm:

I'm always careful, I'm a pure mathematician.

As it happens, when I play darts, the points near the bull are less likely to be hit than the outer ones...
Graeme McRae
Posted on Thursday, 27 May, 2004 - 06:50 pm:

Gale, I'm glad you like my London game. Do you agree that, over the long haul, the London game pays about $1 more than the St. Petersburg game?
Gale Greenlee
Posted on Thursday, 27 May, 2004 - 07:31 pm:

Graeme: Yes I do agree basically because of the infinite series of winners being at $2 instead of $1.

GALE
Graeme McRae
Posted on Thursday, 27 May, 2004 - 08:48 pm:

Gale, since you have agreed that the expected payout of the London game is greater than that of the St. Petersburg game, I should tell you that the London game is identical to the St. Petersburg game; the rules are simply phrased differently.

--Graeme
Peter Conlon
Posted on Thursday, 27 May, 2004 - 11:38 pm:

Jeremy, if you're still following this conversation, I return to your original question in this post.

Might as well add my thoughts to the confusion.

I *would* pay £4 to play the St Petersburg game.
I *wouldn't* pay £4000 to play a variant of the St Petersburg game (the Leningrad game) where the prizes are 4000*2^n. The reason for this is to do with how much I'm prepared to lose, etc., and this reasoning is different for everyone. (To do with utility?) Mathematically speaking, there isn't a difference between the £4 entry St Petersburg game and the £4000 entry Leningrad game, so I don't think we can hope to explain that aspect of my decision mathematically.

I just said I wouldn't play a £4000 entry Leningrad game, even though the expected (in the mathematical sense) return is infinite (and 4000 times larger than the St Petersburg game).

Actually, I would play the £4000 entry Leningrad game, but only if they let me play more than once. I would happily take out a £2 million loan and spend it all on the Leningrad game, confident in the knowledge that I would very probably come back with enough money to pay back the £2 million, pay all the interest on it which I would owe for the 10 hours or so that I had borrowed the money, and have quite a bit left over for my own enjoyment.

To avoid muddying the waters (even more), I'll return to St. Petersburg.

My decision about how much I'd be prepared to pay to play the game depends on how many times I'm allowed to play. Since the expected payback is infinite (no dispute), it is pretty useless in helping me make up my mind, so we need to formulate some other means for a rational person to use in choosing an entry price.

My decision is mostly affected by what I think is most likely going to happen.

If I'm allowed to play only once, I wouldn't pay more than about £4. Half the time I'll lose £2, a quarter of the time I'll break even, and the rest of the time I'll win. (Unfortunately the figure of £4 is just plucked from the air. I suppose 2^E(n) = 4 is the best justification.)

However, as the number of games I intend to play increases, so does the amount I'm willing to pay per game. This is because I can work out what's most likely to happen:
[Attachment: St. Petersburg paradox.pdf (187 k)]

In the .pdf file I've got Excel to plot the probability distribution of [the average payback per game when playing n games].
The sample size for each plot is 1000, and the results have been grouped in integer bin-widths.

The .pdf file has 9 distributions in it, corresponding to n = 1, 2, 4, 8, 100, 500, 2000, 10000, 40000.

That may sound a little confusing, so I'll say it again:
1. I play n (£0-entry) games of the St. Petersburg game. After playing n games, I look at my total earnings and divide by n. Call this X.
2. Repeat step 1 1000 times, and plot the distribution of the Xs obtained.

I've only bothered plotting up to 50, and these are empirical results, not calculated.
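For anyone without the attachment, here is a sketch that reproduces the experiment (my code, with smaller n than the .pdf so it runs quickly):

```python
import random
from collections import Counter

def st_petersburg() -> int:
    """First tail on flip n pays 2^n."""
    n = 1
    while random.random() < 0.5:
        n += 1
    return 2 ** n

def average_payback_histogram(n_games: int, reps: int = 1000) -> Counter:
    """Distribution of (total winnings / n_games), binned to integers."""
    averages = (sum(st_petersburg() for _ in range(n_games)) / n_games
                for _ in range(reps))
    return Counter(int(a) for a in averages)

for n in (1, 4, 100):
    mode, count = average_payback_histogram(n).most_common(1)[0]
    print(f"n={n}: modal average payback around £{mode} ({count}/1000 runs)")
```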

Things worth noting:
The expected (in the mathematical sense) payback per game is always infinite: not obvious from the distributions, but true nonetheless.
The expected (in the english sense) payback per game varies with how many games you play.

A proper mathematical translation of the english version of "expected" is a bit tricky, so I'll just use the mode for now, i.e. the most likely outcome.

The mode varies with the number of games you play.
If I play four games (n=4), the most likely outcome is a total payback in the vicinity of £12, or £3 per game. Therefore I wouldn't pay £4 per game to play 4 games, because the most likely single outcome is for me to lose about £1 per game. Of course, I might risk paying £4 per game nonetheless, on the chance that I'll strike it lucky and make quite a bit of money.
For n=100
I would very happily pay £5 per game, but I'd be most disgruntled to pay £13/game.
For n=40000, I would very happily pay £13 per game.

Using the mode is of course very simplistic. (Since it is often more likely that something other than the mode will happen)
Using the mean is crazy (infinity)
Using the median might be okay, but it'll have its own problems.

If anyone can formulate the distributions as a function of n it'd be great, but I think it's too hard.
--------
So before I get attacked:
I know the expected value is infinite. I am trying to resolve the paradox. A rational person would not pay an arbitrarily large amount of money to play the St. Petersburg game.
--------

Jeremy's original question:
What happens if you play the game twice, will you win twice as much?

To answer this by appealing to an infinite expected value is inadequate. If I play the game once, I do not get infinite cash.

The distributions for n=1 and n=2 are in my .pdf, but they're not particularly helpful.
Look at n=10000 and n=40000
If I play the game 10000 times, the most likely single outcome is around £13/game return, so I would get £130,000 in this case.
If I play the game 40000 times, i.e. four times more, the most likely single outcome is around £16/game return, so I would get £640,000 in this case (i.e. more than 4 times what I'm likely to get playing the game 10000 times).

So my answer to Jeremy's question is that playing the game twice is likely to earn you more than double what you would earn playing the game once.
(I know loads of people will disagree, since the expectation is infinite.)
Definitely my longest post.

----
As an aside, the transformation of the geometric n=1 distribution to a Gaussian-looking distribution for large n is kind of visible. The central limit theorem still seems to work. As n gets incredibly large, the centre of the bell shape seen in the n=40000 distribution will head off to infinity.

Err.
Just re-read all of that. It's a bit long.
Ah well, might as well post it now I've written it.

Peter

Feel free to argue.
Gale Greenlee
Posted on Friday, 28 May, 2004 - 03:07 pm:

Back to the math involved. John Maynard Keynes (of Cambridge) wrote an article titled "The Application of Probability to Conduct". In this article he deals with the "Petersburg Paradox", but states the problem in a slightly different way. The payout is in shillings: 1 for heads on the first toss, and 2^(n-1) for heads first appearing on the nth toss. So the first prize is 1 instead of 2 as we've been considering. Otherwise it seems the same, except we win with the first head instead of the first tail. I wonder if the games are identical? I think not. I've thought all along the correct buy-in for the game we've been discussing is 4, so it seems for the version Keynes mentions the value should be 2 shillings.

Here is a quote from the article. "In the present case, even if we are able to range goods in order of magnitude, yet it does not follow that we can range the products composed of each good and its corresponding probability in this order."

This quotation challenges Andres recipe which I will repeat here;
"(i) List all the possible outcomes (e.g. a dice can take the values 1..6).
(ii) For each outcome in (i), write down the probability of the outcome.
(iii) For each outcome in (i), write down the payoff (i.e. what I pay you)
(iv) For each outcome in (i), multiply the probability in (ii) by the payoff in (iii) and add them all together.

The result (i.e. the sum) is the value of the game."

GALE
Gale Greenlee
Posted on Tuesday, 01 June, 2004 - 03:18 pm:

If anyone cares, I set up a worksheet to compute the value of the game, assuming various entry fees and carrying the worksheet out to various ending points. Hope that makes sense.

If you pay $50 and carry the table out to 50 coin tosses, you get a value of $0. If you carry it out to 35 coin tosses, the value is $15.

When I say value, I mean net value after deducting the entry fee at every toss.


If you pay $13 and carry the table out to 13 coin tosses, you get a value of $0. If you carry it out to 12 coin tosses, the value is $1.

Kind of an interesting pattern.

This approach seems realistic to me if you tend to think that very long runs of straight heads not only occur very seldom but, for practical purposes, do not occur at all.

GALE

Graeme McRae
Posted on Tuesday, 01 June, 2004 - 03:48 pm:

Whenever "practical purposes" creep into the St. Petersburg game, then it ceases to be the St. Petersburg game.
Tristan Marshall
Posted on Tuesday, 01 June, 2004 - 05:00 pm:

Computer simulations aren't that useful in this case. The expected value of the game is infinite because there are very small probabilities of very big wins. However, the probabilities are so small that you are unlikely to see this effect in any finite computer simulation.

On to what Keynes said: you should be aware that Keynes was an economist, not a probabilist, so he might be using slightly different terminology from us.

Firstly, Andre's recipe defines how to find a quantity called the 'Expectation' (note capitals!) of the winnings in this game. This on its own doesn't tell us what a 'fair entry price' for the game should be, but it does tell us about our long-term winnings if we keep playing for 'long enough' (often, 'long enough' is actually 'infinitely long').

I can't quite make out the precise meaning of the quote you listed, since it's rather short and out of context, but to me it doesn't seem to contradict Andre's 'recipe' at all. In any case, Andre's recipe is actually a definition, so you can't really 'contradict' it!

It seems to me that he's talking about something along the lines of the 'maximal utility' approach that Andrew was talking about further up the thread. He seems to be making the point that two people can disagree on what they would be prepared to pay in order to play the St Petersburg game.
Gale Greenlee
Posted on Tuesday, 01 June, 2004 - 05:58 pm:

I was only using the computer simulation to gain some insight, if possible. I think there are some insights there. One is that the expected value of the game is infinite only if your entry fee is infinitely large. The "net" value seems to be zero if you bet an infinite amount and potentially flip the coin an infinite number of times. The net value ends up being the potential number of times we may flip minus the entry fee. So since no gambler could have an infinite amount of money to pay as an entry fee, the expected value will always be infinity minus something. Is that still infinite?
GALE
Vicky Neale
Posted on Tuesday, 01 June, 2004 - 08:28 pm:

I haven't read the situation in detail, so can't comment on the probability aspect, but in answer to your last question, that depends! Strange things happen with infinities... If you're taking away a finite amount, then yes, 'infinity minus it' is infinite (I've put it in inverted commas because it's not really something I should say); imagine you have an infinite number of pineapples. Then if I take away any finite number of them, you're still going to have infinitely many left. Perhaps a better way of thinking about it would be to assume that 'infinity minus something' is finite. Then I could put back finitely many pineapples (the same number that I took away), and get to something that is both finite and infinite. This is clearly nonsense, so I must have had infinitely many to start with.

But you do have to be careful, and you certainly shouldn't assume things about infinity, because very odd things can happen (like there being different sizes of infinity)!

Vicky
Andre Rzym
Posted on Wednesday, 02 June, 2004 - 09:06 am:

Gale,

The discussion has moved on a bit since I last posted, but for the sake of completeness, I'll post my final observations, which is where I was taking our debate:

Firstly, you have agreed with my recipe for computing the value of a game. This procedure, as Tristan points out, is the computation of the expectation of the winnings ('expectation' being used here in the defined mathematical sense). If I understand you correctly, you are cautious about applying the formula where there is an infinite number of outcomes.

So let's now think about a game where I flip a coin a maximum of 6 times. You get paid 2^n, where n is the flip on which you first get a tail. But either way, the game stops after 6 tosses, whereupon you get paid 2^6. What is the fair price of entering such a game (assumed to be equal to the statistical average payout)?

Well, the payouts for tails on the 1st, ..., 6th flip are 2, 4, 8, ..., 64. The associated probabilities are 1/2, 1/4, 1/8, 1/16, 1/32, 1/32. Note that the last probability is 1/32, not 1/64: that's because we terminate the game after 6 flips, come what may.

Following our recipe, we get a value of the game of

1/2*2 + 1/4*4 + .. + 1/32*64 = 7

Now I agree that this is not the same as the original game. This one terminates. But if you think about it, this game will never pay out more than the original game for a given coin-flip sequence. Therefore this game must be worth less than the original game. Therefore the original game must be worth more than $7.

If I modify the game to extend out to, say, 10 rather than 6 coinflips maximum, the worth of the game is increased, but is still less than the original game. Indeed, if you pick a number (say 1,000) I can alter my game so that it is worth more than $1,000, yet is still worth less than the original game. Therefore the original game must be worth more than any number you can write down. And that is why we say that the expected payout of the game is infinite.
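The same computation as code (my sketch of the capped game, for a maximum of m flips):

```python
def capped_game_value(m: int) -> float:
    """Tails on flip n < m pays 2^n; reaching flip m pays 2^m, come what may."""
    value = sum(0.5 ** n * 2 ** n for n in range(1, m))  # $1 per possible early tail
    value += 0.5 ** (m - 1) * 2 ** m                     # the flip-m outcome adds $2
    return value

print([capped_game_value(m) for m in (6, 10, 1000)])  # [7.0, 11.0, 1001.0]
```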

Andre
Gale Greenlee
Posted on Wednesday, 02 June, 2004 - 03:58 pm:

Andre: Thanks so much for the posting. I don't understand your figures. It seems to me your numbers don't follow the rules of your game. I'm going to copy the rules here for reference.

"I flip a coin a maximum of 6 times. You get paid 2^n where n is the point at which you first flip a tail. But either way, the game stops after 6 tosses, whereupon you get paid 2^6."

What is the fair price of entering such a game (assumed to be equal to the statistical average payout)?

Well the payouts for tails on the 1st, ..., 6th flip are 2, 4, 8, ..., 64. The associated probabilities are 1/2, 1/4, 1/8, 1/16, 1/32, 1. Note that the last probability is 1, not 1/64: that's because we terminate the game after 6 flips, come what may.

Following our recipe, we get a value of the game of

(1/2*2 + 1/4*4 + .. + 1/32*32)+ (64) = 69

Not 7.

I would like to continue with this discussion. I have some more things to consider but want to give you time to disagree with the 69, if you do disagree.

GALE
Philip Ellison
Posted on Wednesday, 02 June, 2004 - 04:09 pm:

Gale, probabilities must sum to one. The probability of finishing on one of the first five throws is (1/2 + 1/4 + 1/8 + 1/16 + 1/32) = 31/32. Therefore the probability of the game ending on the sixth throw is the probability that the game didn't end on any of the previous five throws, namely 1 - 31/32 = 1/32.
Gale Greenlee
Posted on Wednesday, 02 June, 2004 - 04:33 pm:

Philip and Andre: Yes of course. Sorry for the error. Andre is right. I follow now.

But I would still like to argue a point. Maybe it's just semantics. If we play the modified game, pay an entry fee of $7, and compute the value of the NET PAYOUT, we get a value of zero at the 6th toss, as we would expect if the value is $7. No argument so far.

If we play the original game, and again pay an entry fee of $7, we get a value at the 6th toss of negative 89 cents, rounded. So comparing $0 to -$0.89, it seems the original game is less valuable than the modified game, not more valuable.

Perhaps this is the paradox.
gale
Gale Greenlee
Posted on Wednesday, 02 June, 2004 - 04:56 pm:

Here is my -$0.89 vs. zero. First, the original game with a $7 entry fee (each row is that toss's contribution to the value, then the running total):

-2.50 ........ -2.50
-0.75 ........ -3.25
 0.125 ....... -3.125
 0.5625 ...... -2.5625
 0.78125 ..... -1.78125
 0.890625 .... -0.890625

And the modified game with a $7 entry fee (identical through toss 5; the 6th toss pays $64 with probability 1/32, which brings the total to zero):

-2.50 ........ -2.50
-0.75 ........ -3.25
 0.125 ....... -3.125
 0.5625 ...... -2.5625
 0.78125 ..... -1.78125
 1.78125 .....  0.00