Shu Cao
Posted on Sunday, 31 August, 2003 - 09:55 pm:

P and Q both inform me in the afternoon that there is a body in my fridge. The probability that P tells the truth is p, that Q tells the truth is q. These are independent. I haven't looked in the fridge for some time, so if you had asked me in the morning, I would have said the probability of there being a body in the fridge was 1/2. In view of P and Q's information, I will have to revise my estimate. Explain why my new estimate should be (pq)/(1-p-q+2pq).
This is the first part of Question 13 in Advanced Problems in Mathematics.
I thought of a shortcut for the given solution:
Let A = both P and Q are telling the truth;
B = P and Q express the same opinion. Therefore P(there is a body in the fridge) = P(A|B) = (pq)/[(1-p)(1-q)+pq] = (pq)/(1-p-q+2pq).
I am not sure whether it is correct. Please tell me! Thanksx
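The algebra in the denominator can be sanity-checked numerically: (1-p)(1-q)+pq expands to 1-p-q+2pq. A quick sketch (the function names are mine):

```python
# Verify (1-p)(1-q) + pq == 1 - p - q + 2pq for a few sample values.
def denom_factored(p, q):
    return (1 - p) * (1 - q) + p * q

def denom_expanded(p, q):
    return 1 - p - q + 2 * p * q

for p, q in [(0.3, 0.9), (0.5, 0.5), (0.75, 0.2)]:
    assert abs(denom_factored(p, q) - denom_expanded(p, q)) < 1e-12
```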
Chris
Posted on Sunday, 31 August, 2003 - 10:04 pm:

I think this is fine, since you have p(truth)/(p(truth)+p(lie)).
Shu Cao
Posted on Sunday, 31 August, 2003 - 10:23 pm:

Thanks
Andre Rzym
Posted on Monday, 01 September, 2003 - 07:56 pm:

I don't think I like this proof, although it may just be because I don't understand it.

You say we want p(A|B): is it obvious why P(A) should be conditioned on P&Q expressing the same opinion (i.e. B)? In fact, it is conditioned on P&Q expressing the same opinion, and on that opinion being that there was a body, and on the prior probability of there being a body being 1/2. Indeed, how are you making use of the prior probability of 1/2?

Lastly, I don't see how you go from P(A|B) to (pq)/[(1-p)(1-q)+pq], but again it could well be that I am misunderstanding what you are doing.

Andre
Chris
Posted on Monday, 01 September, 2003 - 08:47 pm:

Andre: P(A|B) = P(A and B)/P(B) = pq/(pq+(1-p)(1-q)), since pq is the probability that they both tell the truth and express the same opinion, and P(B) = P(both telling the truth) + P(both lying).

Or maybe I'm being stupid.
Andre Rzym
Posted on Monday, 01 September, 2003 - 09:01 pm:

OK, I'm probably being slow here, but where is the dependence upon the prior probability? In other words, if the prior was 1/3 rather than 1/2, what would change in your formula?

Andre
Chris
Posted on Monday, 01 September, 2003 - 09:03 pm:

Nothing at all, which is kind of peculiar. But if you think about it, the original probability has no bearing: his re-estimate is based entirely on what his friends tell him and the probabilities that they are telling the truth, which of course isn't influenced by his earlier belief.
Andre Rzym
Posted on Monday, 01 September, 2003 - 09:28 pm:

Why should you discard one piece of information just because you have been given a second? Following that argument, suppose P makes a statement after Q. Why don't you discard Q's statement as well as the prior probability?

Looked at another way, suppose your prior was that there was 0% probability of a body, and 100% probability of no body. Intuitively, are you comfortable discarding this surety and using the opinions of a pair of liars instead?

The correct way to treat this problem is to incrementally add information (you can probably do it in a tree, I just like using symbols):

1) Compute pr('body in fridge'|'prior probability')
2) Compute pr('body in fridge'|'prior probability','Q says there is a body')
3) Compute pr('body in fridge'|'prior probability','Q says there is a body','P says there is a body')

The computation of (2) depends upon (1) and the computation of (3) depends upon (2).

I'll fill in the details if you are interested.
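In the meantime, the three steps can be sketched in Python (a sketch, assuming the standard two-hypothesis Bayes update; the function name is mine). Feeding each posterior back in as the next prior reproduces the closed form when the prior is 1/2:

```python
def update(prior, p_truth):
    """Posterior P(body) after one friend, who tells the truth with
    probability p_truth, says there is a body in the fridge."""
    num = prior * p_truth                    # body AND friend truthful
    den = num + (1 - prior) * (1 - p_truth)  # ... plus: no body AND friend lying
    return num / den

p, q, prior = 0.7, 0.8, 0.5   # arbitrary sample values
step2 = update(prior, q)      # after Q's statement
step3 = update(step2, p)      # after P's statement as well
closed_form = p * q / (1 - p - q + 2 * p * q)
assert abs(step3 - closed_form) < 1e-12
```

Note that the incremental updates and the one-shot conditioning agree; the prior of 1/2 enters at step (1).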

Andre
Chris
Posted on Monday, 01 September, 2003 - 09:43 pm:

I agree, the final estimate should be a combination of all 3 persons' beliefs and the likelihood of each person telling the truth. However, is it just coincidence that Shu's proof works?
Michael Doré
Posted on Monday, 01 September, 2003 - 11:02 pm:

I agree with Andre. The answer must depend on the prior probability. Imagine that before you heard from your friends you thought the chance of there being a body in the fridge was, say, 1 in a billion. Then your friends (who tell the truth with say probability 1/2) say there is a body. What would you believe? Well obviously you would think it is incredibly unlikely there is a body in the fridge - after all a body being in the fridge is an unbelievable 1 in a billion occurrence, whereas your friends both lying isn't a strange occurrence at all (considering it happens with probability 1/4). So the chance of there being a body should be very small.

But the formula gives pq/(pq+(1-p)(1-q)) with p = q = 1/2, so the answer is 1/2 according to the formula. Therefore the formula cannot possibly hold in this case.
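This can be checked directly with the full Bayes computation for a general prior r (a sketch; the function name is mine). With p = q = 1/2 the two statements carry no information at all, so the posterior simply equals the prior, however small:

```python
def posterior(prior, p, q):
    """P(body | both P and Q say there is a body), for a general prior."""
    num = prior * p * q                          # body, both truthful
    den = num + (1 - prior) * (1 - p) * (1 - q)  # no body, both lying
    return num / den

assert abs(posterior(0.5, 0.5, 0.5) - 0.5) < 1e-12  # prior 1/2: formula's 1/2
assert posterior(1e-9, 0.5, 0.5) < 1e-8             # tiny prior: tiny posterior
```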

However it is possible to make Shu Cao's argument work, but we must use the prior probability.

Since the prior probability is 1/2 the basic setup is symmetric on interchanging "there is a body" with "there isn't a body". Therefore:

P(P & Q are telling the truth|they both say there's a body)
= P(P & Q are telling the truth|they both say there isn't a body)

This is therefore equal to P(P & Q are telling the truth|they agree). The result now follows from Shu's argument.
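This symmetry can also be checked numerically (a sketch; the names are mine): conditioning on agreement never depends on the prior, and it matches conditioning on "both say there's a body" exactly when the prior is 1/2:

```python
def p_truth_given_both_say_body(r, p, q):
    # Full computation: prior r, P truthful with prob p, Q with prob q.
    return r * p * q / (r * p * q + (1 - r) * (1 - p) * (1 - q))

def p_truth_given_agree(p, q):
    # Both truthful or both lying each force agreement, whatever the prior.
    return p * q / (p * q + (1 - p) * (1 - q))

p, q = 0.7, 0.8
# Equal at prior 1/2 ...
assert abs(p_truth_given_both_say_body(0.5, p, q)
           - p_truth_given_agree(p, q)) < 1e-12
# ... but not at other priors.
assert abs(p_truth_given_both_say_body(0.1, p, q)
           - p_truth_given_agree(p, q)) > 0.1
```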
Michael Doré
Posted on Monday, 01 September, 2003 - 11:03 pm:

Sorry, I hadn't seen the last two messages when I wrote in. However, hopefully my post addresses Chris's point about whether it's a coincidence that Shu's argument gets the right answer.
Gale Greenlee
Posted on Tuesday, 02 September, 2003 - 03:52 pm:

May I offer my 2 cents' worth? You guys are the experts and I'm a novice at best. But I recall a puzzle which went something like this. "The inhabitants of a certain island lie 2/3 of the time and tell the truth 1/3 of the time. One fellow makes a statement and another fellow says, 'yes, that's correct'." We are asked to compute the probability the statement is true. They are either both lying or both telling the truth. It's (2/3)*(2/3) = 4/9 for lying, and (1/3)*(1/3) = 1/9 for the truth. Comparing the numerators, it's 1/(1+4) = 1/5 for the answer.
But this 1/5 answer only holds when the a priori probability of the statement's truthfulness was 1/2 to begin with. In other words, if the statement was "that coin I flipped came up heads", then 1/5 is a good answer. But if the statement was "that pig can fly", and only 1 pig in 10,000,000 can fly, then the 1/5 does not hold. It's the fact that Shu's original prior was 1/2 that is making his formula work. Otherwise it wouldn't.
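The island puzzle makes the same point as the fridge problem, and the numbers are easy to check with a general prior (a sketch; the function name is mine):

```python
def agree_true(prior, t):
    """P(statement true | one islander asserts it and a second,
    independent islander confirms), each truthful with probability t."""
    num = prior * t * t
    den = num + (1 - prior) * (1 - t) * (1 - t)
    return num / den

assert abs(agree_true(0.5, 1/3) - 1/5) < 1e-12  # coin flip: the 1/5 answer
assert agree_true(1e-7, 1/3) < 1e-6             # flying pig: near zero instead
```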
Shu Cao
Posted on Tuesday, 02 September, 2003 - 04:12 pm:

Yes! I just realized that it only works if both P and Q being false would prove there being a body to be false, and both P and Q being true would prove there being a body to be true.
Michael Doré
Posted on Wednesday, 03 September, 2003 - 04:53 pm:

Shu: no I don't think you assumed that. If you'd assumed that then you would have got 1 as the answer.

Gale: yes I agree, that's a good analogy. As stated, there's not enough information to solve your island problem.

Just one other comment having read the earlier messages again. It's not actually true that we are conditioning on the prior probability, and it's certainly not true that all three bits of information are on the same footing. If you removed P or Q then that would be one fewer thing to condition on, but it wouldn't fundamentally change the problem. However if you removed the prior probability then there would not be enough information to solve the problem at all.

Michael
Shu Cao
Posted on Wednesday, 03 September, 2003 - 09:51 pm:

I am probably just being stupid, but I don't seem to get 1. Maybe I didn't express myself properly: I meant that my original argument would only hold if I assumed that B was true whenever both P and Q were true, and false whenever both P and Q were false. Hence it would only hold if B was either true or false, each with probability 1/2, where B = there being a body in the fridge. I may be completely wrong though...
Gale Greenlee
Posted on Thursday, 04 September, 2003 - 03:45 pm:

Michael and Shu: I think some "frequentists" (if that's the right term) would argue that we do have enough information to solve the island problem. They might argue that, given a tabulation of the results from a large number of experiments concerning the statement made by the first inhabitant, and then considering only the instances when the second inhabitant verified his statement, the ratio of true statements to false statements would be 1/5 to 4/5. This, regardless of the statements themselves, because in the long run the men only lie 2/3 of the time, regardless of the outlandish nature of the particular statement.

It's hard to argue with this position. And it's the uncertainty about this point which leads some people to say that Bayes' Theorem is a leaky vessel.

GALE
Gale Greenlee
Posted on Friday, 05 September, 2003 - 06:05 pm:

Here is a little thing I wrote about liars. I would enjoy any comments.

"Imagine two dice players each of whom lies 2/3 of the time. Now, when Bob lies about what he tossed with the dice, he lies in this fashion: if he threw 7 he says 8; if he threw anything else he says 7. On the other hand, when Larry lies about what other people threw, he lies in this fashion: he just announces any one of the other possibilities at random.

Bob tosses the two standard dice and reports to us on a secret ballot (secret to Larry, that is) that he threw 7. Both he and Larry can see the dice but we cannot. We, however, are recording the events on film, so we can check later. We then ask Larry to tell us out loud what Bob threw, and he reports 7.

What is the probability Bob threw a 5?

What a mess! How to start? How about the probability Bob threw a 7? Then we will go on from there. Okay, he either threw a 7 or he didn't.

Seven happens 1/6 of the time
Other things happen 5/6 of the time

If Bob threw a 7 he would admit it 1/3 of the time, so that's 1/6 x 1/3 = 1/18 = 10/180

If Bob threw a non-7 he would report 7 2/3 of the time, so that's 5/6 x 2/3 = 10/18 = 100/180

If Bob threw a 7 Larry would admit it 1/3 of the time, so that's also 1/6 x 1/3 = 1/18= 10/180

If Bob threw a non-7 Larry would lie about it 2/3 of the time and report 7 1/10 of the times when he lied. So, that's 5/6 x 2/3 x 1/10 = 10/180

For both telling the Truth we have 10/180 x 10/180 = 100/32400

For both lying we have 100/180 x 10/180 = 1000/32400

The truth then has it at 100/(100+1000) = 1/11, which is the probability Bob really did toss a 7.

So then the probability he did not toss a seven is 10/11

And if a player did not toss a 7, the probability he tossed a 5 would be 2/15. That's following the 'adjust the sample space' sort of procedure. You know, there are 36 ways the dice can land, 6 of those add to 7, so eliminate those and we have 30 left. And there are 4 ways to toss a 5, so there you have it, 4/30 = 2/15, which is the conditional probability Bob threw a 5, given that he didn't throw a 7. And so the probability of the combined event of not throwing a 7 and actually throwing a 5 would be 10/11 x 2/15 = 20/165 = 4/33

Right?"

GALE
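As a cross-check, the scenario can be enumerated exactly with fractions (a sketch, assuming Bob's and Larry's lies are independent of each other and of the dice; the helper names are mine). It gives P(Bob threw 7 | both say 7) = 1/3 rather than 1/11, which suggests the step multiplying 10/180 by 10/180 counts the 1/6 prior twice:

```python
from fractions import Fraction as F

# Exact enumeration of the dice puzzle under the stated rules: Bob, when
# lying (prob 2/3), says 8 if he threw 7 and says 7 otherwise; Larry, when
# lying (prob 2/3), picks uniformly among the 10 wrong totals.
counts = {s: 6 - abs(s - 7) for s in range(2, 13)}  # ways to roll each total

def p_bob_says_7(s):
    return F(1, 3) if s == 7 else F(2, 3)            # truth if 7, else lie -> "7"

def p_larry_says_7(s):
    return F(1, 3) if s == 7 else F(2, 3) * F(1, 10)

den = sum(F(c, 36) * p_bob_says_7(s) * p_larry_says_7(s)
          for s, c in counts.items())
p7 = F(6, 36) * p_bob_says_7(7) * p_larry_says_7(7) / den
p5 = F(4, 36) * p_bob_says_7(5) * p_larry_says_7(5) / den
print(p7, p5)  # -> 1/3 4/45
```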
Michael Doré
Posted on Friday, 19 September, 2003 - 01:14 am:

Hi, sorry about the delay in writing back. I couldn't access NRICH for a while.

I don't know much about frequentism. However I don't see why when you consider only the instances when the second inhabitant verified the first's statement the ratio of true to false statements should be 1/5 to 4/5. For example frequentists wouldn't argue that if you toss a coin repeatedly and only consider the instances where the result is heads then the proportion of heads in these tosses is 1/2. :-)

On your last message; you say:

"If Bob threw a non-7 he would report 7 2/3 of the time, so that's 5/6 x 2/3 = 10/18 = 100/180"

But what if he threw a non-7 and lies about his score but doesn't claim his score is 7?

Michael