The Student Room Group

Easy to understand, difficult to solve... Any help with this probability question?

I came up with this problem by complexifying a much easier one, and I've become obsessed by it as I can't seem to solve it!

Drop a regular tetrahedron with faces numbered 0, 1, 2, 3. Get 0, end of game; get 1, drop it once again; get 2, drop it twice; get 3, drop it thrice.
When playing, if ever you must drop it X > 1 times, the results add up to give the number of times you will have to drop it again.
Ex: drop twice, results 1 and 3: drop four times.

What is the probability the game will end?


I have succeeded in getting somewhere, but I'm not sure how to sort out a problem I've encountered... (see Spoiler)

By the way, the problem becomes even more interesting by replacing a tetrahedron with an N-sided object, and searching for a result which would depend (or not!) on N.

Spoiler
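For anyone who wants to experiment before peeking at the spoiler, the game is easy to simulate. A rough Monte Carlo sketch in Python (the drop cap, seed and names are my own choices; games that run past the cap are counted as never ending, a negligible bias since a game that gets that far almost surely keeps growing):

```python
import random

def game_ends(max_total_drops=1_000, rng=random):
    """Play one game; return True if some round comes up all zeros.

    Games whose cumulative drop count exceeds max_total_drops are
    treated as never ending."""
    pending = 1                       # drops owed in the current round
    total = 0
    while pending > 0:
        total += pending
        if total > max_total_drops:
            return False
        # the sum of this round's results is the next round's drop count
        pending = sum(rng.randrange(4) for _ in range(pending))
    return True

random.seed(1)
trials = 10_000
estimate = sum(game_ends() for _ in range(trials)) / trials
print(estimate)                       # typically lands near 0.41
```

Running this a few times suggests the answer is strictly between 0 and 1, which at least casts doubt on the "intuitively it's 1" guess.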



Original post by z0tx
...



Interesting question.

I can't see the justification for your equation though.

I think the solution to the original problem lies in the realm of stochastic processes and random walks; not something I've studied :frown:
Reply 2
Well, intuitively the probability the game will end is just 1, because if it doesn't end on the nth go you just throw again till it does end?
Reply 3
Original post by ghostwalker

I can't see the justification for your equation though.


Now that you mention it, I'm not sure how I got there. Well then, I'm possibly back at the starting line!

Original post by spex
Well, intuitively the probability the game will end is just 1, because if it doesn't end on the nth go you just throw again till it does end?


Yes indeed, but intuition or even 'logical' evidence isn't enough in maths, as you know. I did question the possibility of it being 1, though. Here is why: as the number of drops you will have to do on the next turn is the sum of the results you just got, over time you have more and more drops on your hands, and therefore a smaller probability of ending the game. For instance:

Round A: drop once: 2
Round B: drop twice: 3, 1
Round C: drop four times: 1, 3, 2, 1
Round D: drop seven times: 1, 2, 0, 1, 3, 3, 2
...

As you can see, here on round D you would have to get 0 on every one of the seven drops for the game to end (a slim 1/(4^7)...). So the final probability we are looking for is an infinite sum of terms which keep getting smaller. 1 is of course a possibility, but not necessarily a certainty.

This becomes even more the case when you're dealing with an N-sided object. Let's say it has 1000 sides. You're going to start off with a 1/1000 chance of ending the game on the first try, and 999/1000 that you'll get another number, which could possibly make the game last for a LONG time. Let's say you drop 53. On the next round you would have to get 0 on all 53 drops to end it. Seriously, how is that ever going to happen? :smile: Maybe it is 1 anyway, which would just add to the quirkiness of maths!
My thinking, as far as it goes is:

Rather than considering rounds, we can consider each drop individually, then:

If at any given stage the number of drops required is n, denote that state by D_n. Then:

Starting at D_1,

at each drop:

D_n goes to D_{n-1} with probability 1/4
D_n goes to D_n with probability 1/4
D_n goes to D_{n+1} with probability 1/4
D_n goes to D_{n+2} with probability 1/4

And the game ends if we get to D_0.
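For what it's worth, this formulation can be checked numerically. Writing h_i for the probability of ever reaching D_0 from D_i, the transitions above give h_i = (h_{i-1} + h_i + h_{i+1} + h_{i+2})/4 for i ≥ 1 with h_0 = 1, and iterating these equations from h = 0 converges upward to the minimal non-negative solution. A sketch (the truncation level and sweep count are my own choices):

```python
# h[i] = probability of ever reaching D_0 from D_i.  Iterate the one-step
# equations starting from 0; this converges upward to the minimal
# non-negative solution.  States above I_MAX are truncated to 0, a safe
# under-approximation since the walk drifts upward.
I_MAX = 200
h = [1.0] + [0.0] * (I_MAX + 2)
for _ in range(2_000):
    for i in range(1, I_MAX + 1):
        h[i] = (h[i - 1] + h[i] + h[i + 1] + h[i + 2]) / 4
print(h[1])  # the chance the game ends, starting from one required drop
```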
Original post by spex
Well intuitively the probability the game will end is just 1. Because if it doens't end on the nth go you just throw again till it does end?


That's logical but not mathematical; there is a chance the game could carry on indefinitely.
Reply 6
Original post by ghostwalker
...


That is an excellent approach, I never thought of it that way. I'll think about this a little and post something if I get anywhere!
Reply 7
Original post by Mr Einstein
That's logical but not mathematical; there is a chance the game could carry on indefinitely.


There are certainly ways in which the game could go on forever (e.g. throwing a 1 every time), but it's still possible for the game to end with probability 1.
Reply 8
Original post by Mr Einstein
That's logical but not mathematical; there is a chance the game could carry on indefinitely.


Consider the real numbers between 0 and 1. What's the probability of randomly picking a number and getting 0.5? The probability is actually zero. Not particularly intuitive, but if you think about it for a while you'll see it has to be true. The same logic applies to the above problem.
Reply 9
I'll post my solution tomorrow, I think I've solved it.
Original post by Tobedotty
I'll post my solution tomorrow, I think I've solved it.


Looking forward to it.

(shameless bump)
Reply 11
Right, here is my approach. First, split the rolls up into sections (I'll just call them sections from now on), where the first section is the first roll, the second section is the number of rolls determined by the first roll, the third section is the number of rolls determined by the sum of the rolls in the second section, and so on and so forth (just like the OP said). Now let S_n denote the nth section and P(S_n) the probability of failing by section n. Finally, let R_n denote the nth roll and P(R_n) the probability of failing after n rolls.

Let's begin by evaluating P(R_n), because this will probably (ha) come up over and over again. Obviously the probability of failing after the first roll is 1/4. To fail after the second roll, we either fail on the first roll, or succeed on the first and fail on the second. This means the probability of failing after the second roll is: 1/4 + 3/4 × 1/4.

Taking this argument further, P(R_3) is equal to the probability of failing on the first roll, plus the probability of succeeding on the first and failing on the second, plus the probability of succeeding on the first, succeeding on the second and failing on the third, which is: P(R_3) = 1/4 + 3/4 × 1/4 + 3/4 × 3/4 × 1/4.

This looks exactly like a geometric series. This means that P(R_n) = 1/4 + 3/4 × 1/4 + (3/4)^2 × 1/4 + ... + (3/4)^(n-1) × 1/4. Applying the geometric series sum formula gives P(R_n) = (1/4)(1 - (3/4)^n) / (1 - 3/4). The 1/4 on the top and the 1 - 3/4 on the bottom cancel, leaving P(R_n) = 1 - (3/4)^n.

- - - - -

What we wish to evaluate is P(S_∞). Each section must contain at least one roll, else we have failed. This means that at any point (no. of rolls so far) ≥ (no. of sections so far), which means that if we have had an infinite number of sections, we have also had an infinite number of rolls. The probability of failing after a finite number of rolls is P(R_n) = 1 - (3/4)^n, so taking the limit as n approaches infinity, P(R_n) approaches 1.

I hope this is correct, I don't have a good track record :tongue:
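The geometric-series step itself can be checked numerically. Note this only verifies the algebra for "at least one 0 in n rolls"; whether that notion of failing matches the game's ending rule is the point queried in the replies below (names in the sketch are mine):

```python
# P(at least one of n rolls from {0,1,2,3} is a 0): sum the
# "succeed k times, then fail on roll k+1" terms and compare with
# the closed form 1 - (3/4)^n.
for n in range(1, 12):
    direct = sum((3 / 4) ** k * (1 / 4) for k in range(n))
    closed = 1 - (3 / 4) ** n
    assert abs(direct - closed) < 1e-12
print("identity holds for n = 1..11")
```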
Original post by Tobedotty
...


:holmes:

Not sure what you mean by P(R_1), for instance. From what you're saying, I take this to mean the probability of failing after the first roll, in any section. What do you mean by failing? Do you mean the game ends?
I believe P(R_n) < 1. (Edit: as n -> infinity).
OK, quick revision of this, and I think sqrt(2) - 1 is the correct answer. Although I don't agree with the justification for the cubic in the 1st post, I do agree that p must satisfy that cubic. See http://www.am.qub.ac.uk/users/g.gribakin/sor/Chap3.pdf (section 3.7). As described, the scenarios are not obviously equivalent, but a little thought convinces me they are actually the same, at least as far as extinction goes.
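Following the linked notes, the chance the game ends is the smallest non-negative fixed point of the offspring generating function f(s) = (1 + s + s^2 + s^3)/4; the fixed-point equation rearranges to the cubic s^3 + s^2 - 3s + 1 = (s - 1)(s^2 + 2s - 1) = 0, with roots 1 and -1 ± sqrt(2). A quick numerical sketch (function name is mine; iterating f from 0 is the standard way to reach the smallest fixed point, and it also handles the N-sided version z0tx asked about):

```python
# Extinction probability for an N-sided object with faces 0..N-1:
# the smallest non-negative fixed point of f(s) = (1 + s + ... + s^(N-1)) / N.
# Iterating f from 0 converges to exactly that fixed point.
def extinction(N, iterations=200):
    q = 0.0
    for _ in range(iterations):
        q = sum(q ** k for k in range(N)) / N
    return q

print(extinction(4))     # agrees with sqrt(2) - 1 = 0.41421356...
print(extinction(1000))  # even with 1000 sides the game ends ~0.1% of the time
```

For N = 2 and N = 3 the same iteration creeps up to 1 (mean number of new drops per drop is (N-1)/2 ≤ 1), so it is only from N = 4 onwards that the game can go on forever with positive probability.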
Reply 15
I agree with DFranklin. (Using Markov chains, we find that the hitting probability is the unique minimal non-negative solution to

h_0 = 1
h_i = (h_{i-1} + h_i + h_{i+1} + h_{i+2})/4 for i ≥ 1,

whose general solution is h_i = A + B(sqrt(2)-1)^i + C(-1-sqrt(2))^i.)
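Spelling this out a little (my own gloss, using the recurrence implied by ghostwalker's per-drop transitions, h_i = (h_{i-1} + h_i + h_{i+1} + h_{i+2})/4): since h_i must stay in [0, 1] and |-1-sqrt(2)| > 1, we need C = 0; minimality forces A = 0, and h_0 = 1 then gives B = 1, so h_i = (sqrt(2)-1)^i. A quick check that this really satisfies the recurrence:

```python
# Verify h_i = r^i with r = sqrt(2) - 1 satisfies
# h_i = (h_{i-1} + h_i + h_{i+1} + h_{i+2}) / 4, which reduces to
# 1 + r + r^2 + r^3 = 4r, i.e. r^3 + r^2 - 3r + 1 = 0.
r = 2 ** 0.5 - 1
for i in range(1, 20):
    lhs = r ** i
    rhs = (r ** (i - 1) + r ** i + r ** (i + 1) + r ** (i + 2)) / 4
    assert abs(lhs - rhs) < 1e-12
print("h_i = (sqrt(2)-1)^i satisfies the recurrence; h_1 =", r)
```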
Original post by DFranklin
....


Cheers for the useful reference.
Original post by ghostwalker
Cheers for the useful reference.
I was able to drag up "extinction probability" from memory - the wiki page made me sure I was on the right track, but it was then annoyingly difficult to pin down the actual solution. (The wiki page was a bit too general.)

As I recall it's a fairly standard topic, so either my search terms or my google fu are weak tonight.
Reply 18
These answers are all very interesting, thanks!

Original post by SimonM
Using Markov chains, we find that the hitting probability is the unique minimal non-negative solution to
h_0 = 1
h_i = (h_{i-1} + h_i + h_{i+1} + h_{i+2})/4 for i ≥ 1,
whose general solution is h_i = A + B(sqrt(2)-1)^i + C(-1-sqrt(2))^i.


Would you by any chance care to detail this? :smile: Haven't got to Markov chains yet...
Reply 19
Original post by z0tx
Would you by any chance care to detail this? :smile: Haven't got to Markov chains yet...


Not really, there are some good resources out there:

http://www.statslab.cam.ac.uk/~james/Markov/ (1.3)
