I'm not sure which of these are Markov chains and which are not.
Suppose that X_1, X_2, ... are independent, identically distributed random variables such that [...]. Set [...]. In each of the following cases determine whether (Y_n)_{n>=0} is a Markov chain.
In the cases where (Y_n) is a Markov chain, I have to give the state space and the transition matrix; if it's not, I have to say why.
I'm really stuck, could you give me some help please?
Markov Chains
- Thread Starter
- 23-10-2006 17:03
- 23-10-2006 20:09
I hope yours wasn't due in for 6 like mine.
(c) isn't, since P(Y_4 = 7 | Y_3 = 3, Y_2 = 1) = p: the conditional probability depends on Y_2 as well as Y_3, so the Markov property fails.
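A sketch of why, assuming part (c) is the cumulative sum Y_n = S_1 + ... + S_n with S_n = X_1 + ... + X_n (the definitions in the first post haven't copied across, so take that setup as an assumption):

```latex
% Hypothetical setup: S_n = X_1 + \cdots + X_n,\quad Y_n = S_1 + \cdots + S_n.
% The next increment of (Y_n) is
Y_{n+1} - Y_n = S_{n+1} = S_n + X_{n+1} = (Y_n - Y_{n-1}) + X_{n+1},
% so
\mathbb{P}(Y_{n+1} = k \mid Y_n = y_n,\, Y_{n-1} = y_{n-1}, \dots)
  = \mathbb{P}(X_{n+1} = k - 2y_n + y_{n-1}),
% which depends on y_{n-1} as well as y_n. Hence (Y_n) is not Markov,
% although the pair (S_n, Y_n) is a Markov chain.
```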
The others are; just try drawing the state spaces and reading the transition matrices off from them. It's not too bad.
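If it helps to see a transition matrix concretely, here's a quick sketch for the partial-sum case Y_n = S_n, assuming the steps are i.i.d. with P(X_n = 1) = p and P(X_n = 0) = 1 - p (the actual distribution didn't copy into the first post, so that's an assumption). The state space is {0, 1, 2, ...}, and the matrix below is just its upper-left corner:

```python
# Sketch of the transition matrix for Y_n = S_n = X_1 + ... + X_n,
# ASSUMING i.i.d. steps with P(X = 1) = p, P(X = 0) = 1 - p (hypothetical,
# since the distribution in the original question is missing).
# The full state space {0, 1, 2, ...} is infinite; we truncate to N states.

def transition_matrix(p, N):
    """Return the N x N upper-left corner of the transition matrix:
    from state i the chain stays at i w.p. 1 - p or moves to i + 1 w.p. p."""
    P = [[0.0] * N for _ in range(N)]
    for i in range(N):
        P[i][i] = 1 - p
        if i + 1 < N:
            P[i][i + 1] = p
    return P

for row in transition_matrix(0.3, 4):
    print(row)
```

Note that only the last row fails to sum to 1, and only because of the truncation; every row of the full infinite matrix sums to 1.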