I seem to be having huge difficulties with setting up a probability transition matrix for a Markov chain. I'm not sure where to start or what to do, especially for anything beyond a basic example.
For example, how would I construct a matrix for:
Suppose that if it has rained for the past 3 days, then it will rain today with probability 0.8; if it did not rain for the past 3 days, then it will rain today with probability 0.2; and in any other case the weather today will, with probability 0.6, be the same as the weather yesterday.
I just can't seem to wrap my head around how to do this...
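A sketch of one standard way to set this up (the state encoding and variable names below are my own, not from the original post): since tomorrow's weather depends on the last three days, take the state to be the weather over those three days. That gives 2^3 = 8 states, and from state (a, b, c) you can only move to (b, c, d), where d is tomorrow's weather.

```python
from itertools import product

import numpy as np

# State = weather on (two days ago, yesterday, today), True = rain.
# Tomorrow's weather depends on these three days, so tracking them
# as one combined state makes the process Markov with 2**3 = 8 states.
states = list(product([True, False], repeat=3))

def rain_prob(state):
    """Probability it rains tomorrow, given the last three days."""
    if all(state):          # rained on all of the past 3 days
        return 0.8
    if not any(state):      # no rain on any of the past 3 days
        return 0.2
    # Otherwise tomorrow matches today's weather with probability 0.6.
    return 0.6 if state[2] else 0.4

P = np.zeros((8, 8))
for i, (a, b, c) in enumerate(states):
    p = rain_prob((a, b, c))
    # From (a, b, c) we move to (b, c, d), where d is tomorrow's weather.
    P[i, states.index((b, c, True))] = p        # it rains tomorrow
    P[i, states.index((b, c, False))] = 1 - p   # it stays dry

print(np.round(P, 2))  # each row sums to 1
```

Each row of `P` has exactly two nonzero entries, because from any three-day history only two follow-up histories are possible (rain or no rain tomorrow). The same matrix can of course be written out by hand once the 8 states are listed in a fixed order.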