
I'm having huge difficulties setting up a probability transition matrix for a Markov chain. I'm not sure where to start or what to do, especially once the problem goes beyond the basics.

For example, how would I construct a matrix for:
Suppose that if it has rained for the past 3 days, then it will rain today with probability 0.8; if it did not rain for the past 3 days, then it will rain today with probability 0.2; and in any other case the weather today will, with probability 0.6, be the same as the weather yesterday.

I just can't seem to wrap my head around how to do this...

2007-12-04 18:34:16 · 1 answer · asked by yogastar02 2 in Science & Mathematics Mathematics

1 answer

No wonder you're having trouble. That's not a Markov chain, at least not on the obvious two states (rain and no rain).

Is the probability that it's rainy today dependent solely on whether it was rainy yesterday? No: it also depends on the two days before that.

Is the probability that it's (say) Sunny-Rainy-Sunny on Monday-Tuesday-Wednesday dependent solely on the weather the prior Friday-Saturday-Sunday? Actually, yes, and that is the way out: enlarge the state space.

Let the state on day t be the weather over days t-2, t-1, and t. With two weather values that gives 2^3 = 8 states. From state (a, b, c) the chain can only move to a state of the form (b, c, d), and the distribution of d depends only on the current state: rain with probability 0.8 if a = b = c = rain, 0.2 if a = b = c = dry, and otherwise tomorrow matches today (c) with probability 0.6. With that definition of "state" the process is a genuine Markov chain, and the 8 x 8 transition matrix can be written down row by row.
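To make the enlarged-state idea concrete, here is a minimal sketch in Python (mine, not from the thread; the 'R'/'D' symbols and the helper name `p_rain` are my own choices) that builds the 8-state transition matrix for this problem:

```python
from itertools import product

# State = weather on the last three days, e.g. ('R', 'R', 'D');
# 'R' = rain, 'D' = dry. 2**3 = 8 states in total.
states = list(product('RD', repeat=3))

def p_rain(state):
    """Probability it rains tomorrow, given the last three days' weather."""
    a, b, c = state
    if a == b == c == 'R':
        return 0.8          # rained all of the past 3 days
    if a == b == c == 'D':
        return 0.2          # dry all of the past 3 days
    # mixed case: tomorrow matches today (c) with probability 0.6
    return 0.6 if c == 'R' else 0.4

# Transition matrix P: from (a, b, c) you can only reach (b, c, d).
P = [[0.0] * len(states) for _ in states]
for i, (a, b, c) in enumerate(states):
    pr = p_rain((a, b, c))
    for d, prob in (('R', pr), ('D', 1 - pr)):
        P[i][states.index((b, c, d))] = prob

# Sanity check: every row of a transition matrix sums to 1.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12
```

Note that each row has exactly two nonzero entries, because from (a, b, c) only (b, c, R) and (b, c, D) are reachable; the other six entries of that row are 0.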

2007-12-05 05:22:08 · answer #1 · answered by Curt Monash 7 · 0 0
