Markov Example
Assume you have a community of 100 people. Initially, 83 of
the people are healthy and 17 are sick. You predict that each
year 20% of the healthy people will get sick. Furthermore, 25% of
the sick people will die and the remainder will get better. You
want to know how many people will still be alive after 10,
20, and 30 years.
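Before building the model in the tool, it can help to see the arithmetic the model will perform. The following Python sketch is only an illustration of that arithmetic, not part of the tool: it applies the yearly transition percentages from the problem statement and reports how many people remain alive at the checkpoints.

```python
# Yearly transition rules from the problem statement:
#   Healthy: 80% remain Healthy, 20% become Sick
#   Sick:    75% return to Healthy, 25% Die
#   Dead:    absorbing state (no one leaves)

def step(healthy, sick, dead):
    """Advance the population counts by one year."""
    return (0.8 * healthy + 0.75 * sick,  # stay healthy + recoveries
            0.2 * healthy,                # newly sick
            dead + 0.25 * sick)           # deaths accumulate

state = (83.0, 17.0, 0.0)  # initial population: Healthy, Sick, Dead
for year in range(1, 31):
    state = step(*state)
    if year in (10, 20, 30):
        alive = state[0] + state[1]
        print(f"Year {year}: {alive:.1f} people still alive")
```

The total across the three states stays at 100 every year; only its distribution shifts, and the alive count shrinks toward zero.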
The first step in building a model to solve this problem is to
point to New in the File menu and click Markov
Model. This brings up the following model template:
At any particular time, a person can be in one of three
states: Healthy, Sick, or Dead. Since the
template has only two states defined, double-click on the circle
following Root to add a new state node. Like decision tree
models, Markov models can be built largely by double-clicking to
add branches and editing directly in the tree. This frees you
from having to create node definitions directly.
Next, rename the state nodes to Healthy, Sick,
and Dead.
You also know that you start out with 83 people in the Healthy
state, 17 people in the Sick state, and no people in the Dead
state. You reflect this in the model by editing the numbers
following Root.
If a person is healthy, you know that there is an 80% chance
that they will remain healthy and a 20% chance that they will
become sick. You can show this in the tree by editing the
percents following Healthy:
If a person is sick, there is a 25% chance that they will die
and a 75% chance that they will return to the Healthy
state. To reflect this in the model, edit Sick as follows:
Sick:=mkv(75%,Healthy,25%,Dead)
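Together with the Healthy definition, this completes the transition rules of the model. As an illustration outside the tool (the dictionary layout here is mine, not the tool's syntax), the two definitions amount to a transition table in which every row is a complete probability distribution:

```python
# Transition probabilities implied by the two state definitions.
# Dead is an absorbing state: once entered, it is never left.
transitions = {
    "Healthy": {"Healthy": 0.80, "Sick": 0.20},
    "Sick":    {"Healthy": 0.75, "Dead": 0.25},
    "Dead":    {"Dead": 1.00},
}

# Sanity check: each row's probabilities must sum to 1.
for state, row in transitions.items():
    assert abs(sum(row.values()) - 1.0) < 1e-9, state
```

If a row summed to anything other than 1, people would be created or lost at each step, so this check is a useful habit when specifying any Markov model.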
The model is now complete. To start the model, click on the Reset
button. This will cause the tree to show the following state
information:
This is the initial stage: there are 83 healthy people and 17
sick people. Click Single Step to progress to the next
stage.
Click again to progress to the third stage.
Each time you click Single Step, the model progresses
one more stage. Eventually, all of the people will be in the Dead
state.
Note that even though you can never have a fraction of a
person in one state, the Markov model does not round values to
the nearest integer.
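You can verify this by computing the first stage by hand. A small Python sketch of the arithmetic (again, outside the tool) shows fractional counts appearing after a single step:

```python
healthy, sick, dead = 83, 17, 0

# One year of transitions, applied simultaneously to the old counts:
healthy, sick, dead = (0.8 * healthy + 0.75 * sick,  # 66.4 + 12.75
                       0.2 * healthy,                # 0.2 * 83
                       dead + 0.25 * sick)           # 0.25 * 17

print(f"Healthy {healthy:.2f}, Sick {sick:.2f}, Dead {dead:.2f}")
# prints: Healthy 79.15, Sick 16.60, Dead 4.25
```

The counts are already fractional after one stage, yet they still sum to exactly 100, which is what lets the model track expected population sizes rather than whole individuals.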
