
Markov Example

Assume you have a community of 100 people. Initially, 83 of the people are healthy and 17 are sick. You predict that each year 20% of the healthy people will get sick. Furthermore, 25% of the sick people will die and the remainder will get better. You want to know how many people will still be alive after 10, 20, and 30 years.
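Before building the model in the tool, the computation it will perform can be sketched directly as a transition-matrix calculation (a Python sketch of the numbers above; the tool itself needs no code):

```python
# A sketch of the computation the Markov model performs, assuming the
# transition probabilities described above.
import numpy as np

# Rows are the "from" state, columns the "to" state: Healthy, Sick, Dead.
P = np.array([
    [0.80, 0.20, 0.00],  # Healthy: 80% stay healthy, 20% get sick
    [0.75, 0.00, 0.25],  # Sick: 75% get better, 25% die
    [0.00, 0.00, 1.00],  # Dead is an absorbing state
])

pop = np.array([83.0, 17.0, 0.0])  # initial population by state

for years in (10, 20, 30):
    counts = pop @ np.linalg.matrix_power(P, years)
    print(years, "years: alive =", round(counts[0] + counts[1], 2))
```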

The first step in building a model to solve this problem is to point to New in the File menu and click Markov Model. This brings up the following model template:

[Image dph00316.gif: the Markov model template]

At any particular time, a person can be in one of three states: Healthy, Sick, or Dead. Since the template has only two states defined, double-click on the circle following Root to add a new state node. Like decision tree models, Markov models can be built largely by double-clicking to add branches and editing directly in the tree, which frees you from having to write node definitions by hand.

[Image dph00317.gif: the template with a third state node added]

Next, rename the state nodes to Healthy, Sick, and Dead.

[Image dph00318.gif: the state nodes renamed to Healthy, Sick, and Dead]

You also know that you start out with 83 people in the Healthy state, 17 people in the Sick state, and no people in the Dead state. You reflect this in the model by editing the numbers following Root.

[Image dph00319.gif: the initial counts entered after Root]

If a person is healthy, you know that there is an 80% chance that they will remain healthy and a 20% chance that they will become sick. You can show this in the tree by editing the percentages following Healthy:

[Image dph00320.gif: the transition percentages for Healthy]

If a person is sick, there is a 25% chance that they will die and a 75% chance that they will return to the Healthy state. To reflect this in the model, edit Sick as follows:

Sick:=mkv(75%,Healthy,25%,Dead)
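The definition reads: from Sick, move to Healthy with probability 75% and to Dead with probability 25%. The same bookkeeping can be sketched in Python (a hypothetical restatement; `mkv` is the tool's own syntax, not Python):

```python
# Hypothetical table form of the mkv() definitions above.
transitions = {
    "Healthy": {"Healthy": 0.80, "Sick": 0.20},
    "Sick":    {"Healthy": 0.75, "Dead": 0.25},
    "Dead":    {"Dead": 1.00},  # absorbing: the dead stay dead
}

def step(counts):
    """Advance the population one stage using the transition table."""
    new = {state: 0.0 for state in transitions}
    for state, n in counts.items():
        for target, p in transitions[state].items():
            new[target] += n * p
    return new

# One stage forward from the initial counts.
print(step({"Healthy": 83.0, "Sick": 17.0, "Dead": 0.0}))
```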

[Image dph00321.gif: the transition percentages for Sick]

The model is now complete. To start the model, click on the Reset button. This will cause the tree to show the following state information:

[Image dph00322.gif: the initial stage after clicking Reset]

This is the initial stage--there are 83 healthy people and 17 sick people. Click Single Step to progress to the next stage.

[Image dph00323.gif: the state counts after one step]
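The counts shown at this stage can be checked by hand from the transition percentages (plain Python arithmetic, not tool syntax):

```python
healthy0, sick0, dead0 = 83.0, 17.0, 0.0  # initial counts

healthy1 = 0.80 * healthy0 + 0.75 * sick0  # stayers plus recoveries
sick1    = 0.20 * healthy0                 # newly sick
dead1    = dead0 + 0.25 * sick0            # deaths accumulate
print(healthy1, sick1, dead1)              # approximately 79.15, 16.6, 4.25
```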

Click again to progress to the third stage.

[Image dph00324.gif: the state counts after two steps]

Each time you click Single Step, the model progresses one more stage. Eventually, all of the people will be in the Dead state.
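A short loop confirms the long-run behavior: because Dead is an absorbing state, repeated stages drive the whole population into it (a Python sketch of the same transition rules):

```python
healthy, sick, dead = 83.0, 17.0, 0.0
stage = 0
while dead < 99.9:  # run until virtually everyone has died
    healthy, sick, dead = (0.80 * healthy + 0.75 * sick,  # stayers + recoveries
                           0.20 * healthy,                # newly sick
                           dead + 0.25 * sick)            # deaths accumulate
    stage += 1

print("after", stage, "stages:", round(dead, 2), "dead")
```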

Note that even though you can never have a fraction of a person in one state, the Markov model does not round values to the nearest integer; the counts it reports are expected (average) numbers of people in each state.