# Markov Model Exercise

A Markov model is a way to represent a changing set of health states
over time, where there is a known probability or rate of transition from
one health state to another. An excellent discussion of these
models can be found in Sonnenberg and Beck's 1993 article
"Markov Models in Medical Decision Making: A Practical Guide"
(*Medical Decision Making*, 13:322-338).
For this exercise,
consider a hypothetical Markov model for a group of patients with
controlled diabetes.

## Building the model

To build a Markov model, we must specify a set of states, the probabilities
that a patient will move from one state to another, and the utilities
of living in each state.

### States

For simplicity, we will assume three states:
- Controlled diabetes
- End-stage renal disease (ESRD)
- Death

### Transition probabilities

Let's imagine that, each year, patients with controlled diabetes have an
85% chance of staying in that state, a
10% chance of moving to the ESRD state, and a 5% chance of dying.
This is obviously a simplification, and it's possible to build
Markov models that allow (for example) a chance of death that increases
over time (background mortality).
Let's further imagine that, each year, patients with ESRD have a
70% chance of continuing in the ESRD state and a 30% chance of dying.

Patients who have died have a 100% chance of staying dead.
Death is thus an "absorbing state" -- a state which, once entered,
cannot be left.
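These transition probabilities can be written as a matrix with one row per current state. Here is a minimal sketch in Python (the state ordering and variable names are my own, not part of the exercise):

```python
# Yearly transition probabilities, one row per current state.
# State order: 0 = controlled diabetes, 1 = ESRD, 2 = death.
P = [
    [0.85, 0.10, 0.05],  # controlled diabetes -> diabetes, ESRD, death
    [0.00, 0.70, 0.30],  # ESRD -> diabetes, ESRD, death
    [0.00, 0.00, 1.00],  # death is absorbing: it transitions only to itself
]

# Every row must sum to 1: each patient ends the year in exactly one state.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-9
```

Note that the ESRD row has a zero in the first column (the model has no path back to controlled diabetes) and that the death row is all zeros except for itself, which is exactly what "absorbing state" means.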

### Utilities

Imagine that the utility for living with controlled diabetes is
0.95, the utility for ESRD is 0.30, and the utility of death
is, by definition, 0.

## Thinking about the model

Here are some questions to try to answer about the
model. Make your best estimate.
- Imagine that we begin with 10,000 patients with controlled
diabetes. In one year, how many patients will still have
controlled diabetes? How many will have ESRD? How many will have died?
- How much utility per patient will have been experienced during that year?
Remember that each diabetes patient experiences a utility of 0.95
for the year, each ESRD patient experiences a utility of 0.30 for
the year, and dead people experience no utility.
- In 10 years, how many patients do you estimate will be in
each state?
- Future health is often considered to be less valuable
than immediate health. This is handled in some models by discounting
future utility at some constant rate. For example, if we use a 3%
discount rate, a year spent in any particular state is worth only 97%
as much next year as it is worth this year, and about 94%
(= 97% * 97%) as much in the second year.
Imagine that, after 20 years without discounting, our patients have
experienced an average of 6.3 utility units per patient during those
20 years. How much would you guess they would have experienced if
we were discounting the utilities by 3% per year?
- How many years will it take before all of the patients are dead?

## Running the model

Now go back and try to answer those questions using an actual
simulation of this Markov model with 10,000 patients, which you'll find at
this link.
You can use the "Next Cycle" button to step year-by-year through
the simulation and see how the patients move from state to state and
watch the total experienced utility increase. You can use the
"Run to End" button to run the complete simulation until the
cycle at which all of the patients would be dead. If you check
the "discount utilities" checkbox, the simulation will use
a 3% utility discount rate.
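If you'd like to check the simulation's arithmetic yourself, the year-by-year cohort update can be sketched in a few lines of Python. This is only a sketch under the probabilities and utilities stated above: it credits each year's utility to the end-of-year state mix and omits the half-cycle correction (see the note below), so its totals will differ slightly from the linked simulation's.

```python
# Three-state cohort simulation: controlled diabetes, ESRD, death.
P = {
    "diabetes": {"diabetes": 0.85, "esrd": 0.10, "dead": 0.05},
    "esrd":     {"esrd": 0.70, "dead": 0.30},
    "dead":     {"dead": 1.00},  # absorbing state
}
UTILITY = {"diabetes": 0.95, "esrd": 0.30, "dead": 0.0}

def run_cohort(n_patients=10_000, years=10, discount_rate=0.0):
    """Advance the whole cohort one year at a time; return the final
    state counts and the average (optionally discounted) utility per patient."""
    counts = {"diabetes": float(n_patients), "esrd": 0.0, "dead": 0.0}
    total_utility = 0.0
    for year in range(years):
        nxt = {"diabetes": 0.0, "esrd": 0.0, "dead": 0.0}
        for state, n in counts.items():
            for target, p in P[state].items():
                nxt[target] += n * p
        counts = nxt
        # Discount this year's utility: the first year is undiscounted,
        # the next is worth 97% as much, then 97% * 97%, and so on.
        factor = (1.0 - discount_rate) ** year
        total_utility += factor * sum(counts[s] * UTILITY[s] for s in counts)
    return counts, total_utility / n_patients

counts, per_patient = run_cohort(years=1)
# After one year: 8,500 diabetes, 1,000 ESRD, 500 dead,
# and 0.8375 utility units per patient.
```

Pass `discount_rate=0.03` to mimic the "discount utilities" checkbox, and a larger `years` to approach the all-dead endpoint.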

Compare your estimates above with what you can learn from running the model.
How close were you? What did you find surprising?

*Note:* If you computed a per-patient utility for the second question
above, your answer may not match the model's, because the model performs
the "half-cycle correction". See the Sonnenberg and Beck article for
more detail.
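One common way to implement the half-cycle correction is to credit each cycle with the average of its start-of-cycle and end-of-cycle membership (trapezoidal integration of the cohort curve). A sketch assuming that convention, with first-year numbers from the model above; the function name is my own:

```python
UTILITY = {"diabetes": 0.95, "esrd": 0.30, "dead": 0.0}

def cycle_utility_half_corrected(before, after, utility=UTILITY):
    """Utility for one cycle, averaging membership at its start and end."""
    return sum(0.5 * (before[s] + after[s]) * utility[s] for s in utility)

# First year of the exercise: the cohort starts all-diabetic and ends
# with 8,500 diabetic, 1,000 ESRD, and 500 dead patients.
before = {"diabetes": 10_000.0, "esrd": 0.0, "dead": 0.0}
after = {"diabetes": 8_500.0, "esrd": 1_000.0, "dead": 500.0}
u = cycle_utility_half_corrected(before, after)
# Higher than the 8,375 units you get by crediting end-of-year states only,
# because it counts the half-year the transitioning patients spent, on
# average, in their starting state.
```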