Limiting probabilities

The probability that a continuous-time Markov chain will be in state j at time t, written P_{ij}(t) when the chain starts in state i, often converges to a limiting value which is independent of the initial state. We call this value P_j, where P_j is equal to

$$P_j \equiv \lim_{t \to \infty} P_{ij}(t).$$
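
As a quick numerical illustration of this convergence (a sketch, not part of the original text), the code below exponentiates a small, made-up generator matrix Q and prints the transition matrix P(t) = e^{Qt} at increasing times; for large t every row approaches the same vector, so the limit no longer depends on the starting state.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical generator (rate) matrix Q for a 3-state chain:
# the off-diagonal entry q_ij is the rate from state i to state j,
# and each row sums to zero.  These rates are purely illustrative.
Q = np.array([
    [-3.0,  2.0,  1.0],
    [ 1.0, -4.0,  3.0],
    [ 2.0,  2.0, -4.0],
])

# For a finite-state chain, P_ij(t) is the (i, j) entry of e^{Qt}.
for t in [0.1, 1.0, 10.0]:
    P_t = expm(Q * t)
    print(f"t = {t:5.1f}")
    print(np.round(P_t, 4))

# By t = 10 the three rows are numerically identical: the chain has
# essentially reached its limiting probabilities P_j.
```
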
For a limiting probability to exist, it is necessary that

$$v_j P_j = \sum_{k \neq j} q_{kj} P_k \quad \text{for all states } j, \qquad \sum_j P_j = 1,$$

where v_j is the rate at which the process leaves state j and q_{kj} is the rate at which it moves from state k to state j. These are known as the balance equations.
This condition may be shown to be sufficient.
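
Below is a minimal sketch (not from the original text) of solving these balance equations directly for the same hypothetical 3-state generator used above. In matrix form the equations say that the row vector of limiting probabilities satisfies P Q = 0 together with the normalization, so a single linear solve recovers the P_j.

```python
import numpy as np

# The same hypothetical generator as above; the diagonal entry q_jj is -v_j,
# the negative of the total rate of leaving state j.
Q = np.array([
    [-3.0,  2.0,  1.0],
    [ 1.0, -4.0,  3.0],
    [ 2.0,  2.0, -4.0],
])

# Balance equations: v_j P_j = sum_{k != j} q_kj P_k for every state j,
# plus sum_j P_j = 1.  Equivalently the row vector P solves P Q = 0,
# so stack the normalization row onto Q^T and solve by least squares.
n = Q.shape[0]
A = np.vstack([Q.T, np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
P, *_ = np.linalg.lstsq(A, b, rcond=None)

print("limiting probabilities:", np.round(P, 4))
# These values match the common row that e^{Qt} settles into for large t.
```
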

We can determine the limiting probabilities for a birth and death process from these equations by equating the rate at which the process leaves each state with the rate at which it enters that state.
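
For a birth and death process with birth rates lambda_n and death rates mu_n, equating the rate of transitions from state n to state n+1 with the rate back from n+1 to n (which follows from the state-by-state balance equations) gives lambda_n P_n = mu_{n+1} P_{n+1}, so each P_{n+1} equals P_n lambda_n / mu_{n+1}, and the P_n are found by normalizing. The sketch below carries this out for made-up rates (a truncated M/M/1-style example); the specific numbers are illustrative only.

```python
import numpy as np

def birth_death_limits(birth, death):
    """Limiting probabilities of a finite birth and death chain.

    birth[n] is the birth rate lambda_n out of state n (n = 0, ..., N-1);
    death[n] is the death rate mu_{n+1} out of state n+1.
    Equating flows between neighbouring states gives
        lambda_n * P_n = mu_{n+1} * P_{n+1},
    so P_{n+1} = P_n * lambda_n / mu_{n+1}; normalizing yields the P_n.
    """
    ratios = np.concatenate([[1.0], np.cumprod(np.asarray(birth) / np.asarray(death))])
    return ratios / ratios.sum()

# Illustrative example: an M/M/1-style queue truncated at 10 customers,
# with arrival rate 1.0 and service rate 2.0.
P = birth_death_limits([1.0] * 10, [2.0] * 10)
print(np.round(P, 4))
# Approximately geometric: P_n is close to (1 - 1/2) * (1/2)**n.
```
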