Markov Analysis

To develop a better understanding of Markov processes, we need to clarify some concepts. Let us list them in the following points:

- A state j is said to be accessible from a state i (i -> j) when Pi->j (n) > 0 for some n >= 0, i.e. there is a non-zero probability of reaching j from i in some number of steps.
- States i and j communicate (i <-> j) if each state is accessible from the other.
- A distribution pi over the states is called stationary if pi T = pi, where T is the transition matrix. Once the chain's state distribution equals pi, it remains pi at every later step.
- A state is said to be a transient state if, starting from it, there is a non-zero probability of never returning to it.
- If a state is not transient, it is a recurrent state: the chain returns to it with probability 1. The recurrence time is defined as the time required to return to this state.
- If Pi->i = 1, the state i is called an absorbing state. Meaning, once arrived, we are never able to leave such a state.
- A Markov Chain is said to be irreducible if it has only one communicating class. That is, it is possible to get from any state to any other state. This is a desirable property, as it simplifies our analysis of limiting behaviour (the behaviour of the process as time goes to infinity).
- Periodicity is the property of a state i having a period k. That is, any return to i can occur only in a number of steps that is a multiple of k; formally, k is the greatest common divisor of all possible return times. If k = 1, the state is called aperiodic.
- Ergodicity is the property of a state i being aperiodic and positive recurrent (its expected recurrence time is finite). Such a state is thus called an ergodic state.
- A Markov Chain is called time-homogeneous when the transition matrix T (n) is independent of n, meaning the evolution of the system does not change over time (assumed so, unless stated otherwise).
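Several of the properties above (accessibility, communication, irreducibility, periodicity, absorption, and the stationary distribution) can be checked mechanically from the transition matrix. The sketch below illustrates this on a small hypothetical 3-state chain; the matrix `P` and all function names are illustrative choices, not part of any standard library.

```python
from math import gcd

# Hypothetical 3-state transition matrix, rows sum to 1.
P = [
    [0.5, 0.5, 0.0],
    [0.2, 0.3, 0.5],
    [0.0, 0.4, 0.6],
]

def accessible(P, i, j):
    """True if state j is accessible from state i (i -> j)."""
    seen, stack = {i}, [i]
    while stack:
        s = stack.pop()
        if s == j:
            return True
        for t, p in enumerate(P[s]):
            if p > 0 and t not in seen:
                seen.add(t)
                stack.append(t)
    return False

def communicates(P, i, j):
    """i <-> j: each state is accessible from the other."""
    return accessible(P, i, j) and accessible(P, j, i)

def is_irreducible(P):
    """One communicating class: every pair of states communicates."""
    n = len(P)
    return all(communicates(P, i, j) for i in range(n) for j in range(n))

def is_absorbing(P, i):
    """Absorbing state: Pi->i = 1, so the chain can never leave i."""
    return P[i][i] == 1.0

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, i, max_n=50):
    """Period of state i: gcd of all n <= max_n with P^n[i][i] > 0."""
    g, M = 0, P
    for n in range(1, max_n + 1):
        if M[i][i] > 1e-12:
            g = gcd(g, n)
        M = mat_mul(M, P)
    return g

def stationary(P, iters=500):
    """Approximate the stationary distribution pi (pi T = pi) by
    repeatedly pushing a uniform start distribution through P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

print(is_irreducible(P))   # True: every state reachable from every other
print(period(P, 0))        # 1: state 0 is aperiodic (P[0][0] > 0)
print(is_absorbing(P, 0))  # False: no Pi->i equals 1 here
```

Because this chain is irreducible and aperiodic, the power iteration in `stationary` converges, which is exactly the simplification of limiting behaviour the irreducibility bullet refers to.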
