Markov Chains
A Markov chain is a sequence of random variables $X_0, X_1, X_2, \dots$ (a.k.a. a stochastic process) satisfying the Markov property: the next state depends only on the current state,

$$P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i).$$
Transition Matrix
A Markov chain can be represented by a transition matrix $P$, whose entry $P_{ij} = P(X_{n+1} = j \mid X_n = i)$ is the probability of moving from state $i$ to state $j$ in one step. Each row of $P$ sums to 1.
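As a minimal sketch, here is a hypothetical two-state transition matrix (the states and probabilities are illustrative, not from the text). It checks that rows sum to 1 and uses the fact that $n$-step transition probabilities are entries of the matrix power $P^n$:

```python
import numpy as np

# Hypothetical 2-state weather chain: state 0 = sunny, state 1 = rainy.
# P[i][j] = probability of moving from state i to state j in one step.
P = np.array([
    [0.9, 0.1],   # sunny -> sunny 0.9, sunny -> rainy 0.1
    [0.5, 0.5],   # rainy -> sunny 0.5, rainy -> rainy 0.5
])

# Every row of a valid transition matrix sums to 1.
row_sums = P.sum(axis=1)
print(row_sums)  # [1. 1.]

# The n-step transition probabilities are the entries of P^n.
P2 = np.linalg.matrix_power(P, 2)
print(P2[0, 1])  # probability sunny -> rainy in exactly 2 steps: 0.14
```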
TIP
The probability of a path $i_0 \to i_1 \to \dots \to i_n$, starting from $i_0$, is the product of the one-step transition probabilities along it: $P_{i_0 i_1} P_{i_1 i_2} \cdots P_{i_{n-1} i_n}$.
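This product can be computed directly; a small sketch with an assumed two-state matrix:

```python
import numpy as np

# Assumed illustrative transition matrix (not from the text).
P = np.array([
    [0.9, 0.1],
    [0.5, 0.5],
])

def path_probability(P, path):
    """Probability of following the given state sequence,
    conditioned on starting in path[0]."""
    prob = 1.0
    for i, j in zip(path, path[1:]):
        prob *= P[i, j]
    return prob

# Path 0 -> 1 -> 1: P[0,1] * P[1,1] = 0.1 * 0.5 = 0.05
p = path_probability(P, [0, 1, 1])
print(p)
```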
States
- Recurrent state : starting from state $i$, the chain returns to $i$ with probability 1.
- Transient state : starting from state $i$, there is a nonzero probability that the chain never returns to $i$.
- Absorbing state : state $i$ is absorbing if it is impossible to leave this state ($P_{ii} = 1$, $P_{ij} = 0$ for $j \neq i$).
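The absorbing-state condition is easy to check mechanically. A sketch with an assumed three-state matrix in which state 2 is absorbing:

```python
import numpy as np

# Hypothetical chain (illustrative values): state 2 is absorbing.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.4, 0.4, 0.2],
    [0.0, 0.0, 1.0],   # P[2][2] = 1: once in state 2, the chain never leaves
])

def absorbing_states(P):
    """Indices i with P[i][i] == 1 (which forces P[i][j] == 0 for j != i,
    since the row sums to 1)."""
    return [i for i in range(len(P)) if P[i, i] == 1.0]

print(absorbing_states(P))  # [2]
```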
Stationary Distribution
The vector $\pi$ is a stationary distribution of the chain if its entries are nonnegative and sum to 1, and it is left unchanged by one step of the chain.
A stationary distribution satisfies $\pi P = \pi$, i.e. $\pi_j = \sum_i \pi_i P_{ij}$ for every state $j$.
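One way to find $\pi$ is to solve the linear system $\pi P = \pi$ together with the normalization $\sum_i \pi_i = 1$. A minimal sketch, assuming the same illustrative two-state matrix as above:

```python
import numpy as np

# Assumed illustrative transition matrix.
P = np.array([
    [0.9, 0.1],
    [0.5, 0.5],
])

# pi P = pi  <=>  (P^T - I) pi = 0. Stack the normalization
# constraint sum(pi) = 1 on top and solve by least squares.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.zeros(n + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)       # [5/6, 1/6] for this matrix
print(pi @ P)   # equals pi: the distribution is unchanged by one step
```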
Absorption Probability
For an absorbing state $s$, let $a_i$ denote the probability of eventually being absorbed in $s$ when starting from state $i$. These probabilities satisfy:
- $a_s = 1$ (if we are in the absorbing state, we are already there).
- $a_i = \sum_j P_{ij} a_j$ for all transient states $i$. This is equivalent to the dot product of the vector $a$ and row $i$ of the transition matrix $P$.
- $a_{s'} = 0$ for every other absorbing state $s' \neq s$. This means that we cannot reach the absorbing state $s$ from a different absorbing state $s'$.
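These conditions reduce to a linear system over the transient states: writing $Q$ for the transient-to-transient block of $P$ and $r_i = P_{is}$, the equations become $(I - Q)\,a_T = r$. A sketch on an assumed 4-state gambler's-ruin chain (states 0 and 3 absorbing, fair coin):

```python
import numpy as np

# Hypothetical gambler's-ruin chain (illustrative, not from the text):
# states 0 and 3 are absorbing, 1 and 2 are transient.
P = np.array([
    [1.0, 0.0, 0.0, 0.0],   # absorbing
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],   # absorbing
])

target = 3                  # solve for absorption in state 3
transient = [1, 2]

# a_i = sum_j P_ij a_j for transient i, with a_target = 1 and a = 0 at the
# other absorbing state. Restricted to transient states: (I - Q) a_T = r.
Q = P[np.ix_(transient, transient)]      # transient-to-transient block
r = P[transient, target]                 # one-step jumps into the target
a_T = np.linalg.solve(np.eye(len(transient)) - Q, r)

print(a_T)  # [1/3, 2/3]: absorption probabilities from states 1 and 2
```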