Markov Chain Analysis

1. **Problem Statement**: Verify properties of and analyze the discrete-time Markov chain with transition matrix
   $$P=\begin{pmatrix} 0.5 & 0.3 & 0.2 \\ 0.1 & 0.6 & 0.3 \\ 0.4 & 0.0 & 0.6 \end{pmatrix}.$$

2. **(a) Validity of the transition matrix:**
   - Every entry lies between 0 and 1, and each row sums to 1:
     - Row 1: $0.5 + 0.3 + 0.2 = 1.0$
     - Row 2: $0.1 + 0.6 + 0.3 = 1.0$
     - Row 3: $0.4 + 0.0 + 0.6 = 1.0$
   - Since all entries are probabilities and all rows sum to 1, $P$ is a valid transition probability (stochastic) matrix.

3. **(b) Compute $P^{2} = P \times P$:** Each element is $p^{(2)}_{ij} = \sum_k p_{ik} p_{kj}$.
   - Row 1:
     - $p^{(2)}_{11} = 0.5 \cdot 0.5 + 0.3 \cdot 0.1 + 0.2 \cdot 0.4 = 0.25 + 0.03 + 0.08 = 0.36$
     - $p^{(2)}_{12} = 0.5 \cdot 0.3 + 0.3 \cdot 0.6 + 0.2 \cdot 0.0 = 0.15 + 0.18 + 0 = 0.33$
     - $p^{(2)}_{13} = 0.5 \cdot 0.2 + 0.3 \cdot 0.3 + 0.2 \cdot 0.6 = 0.10 + 0.09 + 0.12 = 0.31$
   - Row 2:
     - $p^{(2)}_{21} = 0.1 \cdot 0.5 + 0.6 \cdot 0.1 + 0.3 \cdot 0.4 = 0.05 + 0.06 + 0.12 = 0.23$
     - $p^{(2)}_{22} = 0.1 \cdot 0.3 + 0.6 \cdot 0.6 + 0.3 \cdot 0.0 = 0.03 + 0.36 + 0 = 0.39$
     - $p^{(2)}_{23} = 0.1 \cdot 0.2 + 0.6 \cdot 0.3 + 0.3 \cdot 0.6 = 0.02 + 0.18 + 0.18 = 0.38$
   - Row 3:
     - $p^{(2)}_{31} = 0.4 \cdot 0.5 + 0.0 \cdot 0.1 + 0.6 \cdot 0.4 = 0.20 + 0 + 0.24 = 0.44$
     - $p^{(2)}_{32} = 0.4 \cdot 0.3 + 0.0 \cdot 0.6 + 0.6 \cdot 0.0 = 0.12 + 0 + 0 = 0.12$
     - $p^{(2)}_{33} = 0.4 \cdot 0.2 + 0.0 \cdot 0.3 + 0.6 \cdot 0.6 = 0.08 + 0 + 0.36 = 0.44$

   Thus,
   $$P^{2}=\begin{pmatrix} 0.36 & 0.33 & 0.31 \\ 0.23 & 0.39 & 0.38 \\ 0.44 & 0.12 & 0.44 \end{pmatrix}.$$

4. **(c) Classification of states and irreducibility:**
   - Every state can reach every other state with positive probability in some number of steps. The only missing direct transition is $3 \to 2$ (since $p_{32} = 0$), but the path $3 \to 1 \to 2$ has probability $0.4 \times 0.3 > 0$, so all states communicate and the chain is irreducible.
   - No diagonal entry of $P$ equals 1 (no row is a standard unit vector), so there are no absorbing states.
   - Because the chain is finite and irreducible, all states are (positive) recurrent. Since $p_{11} = 0.5 > 0$, the chain is also aperiodic.

5. **(d) Verify $P^{3} = P^{2} P$ using the Chapman–Kolmogorov equation:**
   - Each element is $p^{(3)}_{ij} = \sum_k p^{(2)}_{ik} p_{kj}$.
   - For example, element $(1,1)$: $p^{(3)}_{11} = 0.36 \cdot 0.5 + 0.33 \cdot 0.1 + 0.31 \cdot 0.4 = 0.18 + 0.033 + 0.124 = 0.337$.
   - The remaining entries are computed the same way and agree componentwise, verifying the Chapman–Kolmogorov equation $p^{(m+n)}_{ij} = \sum_k p^{(m)}_{ik} p^{(n)}_{kj}$ for $m = 2$, $n = 1$.

6. **(e) Find the stationary distribution $\pi = (\pi_1, \pi_2, \pi_3)$ such that $\pi P = \pi$ and $\sum_i \pi_i = 1$:**
   - The system $\pi P = \pi$ reads:
     - $\pi_1 = 0.5 \pi_1 + 0.1 \pi_2 + 0.4 \pi_3$
     - $\pi_2 = 0.3 \pi_1 + 0.6 \pi_2 + 0.0 \pi_3$
     - $\pi_3 = 0.2 \pi_1 + 0.3 \pi_2 + 0.6 \pi_3$
   - Use the normalization $\pi_1 + \pi_2 + \pi_3 = 1$.
   - Rearranged:
     - $0 = -0.5 \pi_1 + 0.1 \pi_2 + 0.4 \pi_3$
     - $0 = 0.3 \pi_1 - 0.4 \pi_2$
     - $0 = 0.2 \pi_1 + 0.3 \pi_2 - 0.4 \pi_3$
   - From the second equation: $0.3 \pi_1 = 0.4 \pi_2 \Rightarrow \pi_2 = \frac{0.3}{0.4} \pi_1 = 0.75 \pi_1$.
   - From the third equation: $0.2 \pi_1 + 0.3 \pi_2 = 0.4 \pi_3$. Substituting $\pi_2 = 0.75 \pi_1$:
     $0.2 \pi_1 + 0.225 \pi_1 = 0.4 \pi_3 \Rightarrow 0.425 \pi_1 = 0.4 \pi_3 \Rightarrow \pi_3 = \frac{0.425}{0.4} \pi_1 = 1.0625 \pi_1$.
   - Normalize: $\pi_1 + 0.75 \pi_1 + 1.0625 \pi_1 = 2.8125 \pi_1 = 1 \Rightarrow \pi_1 = \frac{1}{2.8125} = \frac{16}{45} \approx 0.3556$.
   - Then $\pi_2 = 0.75 \pi_1 = \frac{12}{45} \approx 0.2667$ and $\pi_3 = 1.0625 \pi_1 = \frac{17}{45} \approx 0.3778$.
   - **Interpretation:** The stationary distribution gives the long-run probability of finding the chain in each state. (A short numerical check of parts (a)–(e) is sketched below.)
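The arithmetic in parts (a)–(e) is easy to double-check numerically. The following is a minimal NumPy sketch, not part of the original solution and with illustrative variable names: it verifies the row sums, recomputes $P^{2}$, checks the Chapman–Kolmogorov identity for $P^{3}$, and solves for the stationary distribution.

```python
import numpy as np

# Transition matrix from the problem statement
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.0, 0.6]])

# (a) Validity: entries in [0, 1] and every row sums to 1
assert np.all((P >= 0) & (P <= 1))
assert np.allclose(P.sum(axis=1), 1.0)

# (b) Two-step transition probabilities P^2 = P @ P
P2 = P @ P
print("P^2 =\n", np.round(P2, 4))          # matches the hand computation above

# (d) Chapman-Kolmogorov check: P^3 computed two ways must agree
P3_a = P2 @ P                               # (P^2) P
P3_b = P @ P2                               # P (P^2)
assert np.allclose(P3_a, P3_b)
print("p_11^(3) =", round(P3_a[0, 0], 3))   # 0.337, as computed by hand

# (e) Stationary distribution: solve (P^T - I) pi = 0 together with
#     sum(pi) = 1 as an overdetermined (but consistent) linear system.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("pi =", np.round(pi, 4))              # approx [0.3556, 0.2667, 0.3778]
assert np.allclose(pi @ P, pi)              # confirms pi P = pi
```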
7. **(f) Modify $P$ so that state 3 is absorbing:**
   - Set $p_{33} = 1$ and $p_{31} = p_{32} = 0$. The new matrix is
     $$\tilde{P} = \begin{pmatrix} 0.5 & 0.3 & 0.2 \\ 0.1 & 0.6 & 0.3 \\ 0 & 0 & 1 \end{pmatrix}.$$
   - Transient states: 1 and 2; absorbing state: 3.
   - Partition $\tilde{P}$ into the transient block $Q$ and the absorbing part:
     $$Q = \begin{pmatrix} 0.5 & 0.3 \\ 0.1 & 0.6 \end{pmatrix},$$
     and let $I$ denote the $2 \times 2$ identity matrix.
   - Fundamental matrix:
     $$N = (I - Q)^{-1} = \left( \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} - \begin{pmatrix} 0.5 & 0.3 \\ 0.1 & 0.6 \end{pmatrix} \right)^{-1} = \begin{pmatrix} 0.5 & -0.3 \\ -0.1 & 0.4 \end{pmatrix}^{-1}.$$
   - Determinant: $\det(I - Q) = 0.5 \times 0.4 - (-0.3)(-0.1) = 0.20 - 0.03 = 0.17$.
   - Inverse:
     $$N = \frac{1}{0.17} \begin{pmatrix} 0.4 & 0.3 \\ 0.1 & 0.5 \end{pmatrix} \approx \begin{pmatrix} 2.3529 & 1.7647 \\ 0.5882 & 2.9412 \end{pmatrix}.$$
   - (i) **Expected number of steps before absorption, starting from each transient state:**
     $$t = N \mathbf{1} = \begin{pmatrix} 2.3529 + 1.7647 \\ 0.5882 + 2.9412 \end{pmatrix} = \begin{pmatrix} 4.1176 \\ 3.5294 \end{pmatrix},$$
     so the expected time to absorption is about 4.12 steps starting from state 1 and about 3.53 steps starting from state 2 (exactly $70/17$ and $60/17$).
   - (ii) **Probability of eventual absorption:** Since state 3 is the only absorbing state and the chain is finite with states 1 and 2 transient, absorption occurs with probability 1 from either transient state. (A numerical check of this part is sketched after the summary below.)

**Summary:**
- $P$ is a valid transition matrix.
- $P^{2}$ was computed entrywise.
- The chain is irreducible and all states are recurrent.
- The Chapman–Kolmogorov relation $P^{3} = P^{2} P$ was verified.
- The stationary distribution is approximately $(0.3556,\ 0.2667,\ 0.3778)$.
- With state 3 made absorbing, the fundamental matrix $N$ was found.
- Expected times to absorption and absorption probabilities were given.
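As a companion check for part (f), here is a minimal sketch, again assuming NumPy and using illustrative names: it builds $\tilde{P}$, extracts the transient block $Q$, and recomputes the fundamental matrix $N = (I - Q)^{-1}$, the expected absorption times $t = N\mathbf{1}$, and the absorption probabilities $NR$.

```python
import numpy as np

# Modified chain: state 3 (index 2) made absorbing
P_abs = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.0, 0.0, 1.0]])

# Transient block Q (states 1 and 2) and one-step absorption column R
Q = P_abs[:2, :2]
R = P_abs[:2, 2:]

# Fundamental matrix N = (I - Q)^{-1}
N = np.linalg.inv(np.eye(2) - Q)
print("N =\n", np.round(N, 4))            # approx [[2.3529, 1.7647], [0.5882, 2.9412]]

# (i) Expected number of steps before absorption: t = N 1
t = N @ np.ones(2)
print("t =", np.round(t, 4))              # approx [4.1176, 3.5294]

# (ii) Absorption probabilities B = N R; with a single absorbing state
#      each entry should equal 1
B = N @ R
print("B =", np.round(B.ravel(), 4))      # [1.0, 1.0]
```

Here $R$ is the column of $\tilde{P}$ holding the one-step probabilities of moving from a transient state into the absorbing state; the product $NR$ returning all ones confirms that absorption is certain, as argued in part (ii).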