Multiple Continuous Systems
1. The problem asks us to determine all values of \(\alpha \in \mathbb{R}\) such that the system \(\dot{x}(t) = Ax(t)\) has as many modes as possible.
Given \[ A = \begin{bmatrix} -1 & \alpha & 2 \\ 0 & -2 & -1 \\ 0 & 1 & 0 \end{bmatrix} \]
Step 1: Find the characteristic polynomial \(p(\lambda) = \det(\lambda I - A)\).
\[ \lambda I - A = \begin{bmatrix} \lambda + 1 & -\alpha & -2 \\ 0 & \lambda + 2 & 1 \\ 0 & -1 & \lambda \end{bmatrix} \]
Step 2: Compute determinant:
\[ p(\lambda) = (\lambda + 1) \cdot \det \begin{bmatrix} \lambda + 2 & 1 \\ -1 & \lambda \end{bmatrix} = (\lambda + 1) ((\lambda + 2)\lambda - (-1)(1)) = (\lambda + 1)(\lambda^2 + 2\lambda + 1) = (\lambda + 1)^3 \]
Note that \(\alpha\) does not appear in the characteristic polynomial, so eigenvalues are independent of \(\alpha\).
Step 3: Matrix has eigenvalue \(-1\) with algebraic multiplicity 3.
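As a sanity check, a short symbolic computation (a minimal sketch, assuming SymPy is available) confirms that the characteristic polynomial is \((\lambda+1)^3\) independently of \(\alpha\):

```python
# Symbolic cross-check: the characteristic polynomial of A does not depend on alpha.
import sympy as sp

alpha, lam = sp.symbols('alpha lambda')
A = sp.Matrix([[-1, alpha,  2],
               [ 0,    -2, -1],
               [ 0,     1,  0]])
p = (lam * sp.eye(3) - A).det()   # det(lambda*I - A)
print(sp.factor(p))               # prints (lambda + 1)**3; alpha cancels out
```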
Step 4: Here "as many modes as possible" is read as the maximal number of linearly independent eigenvectors, i.e., the maximal geometric multiplicity of \(\lambda = -1\) (which would be 3 if \(A\) were diagonalizable).
Step 5: Compute geometric multiplicity for \(\lambda = -1\).
Step 6: Compute \(A + I\):
\[ A + I = \begin{bmatrix} 0 & \alpha & 2 \\ 0 & -1 & -1 \\ 0 & 1 & 1 \end{bmatrix} \]
Step 7: Find the kernel of \(A + I\). Note that the first column of \(A + I\) is zero, so \(x_1\) is always free and \(e_1 = (1, 0, 0)^T\) is an eigenvector for every \(\alpha\). The remaining equations are:
\[0\cdot x_1 + \alpha x_2 + 2 x_3 = 0 \]
\[0\cdot x_1 - x_2 - x_3 = 0 \Rightarrow x_2 = -x_3\]
\[0\cdot x_1 + x_2 + x_3 = 0 \Rightarrow x_2 = -x_3 \] (consistent)
Substitute \(x_2 = -x_3\) in the first equation:
\[ \alpha (-x_3) + 2 x_3 = 0 \Rightarrow (2 - \alpha) x_3 = 0 \]
Step 8: Two cases:
- If \(\alpha \neq 2\), then \(x_3 = 0 \Rightarrow x_2 = 0\), so \(\ker(A+I) = \operatorname{span}\{e_1\}\) and \(\dim \ker(A+I) = 1\).
- If \(\alpha = 2\), then \(x_3\) is free as well, so \(\dim \ker(A+I) = 2\).
Step 9: Compute \( \ker((A+I)^2) \) to determine the Jordan structure:
For \(\alpha = 2\), a direct computation gives \((A+I)^2 = 0\), so the Jordan form consists of blocks of sizes 2 and 1; for \(\alpha \neq 2\) there is a single Jordan block of size 3. In particular, \(A\) is never diagonalizable, so geometric multiplicity 3 is unattainable.
Step 10: The maximal achievable number of linearly independent eigenvectors is therefore 2, obtained exactly when \(\alpha = 2\).
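A numerical cross-check of the geometric multiplicities (a minimal sketch, assuming NumPy is available; \(\alpha = 0\) stands in for the generic case):

```python
# Rank check: dim ker(A + I) = 3 - rank(A + I) for sample values of alpha.
import numpy as np

for alpha in (2.0, 0.0):
    A = np.array([[-1.0, alpha,  2.0],
                  [ 0.0,  -2.0, -1.0],
                  [ 0.0,   1.0,  0.0]])
    g = 3 - np.linalg.matrix_rank(A + np.eye(3))
    print(f"alpha = {alpha}: dim ker(A + I) = {g}")
# Expected output: 2 for alpha = 2, and 1 for any other alpha.
```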
---
2. The system:
\[ \dot{x} = A x + B u, \quad y = C x \]
with
\[ A = \begin{bmatrix} -1 & 1 \\ -1 & 0 \end{bmatrix}, B = \begin{bmatrix}0 \\ 1\end{bmatrix}, C = \begin{bmatrix}1 & 0\end{bmatrix} \]
Goal: Find \(\alpha\) and \(x(0)\) so that \(u(t) = e^{\alpha t}\) produces \(y(t) = e^{\alpha t}\) (unity gain).
Step 1: Assume solution form \(x(t) = v e^{\alpha t}\).
Step 2: Substitute into system equation:
\[ \alpha v e^{\alpha t} = A v e^{\alpha t} + B e^{\alpha t} \implies (\alpha I - A) v = B \]
Step 3: Compute \(v = (\alpha I - A)^{-1} B\).
Step 4: Compute \(y(t) = C x(t) = C v e^{\alpha t}\).
Step 5: We need \(C v = 1\) (unity gain).
Step 6: Write system:
\[ \alpha I - A = \begin{bmatrix} \alpha + 1 & -1 \\ 1 & \alpha \end{bmatrix} \]
Step 7: Compute inverse:
\[ \det = (\alpha + 1)(\alpha) - (-1)(1) = \alpha^2 + \alpha + 1 \]
\[ (\alpha I - A)^{-1} = \frac{1}{\alpha^2 + \alpha +1} \begin{bmatrix} \alpha & 1 \\ -1 & \alpha + 1 \end{bmatrix} \]
Step 8: Compute \(v = (\alpha I - A)^{-1} B = \frac{1}{\alpha^2 + \alpha + 1} \begin{bmatrix} \alpha \cdot 0 + 1 \cdot 1 \\ -1 \cdot 0 + (\alpha + 1) \cdot 1 \end{bmatrix} = \frac{1}{\alpha^2 + \alpha + 1} \begin{bmatrix} 1 \\ \alpha + 1 \end{bmatrix} \)
Step 9: Compute output gain:
\[ C v = \frac{1}{\alpha^2 + \alpha + 1} [1 \quad 0] \begin{bmatrix} 1 \\ \alpha + 1 \end{bmatrix} = \frac{1}{\alpha^2 + \alpha + 1} \]
Step 10: Set \(\frac{1}{\alpha^2 + \alpha + 1} = 1 \Rightarrow \alpha^2 + \alpha + 1 = 1 \Rightarrow \alpha^2 + \alpha = 0\)
Step 11: Solve for \(\alpha\): \(\alpha(\alpha + 1)=0 \Rightarrow \alpha = 0 \text{ or } \alpha = -1\)
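The gain formula and the two roots can be double-checked symbolically (a minimal sketch, assuming SymPy is available):

```python
# Symbolic check of the gain C (alpha*I - A)^(-1) B and its unity-gain roots.
import sympy as sp

a = sp.symbols('alpha')
A = sp.Matrix([[-1, 1], [-1, 0]])
B = sp.Matrix([0, 1])
C = sp.Matrix([[1, 0]])
gain = sp.simplify((C * (a * sp.eye(2) - A).inv() * B)[0])
print(gain)                          # 1/(alpha**2 + alpha + 1)
print(sp.solve(sp.Eq(gain, 1), a))   # [-1, 0]
```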
Step 12: For \(y(t) = e^{\alpha t}\) exactly (with no transient), the state must start on the particular solution. The general solution is
\[ x(t) = e^{At}\bigl(x(0) - v\bigr) + v e^{\alpha t}, \]
so the transient term vanishes iff \(x(0) = v\). Since \(\alpha^2 + \alpha + 1 = 1\) for both roots, this gives \(x(0) = \begin{bmatrix}1 \\ 1\end{bmatrix}\) for \(\alpha = 0\) and \(x(0) = \begin{bmatrix}1 \\ 0\end{bmatrix}\) for \(\alpha = -1\).
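A simulation sketch (assuming SciPy is available) that integrates the state equation with \(x(0) = v\) and verifies the output tracks \(e^{\alpha t}\) up to integration tolerance:

```python
# Simulate x' = Ax + Bu with u(t) = exp(alpha*t), x(0) = v; check y(t) = exp(alpha*t).
import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[-1.0, 1.0], [-1.0, 0.0]])
B = np.array([0.0, 1.0])
C = np.array([1.0, 0.0])

for alpha, x0 in ((0.0, [1.0, 1.0]), (-1.0, [1.0, 0.0])):
    rhs = lambda t, x: A @ x + B * np.exp(alpha * t)
    sol = solve_ivp(rhs, (0.0, 5.0), x0, rtol=1e-9, atol=1e-12)
    err = np.max(np.abs(C @ sol.y - np.exp(alpha * sol.t)))
    print(f"alpha = {alpha}: max |y(t) - e^(alpha t)| = {err:.1e}")  # ~ 0
```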
---
3. Rabbit pairs problem: each month every productive pair produces two new pairs, and a newborn pair becomes productive the following month.
Step 1: Define states:
- \(x_1(k)\): number of productive pairs at month \(k\)
- \(x_2(k)\): number of new pairs born this month (non-productive)
Step 2: Each month, productive pairs produce 2 new pairs:
\[ x_2(k+1) = 2 x_1(k) \]
Step 3: The new pairs become productive next month:
\[ x_1(k+1) = x_1(k) + x_2(k) \]
Step 4: Write state-space system:
\[ \begin{bmatrix} x_1(k+1) \\ x_2(k+1) \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ 2 & 0 \end{bmatrix} \begin{bmatrix} x_1(k) \\ x_2(k) \end{bmatrix} \]
Step 5: Initial conditions:
\[ x_1(0) = 0, \quad x_2(0) = 1 \] (initially one newborn pair)
Step 6: Find explicit solution:
Find eigenvalues \(\lambda\) of \(A = \begin{bmatrix}1 & 1 \\ 2 & 0\end{bmatrix}\):
\[ \det(\lambda I - A) = (\lambda -1)(\lambda) - 2 = \lambda^2 - \lambda - 2 = 0 \]
Step 7: Solve quadratic:
\[ \lambda = \frac{1 \pm \sqrt{1 + 8}}{2} = \frac{1 \pm 3}{2} \Rightarrow \lambda_1 = 2, \lambda_2 = -1 \]
Step 8: Diagonalize or use linear recurrence:
Step 9: The total number of pairs \(n_k = x_1(k) + x_2(k)\) satisfies
\[ n_{k+1} = x_1(k+1) + x_2(k+1) = \bigl(x_1(k) + x_2(k)\bigr) + 2 x_1(k) = n_k + 2 x_1(k), \]
and since \(x_1(k) = x_1(k-1) + x_2(k-1) = n_{k-1}\), this gives
\[ n_k = n_{k-1} + 2 n_{k-2}, \quad \text{with } n_0 = 1,\ n_1 = 1, \]
consistent with the characteristic equation \(\lambda^2 - \lambda - 2 = 0\) found above.
Step 10: Solve the recurrence explicitly:
\[ n_k = A (2)^k + B (-1)^k \]
Using initial conditions:
\[ n_0 = A + B = 1 \]
\[ n_1 = 2A - B = 1 \]
Solves to:
\[ A = \tfrac{2}{3}, \quad B = \tfrac{1}{3} \]
Step 11: Hence
\[ \boxed{ n_k = \frac{2^{k+1} + (-1)^k}{3} } \]
So the number of pairs at month \(k\) is \(\frac{2^{k+1} + (-1)^k}{3}\), i.e., the sequence \(1, 1, 3, 5, 11, 21, \dots\)
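Iterating the state-space model confirms the closed form (a minimal sketch, assuming NumPy is available):

```python
# Iterate x(k+1) = A x(k) and compare the total n_k with the closed form.
import numpy as np

A = np.array([[1, 1], [2, 0]])
x = np.array([0, 1])  # x1(0) = 0 productive pairs, x2(0) = 1 newborn pair
for k in range(10):
    n_closed = (2 ** (k + 1) + (-1) ** k) // 3
    assert x.sum() == n_closed
    print(f"month {k}: {x.sum()} pairs")
    x = A @ x
```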
---
4. Analyze stability of \(\dot{x} = Ax,\)
where
\[ A = \begin{bmatrix} \alpha & 4 \\ -1 & 2 \end{bmatrix} \]
Step 1: Characteristic polynomial:
\[ p(\lambda) = \det(\lambda I - A) = \begin{vmatrix} \lambda - \alpha & -4 \\ 1 & \lambda - 2 \end{vmatrix} = (\lambda - \alpha)(\lambda - 2) + 4 = \lambda^2 - (\alpha + 2)\lambda + (2 \alpha + 4) \]
Step 2: Asymptotic stability requires all eigenvalues to have negative real part; marginal stability additionally allows eigenvalues on the imaginary axis, provided they have full geometric multiplicity.
Step 3: Routh-Hurwitz for a monic quadratic \(\lambda^2 + a_1 \lambda + a_0\) requires \(a_1 > 0\) and \(a_0 > 0\). Here, from \(p(\lambda) = \lambda^2 - (\alpha + 2)\lambda + (2\alpha + 4)\),
\[ a_1 = -(\alpha + 2), \quad a_0 = 2 \alpha + 4 \]
Rewrite:
- \(a_1 > 0 \Rightarrow \alpha < -2\)
- \(a_0 > 0 \Rightarrow \alpha > -2\)
These conditions are incompatible, so no value of \(\alpha\) makes the system asymptotically stable.
Step 4: Check the only remaining candidate for marginal stability, \(\alpha = -2\), where both coefficients vanish:
\[ p(\lambda) = \lambda^2 - 0 \cdot \lambda + 0 = \lambda^2 = 0 \]
so \(\lambda = 0\) is an eigenvalue of algebraic multiplicity 2.
Step 5: Its geometric multiplicity: at \(\alpha = -2\),
\[ A = \begin{bmatrix} -2 & 4 \\ -1 & 2 \end{bmatrix} \]
has rank 1, so \(\dim \ker A = 1 < 2\): there is a single Jordan block of size 2 at \(\lambda = 0\).
Step 6: Since \(A^2 = 0\), \(e^{At} = I + At\) contains a term growing linearly in \(t\), so the origin is unstable at \(\alpha = -2\) as well.
Step 7: Conclusion: for \(\alpha > -2\) the trace \(\alpha + 2\) is positive, for \(\alpha < -2\) the determinant \(2\alpha + 4\) is negative (a saddle), and at \(\alpha = -2\) there is a non-trivial Jordan block at 0. The origin is unstable for every \(\alpha \in \mathbb{R}\).
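An eigenvalue sweep over a few sample values of \(\alpha\) illustrates the conclusion (a minimal sketch, assuming NumPy is available):

```python
# For every alpha at least one eigenvalue has nonnegative real part.
import numpy as np

for alpha in (-4.0, -2.0, 0.0, 2.0):
    A = np.array([[alpha, 4.0], [-1.0, 2.0]])
    eig = np.linalg.eigvals(A)
    print(f"alpha = {alpha:5.1f}: eigenvalues = {np.round(eig, 3)}")
# alpha = -4: one positive real eigenvalue (saddle);
# alpha = -2: double eigenvalue at 0 (size-2 Jordan block);
# alpha >  -2: eigenvalues with positive real part.
```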
---
5. Analyze the stability of the origin for the system:
\[ \dot{x}_1 = x_1 x_2^2 \]
\[ \dot{x}_2 = -3 x_1^2 x_2 \]
Step 1: Consider Lyapunov function candidate:
\[ V = \frac{1}{2} x_1^2 + \frac{1}{2} x_2^2 \]
Step 2: Compute
\[ \dot{V} = x_1 \dot{x}_1 + x_2 \dot{x}_2 = x_1^2 x_2^2 - 3 x_1^2 x_2^2 = -2 x_1^2 x_2^2 \leq 0 \]
Step 3: \(V\) is positive definite and \( \dot{V} \leq 0 \), so the origin is Lyapunov stable.
Step 4: Check whether \( \dot{V} = 0 \) only at the origin:
\(\dot{V} = 0\) whenever \(x_1 = 0\) or \(x_2 = 0\), i.e., on both coordinate axes, not only at the origin.
Step 5: On either axis the vector field vanishes (\(\dot{x}_1 = \dot{x}_2 = 0\)), so every point of both axes is an equilibrium. By LaSalle's invariance principle, trajectories converge to this set of equilibria, not necessarily to the origin.
Step 6: Since every neighborhood of the origin contains other equilibria, the origin cannot be attractive: it is stable but not asymptotically stable.
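A short simulation (a minimal sketch, assuming SciPy is available) shows \(V\) decreasing while the state settles onto an equilibrium on the \(x_1\)-axis rather than the origin:

```python
# Integrate the nonlinear system; V is nonincreasing, and the trajectory
# converges to a point on the x1-axis (an equilibrium), not to the origin.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, x):
    return [x[0] * x[1] ** 2, -3.0 * x[0] ** 2 * x[1]]

sol = solve_ivp(rhs, (0.0, 50.0), [1.0, 1.0], rtol=1e-9, atol=1e-12)
V = 0.5 * (sol.y[0] ** 2 + sol.y[1] ** 2)
print("V(0) =", V[0], " V(T) =", V[-1])   # V decreases from 1.0 to about 2/3
print("final state:", sol.y[:, -1])        # approx (1.155, 0): on the x1-axis
```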
---
Final answers:
1. \(\boxed{\alpha = 2}\)
2. \(\boxed{\alpha = 0,\ x(0) = (1, 1)^T \ \text{ or } \ \alpha = -1,\ x(0) = (1, 0)^T}\)
3. \(\boxed{n_k = \dfrac{2^{k+1} + (-1)^k}{3}}\)
4. \(\boxed{\text{unstable for every } \alpha \in \mathbb{R}}\); at \(\alpha = -2\) the double eigenvalue at 0 has a size-2 Jordan block, so the origin is not even marginally stable
5. Origin is Lyapunov stable but not asymptotically stable.