Probability Statements
1. **Problem 1: True or False statements about probability and distributions**
(a) Given $B \subset A$ and $P(B) > 0$, check if $P(A|B) \leq P(B|A)$.
- Recall conditional probability: $P(A|B) = \frac{P(A \cap B)}{P(B)}$ and $P(B|A) = \frac{P(A \cap B)}{P(A)}$.
- Since $B \subset A$, $A \cap B = B$.
- So, $P(A|B) = \frac{P(B)}{P(B)} = 1$ and $P(B|A) = \frac{P(B)}{P(A)}$.
- Because $P(B) \leq P(A)$, $\frac{P(B)}{P(A)} \leq 1$.
- Therefore, $P(A|B) = 1 \geq P(B|A)$, with equality only when $P(A) = P(B)$. So the statement $P(A|B) \leq P(B|A)$ is FALSE in general.
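A quick numeric sanity check of part (a); the die example and the particular events are illustrative choices, not part of the problem:

```python
from fractions import Fraction

# Fair six-sided die; B ⊂ A by construction.
A = {1, 2, 3, 4}
B = {1, 2}
N = 6  # size of the sample space

def prob(event):
    return Fraction(len(event), N)

p_A_given_B = prob(A & B) / prob(B)   # P(B)/P(B) = 1
p_B_given_A = prob(A & B) / prob(A)   # P(B)/P(A) = 1/2
```

Here $P(A|B) = 1 > 1/2 = P(B|A)$, a counterexample to the claimed inequality.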
(b) If $A$ and $B$ are independent, then $A^C$ and $B^C$ are also independent.
- Independence means $P(A \cap B) = P(A)P(B)$.
- Using complements, $P(A^C \cap B^C) = 1 - P(A) - P(B) + P(A \cap B)$.
- Substitute independence: $= 1 - P(A) - P(B) + P(A)P(B) = (1 - P(A))(1 - P(B)) = P(A^C)P(B^C)$.
- Hence, $A^C$ and $B^C$ are independent. Statement is TRUE.
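The inclusion-exclusion identity above can be checked with exact arithmetic; the probabilities $P(A) = 1/3$, $P(B) = 1/4$ are arbitrary test values:

```python
from fractions import Fraction

# Arbitrary probabilities for an independent pair.
pA, pB = Fraction(1, 3), Fraction(1, 4)
pAB = pA * pB                          # independence of A and B

# Inclusion-exclusion: P(A^C ∩ B^C) = 1 - P(A) - P(B) + P(A ∩ B)
lhs = 1 - pA - pB + pAB
rhs = (1 - pA) * (1 - pB)              # P(A^C) P(B^C)
```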
(c) For independent $X \sim \text{Bin}(n_1, p_1)$ and $Y \sim \text{Bin}(n_2, p_2)$ with $n_1 \neq n_2$, $p_1 \neq p_2$, check if $X + Y \sim \text{Bin}(n_1 + n_2, p_1 + p_2)$.
- The sum of independent binomial variables is binomial (with summed $n$) only when they share the same success probability $p$.
- Here $p_1 \neq p_2$, so $X+Y$ is not binomial at all; moreover the means do not match, since $E(X+Y) = n_1 p_1 + n_2 p_2 \neq (n_1+n_2)(p_1+p_2)$ in general, and $p_1 + p_2$ may even exceed $1$.
- Statement is FALSE.
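The mean mismatch alone disproves the claim; with illustrative parameters (an assumption for the example):

```python
# If X+Y were Bin(n1+n2, p1+p2), its mean would be (n1+n2)(p1+p2);
# the true mean of the sum is n1*p1 + n2*p2.
n1, p1 = 2, 0.5
n2, p2 = 3, 0.3

true_mean = n1 * p1 + n2 * p2           # E(X+Y) = 1.9
claimed_mean = (n1 + n2) * (p1 + p2)    # 5 * 0.8 = 4.0
```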
(d) For independent $X \sim N(\mu_1, \sigma_1^2)$ and $Y \sim N(\mu_2, \sigma_2^2)$, check if $X + Y \sim N(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2)$.
- Sum of independent normal variables is normal with mean sum and variance sum.
- Statement is TRUE.
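A Monte Carlo sketch of part (d); the seed, parameters, and tolerances are assumptions for the check:

```python
import random
import statistics

random.seed(0)
mu1, s1 = 1.0, 2.0      # X ~ N(1, 4)
mu2, s2 = -0.5, 1.5     # Y ~ N(-0.5, 2.25)
n = 200_000

sums = [random.gauss(mu1, s1) + random.gauss(mu2, s2) for _ in range(n)]

mean = statistics.fmean(sums)       # should be near mu1 + mu2 = 0.5
var = statistics.pvariance(sums)    # should be near s1**2 + s2**2 = 6.25
```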
2. **Problem 2: True or False statements about sample mean and variance distributions**
Given $X_i \sim N(\mu, \sigma^2)$ i.i.d., $\bar{X} = \frac{1}{n} \sum X_i$, and $S^2 = \frac{1}{n-1} \sum (X_i - \bar{X})^2$.
(a) $\bar{X} \sim N(\mu, \frac{\sigma^2}{n})$.
- The sample mean of normal variables is normal with mean $\mu$ and variance $\sigma^2/n$.
- TRUE.
(b) $\frac{(n-1)S^2}{\sigma^2} \sim \chi_n^2$.
- The scaled sample variance follows $\chi_{n-1}^2$ distribution, not $\chi_n^2$.
- FALSE.
(c) $\sum \frac{(X_i - \mu)^2}{\sigma^2} \sim \chi_n^2$.
- Sum of squared standardized normal variables is $\chi_n^2$.
- TRUE.
(d) $\frac{\sqrt{n}(\bar{X} - \mu)}{S} \sim t_{n-1}$.
- By construction of Student's $t$: $\frac{\sqrt{n}(\bar{X}-\mu)}{\sigma} \sim N(0,1)$ is independent of $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}$, so the ratio of the standard normal to $\sqrt{S^2/\sigma^2}$ is $t_{n-1}$.
- TRUE.
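Statements (a)-(c) can be sketched with a Monte Carlo check; the seed, $\mu$, $\sigma$, $n$, and tolerances are arbitrary test values:

```python
import random
import statistics

random.seed(1)
mu, sigma, n, reps = 2.0, 3.0, 5, 50_000

xbar_vals, scaled_s2, chi_n = [], [], []
for _ in range(reps):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    xbar_vals.append(statistics.fmean(x))
    # (n-1) S^2 / sigma^2, where S^2 uses the n-1 divisor
    scaled_s2.append((n - 1) * statistics.variance(x) / sigma**2)
    # sum of squared standardized deviations from the TRUE mean
    chi_n.append(sum((xi - mu) ** 2 for xi in x) / sigma**2)

m_xbar = statistics.fmean(xbar_vals)      # ≈ mu
v_xbar = statistics.pvariance(xbar_vals)  # ≈ sigma^2 / n = 1.8
m_s2 = statistics.fmean(scaled_s2)        # ≈ n - 1 = 4 (chi^2_{n-1}), not n
m_chi = statistics.fmean(chi_n)           # ≈ n = 5 (chi^2_n)
```

The gap between `m_s2` (near $n-1$) and `m_chi` (near $n$) is exactly the degree of freedom lost by estimating $\mu$ with $\bar{X}$.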
3. **Problem 3: Expectation calculations for joint pdf**
Given joint pdf:
$$f(x_1, x_2) = \frac{2}{\theta^2} e^{-\frac{x_1 + x_2}{\theta}}, \quad 0 < x_1 < x_2 < \infty, \theta > 0$$
(a) Find $E(X_1)$.
- First find marginal pdf of $X_1$:
$$f_{X_1}(x_1) = \int_{x_2 = x_1}^\infty f(x_1, x_2) dx_2 = \int_{x_2 = x_1}^\infty \frac{2}{\theta^2} e^{-\frac{x_1 + x_2}{\theta}} dx_2 = \frac{2}{\theta^2} e^{-\frac{x_1}{\theta}} \int_{x_2 = x_1}^\infty e^{-\frac{x_2}{\theta}} dx_2$$
- Evaluate integral:
$$\int_{x_1}^\infty e^{-\frac{x_2}{\theta}} dx_2 = \theta e^{-\frac{x_1}{\theta}}$$
- So,
$$f_{X_1}(x_1) = \frac{2}{\theta^2} e^{-\frac{x_1}{\theta}} \theta e^{-\frac{x_1}{\theta}} = \frac{2}{\theta} e^{-\frac{2x_1}{\theta}}, \quad x_1 > 0$$
- This is an exponential distribution with rate $\lambda = \frac{2}{\theta}$.
- Hence,
$$E(X_1) = \frac{1}{\lambda} = \frac{\theta}{2}$$
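The marginal mean can be double-checked by numerically integrating $x \, f_{X_1}(x)$; the value $\theta = 1.5$ and the grid settings are arbitrary choices for the check:

```python
import math

theta = 1.5

def integrand(x):
    # x * f_{X1}(x) with f_{X1}(x) = (2/theta) * exp(-2x/theta)
    return x * (2 / theta) * math.exp(-2 * x / theta)

# Trapezoidal rule on [0, 50] (the tail beyond 50 is negligible)
h, upper = 1e-4, 50.0
xs = [i * h for i in range(int(upper / h) + 1)]
e_x1 = h * (sum(integrand(x) for x in xs)
            - 0.5 * integrand(xs[0]) - 0.5 * integrand(xs[-1]))
# e_x1 ≈ theta / 2 = 0.75
```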
(b) Find $E(X_2 | X_1 = x_1)$.
- Conditional pdf:
$$f_{X_2|X_1}(x_2|x_1) = \frac{f(x_1, x_2)}{f_{X_1}(x_1)} = \frac{\frac{2}{\theta^2} e^{-\frac{x_1 + x_2}{\theta}}}{\frac{2}{\theta} e^{-\frac{2x_1}{\theta}}} = \frac{1}{\theta} e^{-\frac{x_2 - x_1}{\theta}}, \quad x_2 > x_1$$
- Given $X_1 = x_1$, the excess $X_2 - x_1$ is exponential with rate $1/\theta$ (mean $\theta$), i.e. the conditional distribution is an exponential shifted by $x_1$.
- So,
$$E(X_2 | X_1 = x_1) = x_1 + \theta$$
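A numeric check of the conditional mean by integrating $x_2 \, f_{X_2|X_1}(x_2|x_1)$ over $x_2 > x_1$; the values $x_1 = 0.7$, $\theta = 1.5$ are arbitrary test inputs:

```python
import math

theta, x1 = 1.5, 0.7

def integrand(x2):
    # x2 * f_{X2|X1}(x2|x1) with density (1/theta) * exp(-(x2-x1)/theta)
    return x2 * (1 / theta) * math.exp(-(x2 - x1) / theta)

# Trapezoidal rule on [x1, x1 + 60] (tail beyond is negligible)
h, upper = 1e-4, x1 + 60.0
xs = [x1 + i * h for i in range(int((upper - x1) / h) + 1)]
cond_mean = h * (sum(integrand(x) for x in xs)
                 - 0.5 * integrand(xs[0]) - 0.5 * integrand(xs[-1]))
# cond_mean ≈ x1 + theta = 2.2
```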
(c) Find $E(X_1 X_2)$.
- Use law of total expectation:
$$E(X_1 X_2) = E\left[E(X_1 X_2 \mid X_1)\right] = E\left[X_1 E(X_2 \mid X_1)\right] = E[X_1 (X_1 + \theta)] = E(X_1^2) + \theta E(X_1)$$
- For exponential with rate $\lambda = \frac{2}{\theta}$:
$$E(X_1) = \frac{1}{\lambda} = \frac{\theta}{2}$$
$$Var(X_1) = \frac{1}{\lambda^2} = \frac{\theta^2}{4}$$
- So,
$$E(X_1^2) = Var(X_1) + (E(X_1))^2 = \frac{\theta^2}{4} + \left(\frac{\theta}{2}\right)^2 = \frac{\theta^2}{4} + \frac{\theta^2}{4} = \frac{\theta^2}{2}$$
- Therefore,
$$E(X_1 X_2) = \frac{\theta^2}{2} + \theta \cdot \frac{\theta}{2} = \frac{\theta^2}{2} + \frac{\theta^2}{2} = \theta^2$$
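All three answers can be checked by simulation: the joint pdf above is exactly that of the order statistics $(\min, \max)$ of two i.i.d. exponentials with mean $\theta$, since $2! \cdot \frac{1}{\theta}e^{-x_1/\theta} \cdot \frac{1}{\theta}e^{-x_2/\theta} = \frac{2}{\theta^2}e^{-(x_1+x_2)/\theta}$ on $0 < x_1 < x_2$. The seed, $\theta = 2$, and tolerances are assumptions for the check:

```python
import random

random.seed(2)
theta, reps = 2.0, 200_000

s1 = s12 = 0.0
for _ in range(reps):
    # Two i.i.d. Exp(mean theta) draws; (min, max) has the joint pdf above
    u = random.expovariate(1 / theta)
    v = random.expovariate(1 / theta)
    x1, x2 = min(u, v), max(u, v)
    s1 += x1
    s12 += x1 * x2

e_x1 = s1 / reps       # ≈ theta / 2 = 1.0
e_x1x2 = s12 / reps    # ≈ theta**2 = 4.0
```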
**Final answers:**
- Q1: (a) FALSE, (b) TRUE, (c) FALSE, (d) TRUE
- Q2: (a) TRUE, (b) FALSE, (c) TRUE, (d) TRUE
- Q3: (a) $\frac{\theta}{2}$, (b) $x_1 + \theta$, (c) $\theta^2$