Master Theorem
1. Let's start by stating the problem: we want a systematic way to determine the time complexity of divide-and-conquer algorithms, and the Master Theorem provides exactly that.
2. The Master Theorem applies to recurrences of the form $$T(n) = aT\left(\frac{n}{b}\right) + f(n)$$ where:
- $a \geq 1$ is the number of subproblems,
- $b > 1$ is the factor by which the problem size is divided,
- $f(n)$ is the cost outside the recursive calls.
3. The theorem compares $f(n)$ with $n^{\log_b a}$ to determine the asymptotic behavior of $T(n)$:
- Case 1: If $f(n) = O\left(n^{\log_b a - \epsilon}\right)$ for some $\epsilon > 0$, then $$T(n) = \Theta\left(n^{\log_b a}\right).$$
- Case 2: If $f(n) = \Theta\left(n^{\log_b a} \log^k n\right)$ for some $k \geq 0$, then $$T(n) = \Theta\left(n^{\log_b a} \log^{k+1} n\right).$$
- Case 3: If $f(n) = \Omega\left(n^{\log_b a + \epsilon}\right)$ for some $\epsilon > 0$ and if $af\left(\frac{n}{b}\right) \leq cf(n)$ for some constant $c < 1$ and sufficiently large $n$, then $$T(n) = \Theta(f(n)).$$
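A quick instance of Cases 1 and 3 (Case 2 is worked through in point 5 below):
- $T(n) = 8T\left(\frac{n}{2}\right) + n$: here $n^{\log_2 8} = n^3$ and $f(n) = n = O(n^{3 - \epsilon})$, so Case 1 gives $T(n) = \Theta(n^3)$.
- $T(n) = 2T\left(\frac{n}{2}\right) + n^2$: here $n^{\log_2 2} = n$ and $f(n) = n^2 = \Omega(n^{1 + \epsilon})$; the regularity condition holds since $2\left(\frac{n}{2}\right)^2 = \frac{n^2}{2} \leq \frac{1}{2} n^2$, so Case 3 gives $T(n) = \Theta(n^2)$.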
4. Important rules:
- The Master Theorem only applies if $a \geq 1$ and $b > 1$.
- The function $f(n)$ must be asymptotically positive, i.e., $f(n) > 0$ for all sufficiently large $n$.
- The regularity condition $af\left(\frac{n}{b}\right) \leq cf(n)$ in Case 3 ensures that the root-level cost $f(n)$ genuinely dominates: the total work per level shrinks geometrically as the recursion descends.
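The three-way comparison above can be sketched as a small classifier for the common special case $f(n) = \Theta(n^d)$ with constant $d \geq 0$ (for polynomial $f$, the Case 3 regularity condition holds automatically). The function name and interface here are our own illustration, not a standard library API:

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d) via the Master Theorem.

    Simplified version restricted to polynomial driving functions
    f(n) = Theta(n^d); returns the asymptotic class as a string.
    """
    if a < 1 or b <= 1:
        raise ValueError("Master Theorem requires a >= 1 and b > 1")
    crit = math.log(a, b)               # critical exponent log_b(a)
    if d < crit:                        # Case 1: recursion dominates
        return f"Theta(n^{crit:g})"
    if d == crit:                       # Case 2 (k = 0): balanced
        return f"Theta(n^{d:g} log n)"
    return f"Theta(n^{d:g})"            # Case 3: f(n) dominates
```

For example, merge sort ($a=2$, $b=2$, $d=1$) lands in Case 2, while Strassen's matrix multiplication ($a=7$, $b=2$, $d=2$) lands in Case 1 with exponent $\log_2 7 \approx 2.81$.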
5. Example: Suppose $$T(n) = 2T\left(\frac{n}{2}\right) + n.$$ Here, $a=2$, $b=2$, and $f(n) = n$.
Calculate $n^{\log_b a} = n^{\log_2 2} = n^1 = n$.
Since $f(n) = \Theta(n^{\log_b a})$, this is Case 2 with $k=0$.
Therefore, $$T(n) = \Theta(n \log n).$$
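As a quick sanity check (assuming a base case $T(1) = 1$, which the recurrence above leaves unspecified), we can unroll the recurrence numerically: for $n = 2^k$ it solves exactly to $n \log_2 n + n$, consistent with the $\Theta(n \log n)$ bound.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """Unroll T(n) = 2*T(n/2) + n with the assumed base case T(1) = 1."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# For powers of two the exact closed form is T(n) = n*log2(n) + n,
# so T(n)/(n log n) -> 1, i.e. Theta(n log n).
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * k + n
```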
6. This means the time complexity grows proportionally to $n \log n$.
Understanding these steps will help you analyze many divide-and-conquer algorithms efficiently.