The Master Theorem is a formula that provides a shortcut for solving recurrence relations, specifically those that arise in recursive algorithms that divide a problem into smaller, equally sized subproblems. Instead of expanding and summing up each level of the recursion, the Master Theorem lets us read the time complexity directly off the recurrence relation.
Applicable Problems for the Master Theorem
The Master Theorem is applicable to recurrence relations of the following form:
T(n) = a \cdot T\left(\frac{n}{b}\right) + f(n)
Where:
- a: The number of subproblems generated at each recursive call (a \geq 1).
- b: The factor by which the problem size shrinks in each recursive call (b > 1).
- f(n): The work done outside the recursive calls, such as dividing the input and combining subproblem results.
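To make the roles of a, b, and f(n) concrete, here is a minimal Python sketch that expands such a recurrence numerically. The function name solve_recurrence and the base case T(1) = 1 are illustrative assumptions, not part of the theorem itself.

```python
from functools import lru_cache

def solve_recurrence(n, a, b, f):
    """Numerically expand T(n) = a * T(n/b) + f(n), assuming T(1) = 1."""
    @lru_cache(maxsize=None)
    def T(m):
        if m <= 1:
            return 1  # assumed base case
        return a * T(m // b) + f(m)  # a subproblems of size m/b, plus f(m) work
    return T(n)

# Merge-sort-style recurrence T(n) = 2*T(n/2) + n:
print(solve_recurrence(1024, a=2, b=2, f=lambda m: m))  # 11264 = 1024*log2(1024) + 1024
```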
When to Use the Master Theorem
The Master Theorem is ideal for analyzing algorithms that:
- Divide Problems Equally: The problem size is divided into equal parts in each recursive call, as in merge sort or binary search (see the binary search example after this list).
- Have Uniform Work per Level: The work done at each level, represented by f(n), follows a standard form (e.g., polynomial).
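For instance, binary search fits this form: it makes one recursive call (a = 1) on half the input (b = 2) and does constant work outside the call:
T(n) = T\left(\frac{n}{2}\right) + O(1)
Here n^{\log_b a} = n^{\log_2 1} = n^0 = 1, which matches f(n) = \Theta(1), so the theorem (Case 2 below) gives T(n) = \Theta(\log n).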
When Not to Use It
The Master Theorem does not apply if:
- The subproblems are of unequal size (see the example below).
- The recurrence does not match the required form or has irregular work at different levels.
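As an illustrative example, the recurrence describing a quicksort-like split into a fixed 1/3–2/3 partition,
T(n) = T\left(\frac{n}{3}\right) + T\left(\frac{2n}{3}\right) + O(n)
divides the problem into unequal parts, so the Master Theorem does not apply. Such recurrences are typically handled with the recursion-tree method or the more general Akra–Bazzi theorem.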
Master Theorem Cases
The Master Theorem defines three cases based on how f(n) compares with n^{\log_b a}. Each case yields a different time complexity (a small classifier sketch follows the three cases).
Case 1: f(n) = O(n^{\log_b a - \epsilon}) for some \epsilon > 0
- Meaning: The function f(n) grows slower than n^{\log_b a}.
- Result: The total complexity is dominated by the work done at the leaf nodes of the recursion tree.
- Complexity: T(n) = \Theta(n^{\log_b a})
Case 2: f(n) = \Theta(n^{\log_b a})
- Meaning: The function f(n) grows at the same rate as n^{\log_b a}.
- Result: The work is evenly balanced across levels of the recursion tree.
- Complexity: T(n) = \Theta(n^{\log_b a} \cdot \log n)
Case 3: f(n) = \Omega(n^{\log_b a + \epsilon}) for some \epsilon > 0
- Meaning: The function f(n) grows faster than n^{\log_b a}.
- Regularity condition: a \cdot f\left(\frac{n}{b}\right) \leq c \cdot f(n) must hold for some constant c < 1 and all sufficiently large n; otherwise this case does not apply.
- Result: The work done at the root dominates the total complexity.
- Complexity: T(n) = \Theta(f(n))
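To make the case analysis concrete, here is a minimal sketch that applies the theorem when f(n) is a plain polynomial \Theta(n^d). The function name classify and the restriction to polynomial f(n) are assumptions made for illustration; the theorem itself allows more general f(n).

```python
import math

def classify(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d) by the Master Theorem.

    For polynomial f(n) = n^d, Case 3's regularity condition
    a * (n/b)^d <= c * n^d holds automatically when d > log_b(a),
    since then a / b**d < 1.
    """
    crit = math.log(a, b)  # critical exponent log_b(a)
    if d < crit:
        return f"Case 1: Theta(n^{crit:g})"
    if d == crit:  # exact float comparison is fine for these small examples
        return f"Case 2: Theta(n^{crit:g} * log n)"
    return f"Case 3: Theta(n^{d})"

print(classify(2, 2, 1))  # merge sort        -> Case 2: Theta(n^1 * log n)
print(classify(8, 2, 2))  # 8-way split, n^2  -> Case 1: Theta(n^3)
print(classify(1, 2, 1))  # one half, linear  -> Case 3: Theta(n^1)
```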
Example: Analyzing Merge Sort with the Master Theorem
Merge sort is a classic divide-and-conquer algorithm that sorts an array by splitting it into two halves, sorting each half, and then merging them. Let’s analyze its time complexity using the Master Theorem.
Merge Sort Recurrence Relation
The recurrence relation for merge sort is:
T(n) = 2 \cdot T\left(\frac{n}{2}\right) + O(n)
- a = 2: Each level generates two subproblems.
- b = 2: The input size is halved at each recursive level.
- f(n) = O(n): Merging two halves requires linear work (see the sketch after this list).
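To tie the recurrence to actual code, here is a minimal merge sort sketch: the two recursive calls are the a = 2 subproblems of size n/b = n/2, and the merge loop is the linear f(n) work.

```python
def merge_sort(arr):
    """Sort a list; cost follows T(n) = 2*T(n/2) + O(n)."""
    if len(arr) <= 1:                 # base case: constant work
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])      # first of a = 2 subproblems
    right = merge_sort(arr[mid:])     # second subproblem, size n/2
    # f(n) = O(n): merge the two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```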
Applying the Master Theorem
- Calculate n^{\log_b a}:
n^{\log_b a} = n^{\log_2 2} = n^1 = n
- Identify the case: since f(n) = \Theta(n) matches n^{\log_b a} = n, this corresponds to Case 2.
- Apply the Case 2 result:
T(n) = \Theta(n \cdot \log n)
Thus, the time complexity of merge sort is \Theta(n \log n).
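As a quick sanity check, one can expand the recurrence directly and compare it against n \log_2 n. The base case T(1) = 1 here is an assumption for illustration.

```python
import math

def T(n):
    """Expand T(n) = 2*T(n/2) + n directly, with assumed base case T(1) = 1."""
    return 1 if n <= 1 else 2 * T(n // 2) + n

for n in (2**8, 2**12, 2**16):
    print(n, T(n), round(T(n) / (n * math.log2(n)), 3))
# The ratio T(n) / (n log2 n) approaches 1, consistent with Theta(n log n).
```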
Summary of the Master Theorem
| Case | Condition | Complexity |
|---|---|---|
| Case 1 | f(n) = O(n^{\log_b a - \epsilon}) | T(n) = \Theta(n^{\log_b a}) |
| Case 2 | f(n) = \Theta(n^{\log_b a}) | T(n) = \Theta(n^{\log_b a} \cdot \log n) |
| Case 3 | f(n) = \Omega(n^{\log_b a + \epsilon}) | T(n) = \Theta(f(n)) |