The Recurrence Relation Method allows us to express the time complexity of a recursive algorithm as a mathematical equation, or recurrence relation, based on the size of the input.
Recurrence relations are equations that express the time complexity of a recursive function in terms of its own cost on smaller inputs. This method is useful for algorithms where each recursive call breaks the problem down into smaller subproblems until a base case is reached.
Example: Factorial Function
Let’s analyze the time complexity of a recursive function that calculates the factorial of a number, n!. The factorial of n is defined as the product of all positive integers up to n. We can calculate it using a recursive approach as shown below.
Factorial Code
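The listing below is a minimal Python sketch of this recursive approach; the function name factorial and the exact structure are assumptions.

```python
def factorial(n):
    # Base case: 0! and 1! are both 1, which stops the recursion
    if n <= 1:
        return 1
    # Recursive case: n! = n * (n - 1)!
    return n * factorial(n - 1)


print(factorial(5))  # 120
```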
- Base Case: If n is 0 or 1, the function returns 1. This is the stopping condition, preventing infinite recursion.
- Recursive Case: For values of n greater than 1, the function calls itself with n - 1, multiplying the result by n each time.
Setting Up the Recurrence Relation
To analyze the time complexity of this recursive function, let’s set up a recurrence relation.
Let:
- T(n) represent the time complexity of calculating the factorial of n.
The recurrence relation for factorial(n) is:
T(n) = T(n - 1) + O(1)
Here:
- T(n - 1): Represents the time complexity of the recursive call, where the function computes factorial(n - 1).
- O(1): Represents the constant time needed for the multiplication operation n \times \text{factorial}(n - 1).
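Including the stopping condition, the complete recurrence is:
T(1) = O(1)
T(n) = T(n - 1) + O(1), for n > 1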
Solving the Recurrence Relation
Let’s expand this relation step-by-step, substituting until we reach the base case T(1):
T(n) = T(n - 1) + O(1)
= T(n - 2) + 2 \times O(1)
= T(n - 3) + 3 \times O(1)
= \dots
= T(1) + (n - 1) \times O(1)
Since T(1) is a constant, the total complexity of T(n) becomes:
T(n) = O(n)
The time complexity of the factorial function is O(n), meaning the function requires linear time relative to the input size n.
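To see this linear growth in practice, we can count the recursive calls directly. The helper below is an illustrative sketch; the call counter is not part of the factorial algorithm itself.

```python
def factorial_with_count(n, count=1):
    # Returns (n!, number of calls made); the counter only illustrates growth
    if n <= 1:
        return 1, count
    result, count = factorial_with_count(n - 1, count + 1)
    return n * result, count


for n in (5, 10, 20):
    value, calls = factorial_with_count(n)
    print(f"n = {n}: {calls} calls")  # the number of calls grows linearly with n
```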
Binary Search Recurrence Relation Analysis
In the previous lesson, we found the time complexity of binary search using the recursion tree. Here, we will find the time complexity of the binary search algorithm using the Recurrence Relation Method.
In binary search, each recursive call reduces the search range by half. This halving process continues until the range is reduced to a single element, or the base case is reached. Let’s break down the time complexity of this algorithm by setting up a recurrence relation.
Binary Search
Here’s the Python code for binary search, along with comments explaining each step.
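The version shown here is a representative recursive implementation; the exact function signature is an assumption.

```python
def binary_search(arr, target, low, high):
    # Base case: the search range is empty, so the target is not present
    if low > high:
        return -1
    # O(1) work: compute the middle index and compare it with the target
    mid = (low + high) // 2
    if arr[mid] == target:
        return mid
    elif arr[mid] < target:
        # The target can only be in the right half
        return binary_search(arr, target, mid + 1, high)
    else:
        # The target can only be in the left half
        return binary_search(arr, target, low, mid - 1)


numbers = [1, 3, 5, 7, 9, 11, 13]
print(binary_search(numbers, 9, 0, len(numbers) - 1))  # 4
```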
Setting Up the Recurrence Relation
To analyze the time complexity, let’s define:
- T(n) as the time complexity for binary search on an array of size n.
Since each call reduces the search range by half, we can represent this in our recurrence relation as follows:
T(n) = T\left(\frac{n}{2}\right) + O(1)
Where:
- T\left(\frac{n}{2}\right): Represents the time complexity for a recursive call on half the original input size.
- O(1): Represents the constant time taken to calculate the middle index and perform the comparison with the target.
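Including the base case, the complete recurrence is:
T(1) = O(1)
T(n) = T\left(\frac{n}{2}\right) + O(1), for n > 1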
Expanding the Recurrence Relation
To solve for T(n), let’s expand the recurrence relation by substituting until we reach the base case:
T(n) = T\left(\frac{n}{2}\right) + O(1)
= T\left(\frac{n}{4}\right) + 2 \times O(1)
= T\left(\frac{n}{8}\right) + 3 \times O(1)
= \dots
= T\left(\frac{n}{2^k}\right) + k \times O(1)
The recursion stops when the input size is reduced to 1, so we set \frac{n}{2^k} = 1. Solving for k, we get:
n = 2^k \Rightarrow k = \log_2(n)
Thus, the recursion proceeds through O(\log n) levels.
Final Complexity Calculation
Since each level requires O(1) work and there are O(\log n) levels, the total time complexity of binary search is:
T(n) = O(1) \times O(\log n) = O(\log n)
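To check the logarithmic growth concretely, we can count how many recursive calls the worst case makes for different input sizes. The helper below is an illustrative sketch that mirrors the binary search logic while tracking the call count.

```python
def binary_search_calls(arr, target, low, high, calls=1):
    # Same halving logic as binary search, but also returns the number of calls made
    if low > high:
        return -1, calls
    mid = (low + high) // 2
    if arr[mid] == target:
        return mid, calls
    elif arr[mid] < target:
        return binary_search_calls(arr, target, mid + 1, high, calls + 1)
    else:
        return binary_search_calls(arr, target, low, mid - 1, calls + 1)


for size in (16, 1024, 1_000_000):
    data = list(range(size))
    _, calls = binary_search_calls(data, -1, 0, size - 1)  # worst case: target absent
    print(f"n = {size}: {calls} calls")  # roughly log2(n) + 1 calls
```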