Grokking Algorithm Complexity and Big-O
Sorting Algorithms

In this lesson, we’ll explore four different sorting algorithms:

  • Bubble Sort
  • Merge Sort
  • Quick Sort
  • Counting Sort

We’ll provide code for each algorithm, followed by a detailed complexity analysis. Each algorithm will be evaluated based on its time and space complexity, helping you understand when to use each one.

1. Bubble Sort

Bubble Sort is a simple comparison-based sorting algorithm that repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order.

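A minimal Python sketch of Bubble Sort (the loop variables i and j match those referenced in the complexity analysis below):

```python
def bubble_sort(arr):
    """Sort arr in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(arr)
    for i in range(n):                  # outer loop: runs N times
        for j in range(n - i - 1):      # inner loop: N - i - 1 comparisons
            if arr[j] > arr[j + 1]:
                # Swap adjacent elements that are out of order
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr
```

For example, `bubble_sort([5, 1, 4, 2])` returns `[1, 2, 4, 5]`. After pass i, the largest i + 1 elements have "bubbled" to the end of the array.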

Time Complexity for Bubble Sort

  • Outer loop: The outer loop runs N times, where N is the length of the input array. This contributes a time complexity of O(N) for the outer loop.
  • Inner loop (nested): For each iteration of the outer loop, the inner loop runs N - i - 1 times: N - 1 times in the first iteration, N - 2 in the second, and so on. The total number of comparisons made can be expressed as:

(N-1) + (N-2) + (N-3) + \dots + 1 = \frac{N(N-1)}{2}

This simplifies to O(N^2).
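As a quick sanity check of the closed form, for N = 6 the per-pass comparison counts 5 + 4 + 3 + 2 + 1 sum to 15, which equals 6 · 5 / 2:

```python
N = 6
# Comparisons per outer-loop pass: (N-1) + (N-2) + ... + 1 + 0
total = sum(N - i - 1 for i in range(N))
closed_form = N * (N - 1) // 2
print(total, closed_form)  # 15 15
```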

Overall time complexity: O(N^2). The nested loops dominate the runtime, resulting in quadratic time complexity.

Space Complexity for Bubble Sort

  • The algorithm only uses a few variables (i, j, and temporary variables for swapping), which require constant space.
  • No additional data structures are used that depend on the input size.

Overall space complexity: O(1). Bubble Sort is an in-place sorting algorithm, so it only requires a constant amount of extra space.

2. Merge Sort

Merge Sort is a divide-and-conquer algorithm that splits the array into halves, recursively sorts each half, and then merges them back together in sorted order.

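A minimal Python sketch of Merge Sort (the mergeSort name and the arr[:mid] / arr[mid:] slices match what the analysis below refers to):

```python
def mergeSort(arr):
    # Base case: arrays of length 0 or 1 are already sorted
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = mergeSort(arr[:mid])    # recursively sort the left half
    right = mergeSort(arr[mid:])   # recursively sort the right half

    # Merge step: combine the two sorted halves into one sorted array
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # append whatever remains of either half
    merged.extend(right[j:])
    return merged
```

For example, `mergeSort([38, 27, 43, 3])` returns `[3, 27, 38, 43]`. Using `<=` in the merge keeps equal elements in their original order, making the sort stable.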

Time Complexity for Merge Sort

  1. Divide Step:

    • The mergeSort function splits the array into two halves.
    • Time Complexity for Division: computing mid is O(1), but the slices arr[:mid] and arr[mid:] copy elements, contributing O(N) work per recursion level.
    • This division is repeated at each of the approximately \log N levels of recursion for an input of size N.
  2. Recursive Calls:

    • The function recursively calls mergeSort on both the left and right halves.
    • Number of Levels: The array is divided until each subarray has one element, resulting in approximately \log N levels of recursion.
  3. Merge Step:

    • The merging process combines the two already-sorted halves into a single sorted array.
    • Time Complexity for Merging: O(N) at each level since all elements are processed once during merging.
    • Each level processes the entire array of size N.

Overall Time Complexity:

  • The merging process runs O(N) at each level, and there are \log N levels of recursion: O(N \log N)

Space Complexity for Merge Sort

  1. Auxiliary Space for Left and Right Halves:

    • At each level of recursion, space is used for creating the left and right subarrays (arr[:mid] and arr[mid:]).
    • This leads to O(N) additional space for splitting arrays at each level of recursion.
  2. Recursive Call Stack:

    • The depth of the recursion stack is O(\log N) due to the binary splitting.
    • Each call adds a frame to the stack, but these do not contribute to more than O(\log N) space.

Overall Space Complexity:

  • The merge process and auxiliary space for storing left and right halves contribute to O(N).
  • The recursive stack contributes O(\log N) space.

Total Space Complexity: O(N)

3. Quick Sort

Quick Sort is a highly efficient sorting algorithm that picks a pivot and partitions the array into subarrays, sorting them recursively.

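A minimal Python sketch of Quick Sort using the left / middle / right partitioning the analysis below describes (a middle-element pivot is assumed here):

```python
def quickSort(arr):
    # Base case: arrays of length 0 or 1 are already sorted
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]                # pivot choice: middle element
    left = [x for x in arr if x < pivot]      # elements smaller than the pivot
    middle = [x for x in arr if x == pivot]   # elements equal to the pivot
    right = [x for x in arr if x > pivot]     # elements larger than the pivot
    return quickSort(left) + middle + quickSort(right)
```

This list-building partition is easy to read; production quicksorts usually partition in place instead, avoiding the O(N) auxiliary lists discussed in the space analysis below.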

Time Complexity for Quick Sort

  1. Partition Step:

    • The array is partitioned into left, middle, and right subarrays based on the pivot.
    • Time Complexity for Partitioning: O(N) for each call as it requires iterating through the array to build the subarrays left, middle, and right.
  2. Recursive Calls:

    • The quickSort function is recursively called on left and right subarrays.
    • Best and Average Case:
      • If the pivot splits the array into two roughly equal parts, each call processes \frac{N}{2} elements, leading to \log N levels of recursion.
      • Total Time Complexity: O(N \log N).
    • Worst Case:
      • If the pivot is always the smallest or largest element (e.g., a first-element pivot on an already sorted or reverse-sorted array), one partition holds N-1 elements, leading to N levels of recursion.
      • Total Time Complexity in Worst Case: O(N^2).

Overall Time Complexity:

  • Best and Average Case: O(N \log N).
  • Worst Case: O(N^2).

Space Complexity for Quick Sort

  1. Auxiliary Space:

    • The partitioning step creates new subarrays (left, middle, right) at each level, requiring additional space proportional to N for each partition.
  2. Recursive Call Stack:

    • The depth of the recursion stack is O(\log N) in the best and average cases, but can be O(N) in the worst case for highly unbalanced splits.

Overall Space Complexity:

  • Best and Average Case: O(\log N) for the recursion stack.
  • Worst Case: O(N) due to the maximum recursion depth.

Total Space Complexity:

  • Auxiliary Space: O(N) for the subarrays (left, middle, right) created during partitioning; an in-place partitioning scheme avoids this and needs only the recursion stack.

4. Counting Sort

Counting Sort is a non-comparison-based sorting algorithm that counts the occurrences of each element and uses this count to place them in the correct position.

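A minimal Python sketch of Counting Sort for non-negative integers (the max(arr), count, and sorted_arr names follow the analysis below):

```python
def counting_sort(arr):
    if not arr:
        return []
    k = max(arr)                  # O(N): find the maximum value K
    count = [0] * (k + 1)         # O(K): one slot per possible value
    for x in arr:                 # O(N): tally occurrences of each value
        count[x] += 1
    sorted_arr = []
    for value, c in enumerate(count):   # O(K + N): emit each value c times
        sorted_arr.extend([value] * c)
    return sorted_arr
```

For example, `counting_sort([4, 2, 2, 8, 3])` returns `[2, 2, 3, 4, 8]`. Note that no element comparisons are made; the values themselves index the count array.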

Time Complexity for Counting Sort

  1. Finding the Maximum Element (max(arr)):

    • Time Complexity: O(N) – This step iterates through the array once to find the maximum value.
  2. Creating the Count Array:

    • Time Complexity: O(K), where K is the value of the maximum element in the array. This step initializes an array of size K + 1 to zero.
  3. Counting the Occurrences:

    • The for loop that iterates through arr and increments the count in the count array runs in:
      • Time Complexity: O(N) – It iterates through each element in the input array once.
  4. Building the Sorted Array:

    • The second for loop iterates over the count array and extends sorted_arr based on the counts.
    • Time Complexity: O(K + N) – The loop runs K times, and extending sorted_arr takes O(N) across all iterations.

Overall Time Complexity:

  • The total time complexity is: O(N + K)

    This makes Counting Sort efficient when K (the range of input values) is not significantly larger than N.
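For intuition, compare the count-array sizes for two inputs (the values here are purely illustrative):

```python
# N = 5, K = 5: a count array of size 6 -- counting sort is a great fit
compact = [3, 1, 5, 2, 4]

# N = 3, K = 1_000_000: a million-entry count array for just three elements
sparse = [7, 1_000_000, 42]

print(max(compact) + 1)  # 6
print(max(sparse) + 1)   # 1000001
```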

Space Complexity for Counting Sort

  1. Count Array:

    • The count array is created with a size of K + 1, where K is the maximum element in arr.
    • Space Complexity: O(K) – This space depends on the range of input values.
  2. Sorted Array:

    • The sorted_arr is created to store the sorted output.
    • Space Complexity: O(N) – It stores the sorted elements of the input array.

Overall Space Complexity:

  • The total space complexity is: O(N + K)

Which Sorting Algorithm is Better?

  • Bubble Sort: Simple to implement but inefficient for large datasets (O(N^2)).
  • Merge Sort: Reliable O(N \log N) performance but uses O(N) space.
  • Quick Sort: Efficient for most cases with O(N \log N) average time, but the worst case is O(N^2). An in-place implementation needs only O(\log N) stack space on average.
  • Counting Sort: Ideal for small ranges of integers; O(N + K) time, but uses O(N + K) space.

Best Overall Choice: Quick Sort for general cases due to its efficiency and average O(N \log N) time. Merge Sort is a close second for guaranteed O(N \log N) performance, especially when space is not a concern.
