Time complexity is a measure of the amount of time an algorithm takes to complete as a function of the input size. It helps in analyzing and predicting the performance of algorithms.
This guide will walk you through common time complexities, how to calculate them, and examples of problems where these complexities arise.
An algorithm with constant time complexity will always execute in the same amount of time regardless of the input size.
- Accessing an element of an array by its index, e.g. `arr[5]`.
- Pushing an element onto or popping it off a stack (and, similarly, enqueuing or dequeuing on a queue).
```python
# Example of O(1)
def constant_example(arr):
    return arr[0]  # Accessing an element by index is O(1)
```
Algorithms with logarithmic time complexity reduce the problem size by a constant factor (typically half) with each step. This often happens in divide-and-conquer algorithms.
- Binary Search
- Finding an element in a balanced binary search tree (BST)
```python
# Example of O(log n)
def binary_search(arr, target):
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1    # Discard the lower half
        else:
            high = mid - 1   # Discard the upper half
    return -1                # Target not found
```
Linear time complexity means that the running time increases linearly with the size of the input. This is common in algorithms that involve iterating over the input once.
- Finding the maximum element in an array.
- Linear Search
```python
# Example of O(n)
def find_max(arr):
    max_val = arr[0]
    for num in arr:
        if num > max_val:
            max_val = num
    return max_val
```
Linearithmic time complexity, O(n log n), arises in algorithms that repeatedly divide the input (about log n times) and do linear work at each level. Many efficient comparison-based sorting algorithms exhibit this complexity.
- Merge Sort
- Quick Sort (average case; a sketch follows the Merge Sort example below)
```python
# Example of O(n log n)
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # Halving gives log n levels of recursion
    right = merge_sort(arr[mid:])
    return merge(left, right)       # Merging does O(n) work per level

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] < right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])   # Append whatever remains of either half
    result.extend(right[j:])
    return result
```
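Quick Sort reaches the same O(n log n) bound on average. Below is a minimal sketch (the middle-element pivot and the name `quick_sort` are illustrative choices, not the only way to implement it); if the pivot consistently splits the input unevenly, the worst case degrades to O(n^2).

```python
# Quick sort sketch: average case O(n log n), worst case O(n^2)
def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]                 # Illustrative pivot choice
    less = [x for x in arr if x < pivot]       # One O(n) pass per list
    equal = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)
```

This list-building style trades extra memory for clarity; in-place partitioning is the more common production approach.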
Quadratic time complexity often arises from algorithms that involve nested loops, where each element in the input is compared to every other element.
- Bubble Sort
- Insertion Sort
- Checking for duplicates in an unsorted array using brute force (a sketch follows the Bubble Sort example below).
```python
# Example of O(n^2)
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr
```
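The brute-force duplicate check from the examples list works the same way: a nested loop that compares every pair (the name `has_duplicates` is an illustrative choice).

```python
# Brute-force duplicate check: compares all ~n^2/2 pairs, so O(n^2)
def has_duplicates(arr):
    n = len(arr)
    for i in range(n):
        for j in range(i + 1, n):   # Each pair is compared once
            if arr[i] == arr[j]:
                return True
    return False
```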
Cubic time complexity typically arises in algorithms that involve three nested loops.
- Multiplying two matrices.
- Solving the all-pairs shortest path problem with the Floyd-Warshall algorithm (a sketch follows the matrix example below).
```python
# Example of O(n^3)
def matrix_multiply(A, B):
    n = len(A)
    result = [[0 for _ in range(n)] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                result[i][j] += A[i][k] * B[k][j]
    return result
```
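For the Floyd-Warshall example above, here is a minimal sketch. It assumes the graph is given as an n x n matrix of edge weights with `float('inf')` marking missing edges (a common representation, not the only one).

```python
# Floyd-Warshall: three nested loops over n vertices, so O(n^3)
def floyd_warshall(dist):
    n = len(dist)
    dist = [row[:] for row in dist]   # Copy so the input is not mutated
    for k in range(n):                # k = allowed intermediate vertex
        for i in range(n):
            for j in range(n):
                # Is the path i -> k -> j shorter than the best i -> j so far?
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```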
Exponential time complexity typically arises in recursive algorithms where the number of recursive calls doubles with each level of recursion. These algorithms become impractical even for moderately sized inputs.
- Recursive calculation of Fibonacci numbers.
- Generating all subsets (the power set) of a set; a sketch follows the Fibonacci example below.
```python
# Example of O(2^n)
def fibonacci(n):
    if n <= 1:
        return n
    else:
        return fibonacci(n-1) + fibonacci(n-2)
```
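For the power-set example above, here is a minimal sketch (the name `power_set` is illustrative): each element either joins a subset or does not, so n elements yield 2^n subsets.

```python
# Power set: each element doubles the number of subsets -> 2^n in total
def power_set(items):
    subsets = [[]]
    for item in items:
        # Every existing subset forks into "without item" and "with item"
        subsets += [subset + [item] for subset in subsets]
    return subsets
```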
Factorial time complexity usually arises in problems that involve generating all permutations or combinations of a set.
- Solving the Traveling Salesman Problem by generating all possible routes.
- Generating all permutations of a string (a recursive sketch follows the example below).
```python
# Example of O(n!)
from itertools import permutations

def all_permutations(s):
    return list(permutations(s))
```
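itertools hides the recursion, so here is a hand-rolled sketch that makes the factorial growth visible (the name `permute` is an illustrative choice): n choices for the first position, n - 1 for the second, and so on.

```python
# Recursive permutations: n * (n-1) * (n-2) * ... = n! results
def permute(s):
    if len(s) <= 1:
        return [s]
    result = []
    for i, ch in enumerate(s):
        rest = s[:i] + s[i + 1:]        # The string without the chosen char
        for tail in permute(rest):
            result.append(ch + tail)
    return result
```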
To calculate the time complexity of an algorithm:
- Identify the basic operations: Focus on the fundamental operation (e.g., comparison, addition) that contributes the most to the running time.
- Analyze loops:
  - A single loop contributes O(n).
  - Nested loops contribute O(n^2), O(n^3), etc.
- Recursive algorithms: Use recurrence relations (e.g., the Master Theorem) to calculate the complexity of divide-and-conquer algorithms; a worked example follows this list.
- Drop constants and non-dominant terms: When expressing time complexity, focus on the term that grows the fastest as input size increases.
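As a worked example of these rules (the function name `count_equal_pairs` is just an illustrative choice), count the dominant operations, then drop constants and non-dominant terms:

```python
# Nested loop: n * n comparisons -> O(n^2)
# Single loop: n additions       -> O(n)
# Total: O(n^2 + n) = O(n^2) once the non-dominant term is dropped
def count_equal_pairs(arr):
    n = len(arr)
    pairs = 0
    for i in range(n):            # n iterations
        for j in range(n):        # n iterations each -> n * n comparisons
            if arr[i] == arr[j]:
                pairs += 1
    total = 0
    for num in arr:               # n more iterations, dominated by the n^2 term
        total += num
    return pairs, total
```

For the recursion rule, Merge Sort's recurrence T(n) = 2T(n/2) + O(n) resolves to O(n log n) by the Master Theorem.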
Common pitfalls when analyzing complexity:
- Confusing worst-case, best-case, and average-case complexities (see the Linear Search example after this list).
- Overlooking hidden constant factors (they matter for practical performance but are ignored in asymptotic analysis).
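To make the first pitfall concrete, the same function can have different best-, average-, and worst-case bounds. Linear Search (mentioned earlier) is the classic illustration:

```python
# Linear search: best case O(1) (match at index 0),
# worst case O(n) (match at the end, or absent),
# average case about n/2 comparisons, which is still O(n)
def linear_search(arr, target):
    for i, val in enumerate(arr):
        if val == target:
            return i
    return -1
```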
Time complexity is a crucial aspect of algorithm design and analysis. It helps developers evaluate the efficiency of their algorithms and choose the best one for a given problem. While constant time algorithms are the most efficient, many real-world problems require trade-offs between time complexity and space complexity.
Understanding time complexity allows you to make informed decisions and optimize your code for better performance.
LeetCode problems covered in this repository:
- 0004-median-of-two-sorted-arrays
- 0014-longest-common-prefix
- 0048-rotate-image
- 0054-spiral-matrix
- 0121-best-time-to-buy-and-sell-stock
- 0136-single-number
- 0137-single-number-ii
- 0167-two-sum-ii-input-array-is-sorted
- 0169-majority-element
- 0209-minimum-size-subarray-sum
- 0238-product-of-array-except-self
- 0260-single-number-iii
- 0273-integer-to-english-words
- 0392-is-subsequence
- 0678-valid-parenthesis-string
- 0812-rotate-string
- 0870-magic-squares-in-grid
- 0921-spiral-matrix-iii
- 1013-fibonacci-number
- 1537-maximum-score-after-splitting-a-string
- 1557-check-if-a-string-contains-all-binary-codes-of-size-k
- 1615-range-sum-of-sorted-subarray-sums
- 1737-maximum-nesting-depth-of-the-parentheses
- 1806-count-of-matches-in-tournament
- 1916-find-center-of-star-graph
- 2163-kth-distinct-string-in-an-array
- 3276-minimum-number-of-pushes-to-type-word-ii