Growth Rates and Asymptotic Analysis
Chapter 3 • Intermediate
Asymptotic Notation
Asymptotic notation describes the limiting behavior of functions as input size grows.
Big-O Notation (Upper Bound)
Definition: f(n) = O(g(n)) if there exist constants c > 0 and n₀ > 0 such that:
f(n) ≤ c·g(n) for all n ≥ n₀
Meaning: f(n) grows at most as fast as g(n)
Examples:
- 3n² + 2n + 1 = O(n²)
- 5n + 10 = O(n)
- 2ⁿ + n² = O(2ⁿ)
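The first example can be checked numerically. A minimal sketch, choosing c = 6 and n₀ = 1 as witness constants (one valid choice among many, since 3n² + 2n + 1 ≤ 3n² + 2n² + n² = 6n² for n ≥ 1):

```python
def f(n):
    return 3 * n**2 + 2 * n + 1

# Witness constants: c = 6, n0 = 1
c, n0 = 6, 1

# f(n) <= c * n^2 holds for every tested n >= n0
assert all(f(n) <= c * n**2 for n in range(n0, 10_000))
```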
Big-Ω Notation (Lower Bound)
Definition: f(n) = Ω(g(n)) if there exist constants c > 0 and n₀ > 0 such that:
f(n) ≥ c·g(n) for all n ≥ n₀
Meaning: f(n) grows at least as fast as g(n)
Examples:
- n² = Ω(n)
- n log n = Ω(n)
- 2ⁿ = Ω(n²)
Big-Θ Notation (Tight Bound)
Definition: f(n) = Θ(g(n)) if f(n) = O(g(n)) and f(n) = Ω(g(n))
Meaning: f(n) grows at the same rate as g(n)
Examples:
- 2n² + 3n = Θ(n²)
- n log n + n = Θ(n log n)
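An informal way to see a Θ bound: the ratio f(n)/g(n) settles near a positive constant as n grows. A quick sketch for the first example:

```python
def f(n):
    return 2 * n**2 + 3 * n

# f(n)/n^2 approaches the constant 2, consistent with f(n) = Θ(n²)
ratios = [f(n) / n**2 for n in (10, 100, 1000, 10_000)]
# ratios: [2.3, 2.03, 2.003, 2.0003]
```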
Common Growth Rates
Ordered from slowest-growing (fastest to run) to fastest-growing (slowest to run):
- O(1): Constant time
- Array access, hash table lookup
- O(log n): Logarithmic
- Binary search, balanced tree operations
- O(n): Linear
- Single loop through array
- O(n log n): Linearithmic
- Merge sort, heap sort, efficient sorting
- O(n²): Quadratic
- Nested loops, bubble sort, selection sort
- O(n³): Cubic
- Three nested loops, naive matrix multiplication
- O(2ⁿ): Exponential
- Recursive Fibonacci, subset generation
- O(n!): Factorial
- Permutations, traveling salesman (brute force)
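The gap between these classes is already dramatic at small inputs. A sketch tabulating rough operation counts at n = 20:

```python
import math

growth = {
    "O(log n)":   lambda n: math.log2(n),
    "O(n)":       lambda n: n,
    "O(n log n)": lambda n: n * math.log2(n),
    "O(n^2)":     lambda n: n**2,
    "O(2^n)":     lambda n: 2**n,
}

# At n = 20: ~4.3, 20, ~86, 400, then 1,048,576
for name, fn in growth.items():
    print(f"{name:>10}: {fn(20):,.0f}")
```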
Analyzing Algorithms
Example 1: Linear Search
def linear_search(arr, target):
    # Scan left to right; return the index of the first match
    for i in range(len(arr)):
        if arr[i] == target:
            return i
    return -1  # target not present
- Best case: O(1) - target at first position
- Worst case: O(n) - target not found or at end
- Average case: O(n) - about n/2 comparisons when each position is equally likely
Example 2: Nested Loops
def print_pairs(arr):
    # Print every unordered pair exactly once
    for i in range(len(arr)):
        for j in range(i + 1, len(arr)):
            print(arr[i], arr[j])
- Time: O(n²) - n(n-1)/2 pairs
- Space: O(1)
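The n(n-1)/2 count can be verified directly by counting the iterations of the same nested loops:

```python
def count_pairs(arr):
    """Count iterations of the nested loops in print_pairs."""
    count = 0
    for i in range(len(arr)):
        for j in range(i + 1, len(arr)):
            count += 1
    return count

n = 10
assert count_pairs(list(range(n))) == n * (n - 1) // 2  # 45 pairs for n = 10
```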
Example 3: Recursive Function
def fibonacci(n):
    # Naive recursion: each call spawns two more,
    # so the call tree grows exponentially
    if n <= 1:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)
- Time: O(2ⁿ) - exponential growth (the tight bound is Θ(φⁿ), φ ≈ 1.618)
- Space: O(n) - recursion stack depth
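For contrast, caching results collapses the call tree to O(n) time. A standard memoization sketch using functools.lru_cache:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    # Each n is computed once, so time drops to O(n);
    # the cache and recursion stack keep space at O(n)
    if n <= 1:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

assert fib_memo(50) == 12586269025  # instant; the naive version is infeasible here
```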
Master Theorem
For recurrences of the form: T(n) = aT(n/b) + f(n)
Where:
- a ≥ 1 (number of subproblems)
- b > 1 (subproblem size reduction)
- f(n) is the work done outside recursion
Case 1: If f(n) = O(n^(log_b a - ε)) for some ε > 0
- Then T(n) = Θ(n^(log_b a))
Case 2: If f(n) = Θ(n^(log_b a))
- Then T(n) = Θ(n^(log_b a) · log n)
Case 3: If f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and af(n/b) ≤ cf(n) for some c < 1 (the regularity condition)
- Then T(n) = Θ(f(n))
Examples
- T(n) = 2T(n/2) + n
- a=2, b=2, f(n)=n
- log_b a = 1, f(n) = Θ(n¹)
- Case 2: T(n) = Θ(n log n)
- T(n) = 4T(n/2) + n²
- a=4, b=2, f(n)=n²
- log_b a = 2, f(n) = Θ(n²)
- Case 2: T(n) = Θ(n² log n)
- T(n) = T(n/2) + 1
- a=1, b=2, f(n)=1
- log_b a = 0, f(n) = Θ(1)
- Case 2: T(n) = Θ(log n)
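The first recurrence can be checked empirically. A sketch that unrolls T(n) = 2T(n/2) + n with the assumed base case T(1) = 1, restricted to powers of two; Case 2 predicts Θ(n log n), so T(n)/(n log₂ n) should stabilize:

```python
import math

def T(n):
    # T(n) = 2T(n/2) + n, T(1) = 1, for n a power of two
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# Closed form here is T(n) = n log2(n) + n, so the ratio tends to 1
for k in (8, 12, 16):
    n = 2**k
    print(n, T(n) / (n * math.log2(n)))
```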
Space Complexity
Space complexity measures memory usage:
- Auxiliary space: Extra space used by algorithm
- Total space: Input space + auxiliary space
Examples
- O(1): Constant space (iterative algorithms)
- O(n): Linear space (arrays, recursion stack)
- O(n²): Quadratic space (2D arrays)
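The recursive Fibonacci above needs O(n) stack space; an iterative sketch computes the same values in O(1) auxiliary space, which is the contrast the last practice problem below asks about:

```python
def fib_iter(n):
    # Two rolling variables replace the recursion stack:
    # O(n) time, O(1) auxiliary space
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

assert fib_iter(10) == 55
```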
Practice Problems
- Analyze time complexity of binary search
- Find complexity of merge sort using Master Theorem
- Compare space complexity of iterative vs recursive Fibonacci
- Determine complexity of nested loops with different bounds
- Apply Master Theorem to various recurrences