Miscellaneous
Complexity Analysis
How to analyze time and space complexity
Big O Notation
Big O describes the upper bound of growth rate as input size approaches infinity. We drop constants and lower-order terms.
| Complexity | Name | Example |
|---|---|---|
| O(1) | Constant | Array access, hash lookup |
| O(log n) | Logarithmic | Binary search |
| O(n) | Linear | Single loop |
| O(n log n) | Linearithmic | Merge sort, heap sort |
| O(n^2) | Quadratic | Nested loops |
| O(2^n) | Exponential | Subsets, backtracking |
| O(n!) | Factorial | Permutations |
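To make the classes concrete, here is a small sketch (my own illustration, not part of the original table) tabulating raw operation counts at n = 16:

```python
import math

# Operation counts for each complexity class at n = 16 (illustrative only).
n = 16
counts = {
    "O(1)": 1,
    "O(log n)": int(math.log2(n)),        # 4
    "O(n)": n,                            # 16
    "O(n log n)": n * int(math.log2(n)),  # 64
    "O(n^2)": n ** 2,                     # 256
    "O(2^n)": 2 ** n,                     # 65536
    "O(n!)": math.factorial(n),           # ~2.1 * 10^13
}
for name, count in counts.items():
    print(f"{name:>10}: {count}")
```

Even at n = 16, the exponential and factorial rows dwarf everything else, which is why constants and lower-order terms can be dropped without changing the picture.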
Time Complexity
Single Loop
```python
for i in range(n):  # O(n)
    pass            # O(1) work
```

Nested Loops
```python
for i in range(n):      # O(n)
    for j in range(n):  # O(n)
        pass            # O(1) work
# Total: O(n^2)
```

Dependent loops:
```python
for i in range(n):      # O(n)
    for j in range(i):  # i iterations, n/2 on average
        pass            # O(1) work
# Total: O(n^2) — sum of 0+1+...+(n-1) = n(n-1)/2
```

Loop with Multiplicative Step
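The closed form can be checked empirically; this sketch counts the inner-loop iterations of the dependent loop directly:

```python
def dependent_loop_count(n):
    """Count inner-loop iterations of: for i in range(n): for j in range(i)."""
    count = 0
    for i in range(n):
        for j in range(i):
            count += 1
    return count

# Matches n(n-1)/2 exactly:
print(dependent_loop_count(10))  # 45
print(10 * 9 // 2)               # 45
```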
```python
i = 1
while i < n:
    i *= 2  # doubles each iteration
# Iterations: log₂(n) → O(log n)
```

Loop with Division
```python
while n > 0:
    n //= 2  # halves each iteration
# Iterations: log₂(n) → O(log n)
```

Multiple Sequential Loops
```python
for i in range(n):  # O(n)
    pass
for i in range(m):  # O(m)
    pass
# Total: O(n + m)
```

Recursion
Linear recursion:
```python
def f(n):
    if n <= 0:
        return
    f(n - 1)  # single recursive call
# O(n) calls
```

Binary recursion (divide and conquer):
```python
def f(n):
    if n <= 1:
        return
    f(n // 2)
    f(n // 2)
# Recurrence: T(n) = 2T(n/2) + O(1)
# By the Master Theorem: O(n)
```

Exponential recursion:
```python
def fib(n):
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)
# O(2^n) — two branches at each level
```

Master Theorem
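For contrast, memoization (a standard improvement, not part of the note above) collapses the exponential call tree to O(n) time, since each fib(k) is computed once:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    """Memoized Fibonacci: O(n) time, O(n) space for cache and stack."""
    if n <= 1:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(50))  # 12586269025 — instant, vs. astronomically slow naive fib(50)
```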
For recurrences of the form T(n) = aT(n/b) + O(n^d):
| Condition | Complexity |
|---|---|
| d > log_b(a) | O(n^d) |
| d = log_b(a) | O(n^d log n) |
| d < log_b(a) | O(n^(log_b(a))) |
Common examples:
| Algorithm | Recurrence | Result |
|---|---|---|
| Binary search | T(n) = T(n/2) + O(1) | O(log n) |
| Merge sort | T(n) = 2T(n/2) + O(n) | O(n log n) |
| Karatsuba | T(n) = 3T(n/2) + O(n) | O(n^1.585) |
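The merge-sort case can be sanity-checked numerically; this sketch evaluates the recurrence T(n) = 2T(n/2) + n directly and compares it with n·log₂(n):

```python
import math

def T(n):
    """Evaluate T(n) = 2T(n/2) + n with T(1) = 1 (merge-sort shape)."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# For powers of two, T(n) = n * (log2(n) + 1), i.e. Θ(n log n).
for n in (2 ** 4, 2 ** 10):
    print(n, T(n), int(n * math.log2(n)))
```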
Space Complexity
Space complexity = auxiliary space only (extra space used by the algorithm).
Exclude:
- Input space (given to you)
- Output space (required regardless of approach)
Examples
```python
def sum_array(A):
    total = 0  # O(1) auxiliary
    for v in A:
        total += v
    return total
# Space: O(1)
```

```python
def get_squares(A):
    return [v * v for v in A]  # output array
# Space: O(1) — output doesn't count
```

```python
def merge_sort(A):
    if len(A) <= 1:
        return A
    mid = len(A) // 2
    left = merge_sort(A[:mid])   # creates copy
    right = merge_sort(A[mid:])  # creates copy
    return merge(left, right)    # merge() interleaves two sorted lists (definition omitted)
# Space: O(n) — temporary arrays at each level
```

Recursion Stack Space
Recursive calls use stack space:
```python
def factorial(n):
    if n <= 1:
        return 1
    return n * factorial(n - 1)
# Space: O(n) — n stack frames
```

Tail recursion (if the language optimizes it): O(1) space.
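Since CPython does not perform tail-call optimization, the practical way to get O(1) space here is an explicit loop (a standard rewrite, shown as a sketch):

```python
def factorial_iter(n):
    """Same O(n) time as the recursive version, but O(1) auxiliary space
    and no recursion-depth limit."""
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result

print(factorial_iter(5))  # 120
```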
Data Structure Operations
| Structure | Access | Search | Insert | Delete |
|---|---|---|---|---|
| Array | O(1) | O(n) | O(n) | O(n) |
| Dynamic Array | O(1) | O(n) | O(1)* | O(n) |
| Linked List | O(n) | O(n) | O(1) | O(1) |
| Hash Table | — | O(1)* | O(1)* | O(1)* |
| BST (balanced) | — | O(log n) | O(log n) | O(log n) |
| Heap | — | O(n) | O(log n) | O(log n) |
*Amortized or average case
Amortized Analysis
Amortized cost = average cost per operation over a sequence of operations.
Dynamic Array (ArrayList)
- Doubling strategy: when full, allocate 2x space and copy
- Single insert can be O(n) (when resizing)
- But n inserts cost O(n) total → amortized O(1) per insert

Proof: after n inserts, total copy operations = 1 + 2 + 4 + ... < 2n, so the total work is O(n) and each insert averages O(1).
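The bound can be observed directly with a toy doubling array (DoublingArray is a hypothetical class built for this illustration):

```python
class DoublingArray:
    """Minimal dynamic array that doubles capacity when full."""

    def __init__(self):
        self.data = [None]   # capacity 1
        self.size = 0
        self.copies = 0      # total elements copied during resizes

    def append(self, x):
        if self.size == len(self.data):
            # Double the capacity; concatenation copies all existing slots.
            self.data = self.data + [None] * len(self.data)
            self.copies += self.size
        self.data[self.size] = x
        self.size += 1

arr = DoublingArray()
n = 1000
for i in range(n):
    arr.append(i)
print(arr.copies)  # 1023: resizes copy 1 + 2 + 4 + ... + 512 < 2n elements
```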
Two-Pointer Patterns
```python
left, right = 0, 0
while right < n:
    # expand window
    right += 1
    while invalid:  # pseudocode: shrink until the window is valid again
        left += 1
# Each pointer moves at most n times
# Total: O(n), not O(n^2)
```

Common Pitfalls
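A concrete instance of the pattern (my own example, not from the notes): longest substring without repeating characters, where each pointer advances at most n times:

```python
def longest_unique(s):
    """Length of the longest substring of s with no repeated characters, O(n)."""
    seen = set()
    left = best = 0
    for right, ch in enumerate(s):
        while ch in seen:        # shrink window until it is valid again
            seen.remove(s[left])
            left += 1
        seen.add(ch)
        best = max(best, right - left + 1)
    return best

print(longest_unique("abcabcbb"))  # 3 ("abc")
```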
| Code | Appears | Actually |
|---|---|---|
| str += char in loop | O(n) | O(n^2) — strings are immutable |
| list.insert(0, x) in loop | O(n) | O(n^2) — shifts all elements |
| x in list in loop | O(n) | O(n^2) — use a set for O(1) lookup |
| Slicing A[i:j] | O(1) | O(j - i) — creates copy |
| list.sort() / sorted() | O(1) space | O(n) space — Timsort uses auxiliary storage |
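The "x in list" pitfall in particular is worth a sketch: converting the lookup side to a set turns an O(n·m) loop into O(n + m):

```python
def common_count(a, b):
    """Count elements of a that also appear in b."""
    lookup = set(b)  # O(m) once; then O(1) average per membership test
    return sum(1 for x in a if x in lookup)

print(common_count([1, 2, 3, 4], [3, 4, 5]))  # 2
```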
Quick Reference
Identify the pattern:
| Pattern | Time |
|---|---|
| Fixed iterations | O(1) |
| Halving/doubling | O(log n) |
| Single pass | O(n) |
| Sort then process | O(n log n) |
| Nested dependent loops | O(n^2) |
| All pairs | O(n^2) |
| All subsets | O(2^n) |
| All permutations | O(n!) |