
Complexity Analysis

How to analyze time and space complexity

Big O Notation

Big O describes an upper bound on the growth rate of running time (or space) as the input size n approaches infinity. We drop constants and lower-order terms: for example, 3n^2 + 5n + 7 is O(n^2).

| Complexity | Name | Example |
|---|---|---|
| O(1) | Constant | Array access, hash lookup |
| O(log n) | Logarithmic | Binary search |
| O(n) | Linear | Single loop |
| O(n log n) | Linearithmic | Merge sort, heap sort |
| O(n^2) | Quadratic | Nested loops |
| O(2^n) | Exponential | Subsets, backtracking |
| O(n!) | Factorial | Permutations |

Time Complexity

Single Loop

for i in range(n):  # O(n)
    # O(1) work

Nested Loops

for i in range(n):      # O(n)
    for j in range(n):  # O(n)
        # O(1) work
# Total: O(n^2)

Dependent loops:

for i in range(n):          # O(n)
    for j in range(i):      # runs i times
        # O(1) work
# Total: 0 + 1 + ... + (n-1) = n(n-1)/2 → O(n^2)
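To sanity-check the arithmetic, the body executions can be counted directly; a small sketch (the function name is illustrative):

```python
def dependent_loop_count(n):
    """Count how many times the O(1) body of the dependent loops runs."""
    count = 0
    for i in range(n):
        for j in range(i):  # inner loop runs i times
            count += 1
    return count  # 0 + 1 + ... + (n-1) = n(n-1)/2
```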

Loop with Multiplicative Step

i = 1
while i < n:
    i *= 2  # doubles each iteration
# Iterations: log₂(n) → O(log n)

Loop with Division

while n > 0:
    n //= 2  # halves each iteration
# Iterations: log₂(n) → O(log n)
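Both loops can be instrumented to verify their iteration counts; a quick sketch (function names are illustrative):

```python
def doubling_iterations(n):
    """Iterations of: i = 1; while i < n: i *= 2."""
    i, count = 1, 0
    while i < n:
        i *= 2
        count += 1
    return count  # ceil(log2(n)) for n > 1

def halving_iterations(n):
    """Iterations of: while n > 0: n //= 2."""
    count = 0
    while n > 0:
        n //= 2
        count += 1
    return count  # floor(log2(n)) + 1 for n >= 1
```

Both counts grow as O(log n); the off-by-one details never matter asymptotically.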

Multiple Sequential Loops

for i in range(n):    # O(n)
    pass
for i in range(m):    # O(m)
    pass
# Total: O(n + m)

Recursion

Linear recursion:

def f(n):
    if n <= 0:
        return
    f(n - 1)  # single recursive call
# O(n) calls

Binary recursion (divide and conquer):

def f(n):
    if n <= 1:
        return
    f(n // 2)
    f(n // 2)
# Recurrence: T(n) = 2T(n/2) + O(1)
# By Master Theorem: O(n)

Exponential recursion:

def fib(n):
    if n <= 1:
        return n
    return fib(n-1) + fib(n-2)
# O(2^n) upper bound — two branches per call (the tight bound is O(φ^n) ≈ O(1.618^n))
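The blowup comes from recomputing the same subproblems; caching them drops the cost to O(n). A memoized sketch (not part of the original snippet):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    # Each distinct n is computed once: O(n) time, O(n) space for the cache
    if n <= 1:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)
```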

Master Theorem

For recurrences of the form T(n) = aT(n/b) + O(n^d):

| Condition | Complexity |
|---|---|
| d > log_b a | O(n^d) |
| d = log_b a | O(n^d log n) |
| d < log_b a | O(n^(log_b a)) |

Common examples:

| Algorithm | Recurrence | Result |
|---|---|---|
| Binary search | T(n) = T(n/2) + O(1) | O(log n) |
| Merge sort | T(n) = 2T(n/2) + O(n) | O(n log n) |
| Karatsuba | T(n) = 3T(n/2) + O(n) | O(n^1.58) |
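These results can be sanity-checked empirically, e.g. by counting the calls made under the T(n) = 2T(n/2) + O(1) recurrence from the binary-recursion example (a sketch; the function name is illustrative):

```python
def divide_and_conquer_calls(n):
    """Total calls for the recurrence T(n) = 2T(n/2) + O(1)."""
    if n <= 1:
        return 1
    # one call at this level plus two identical subproblems of size n/2
    return 1 + 2 * divide_and_conquer_calls(n // 2)

# For n a power of two this is exactly 2n - 1 calls: Theta(n),
# matching the Master Theorem case d < log_b a (here a=2, b=2, d=0).
```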

Space Complexity

Space complexity = auxiliary space only (extra space used by the algorithm).

Exclude:

  • Input space (given to you)
  • Output space (required regardless of approach)

Examples

def sum_array(A):
    total = 0           # O(1) auxiliary
    for v in A:
        total += v
    return total
# Space: O(1)

def get_squares(A):
    return [v * v for v in A]  # output array
# Space: O(1) auxiliary — the output doesn't count

def merge_sort(A):
    if len(A) <= 1:
        return A
    mid = len(A) // 2
    left = merge_sort(A[:mid])   # creates copy
    right = merge_sort(A[mid:])  # creates copy
    return merge(left, right)    # merge() combines two sorted lists
# Space: O(n) — temporary arrays at each level

Recursion Stack Space

Recursive calls use stack space:

def factorial(n):
    if n <= 1:
        return 1
    return n * factorial(n - 1)
# Space: O(n) — n stack frames

Tail recursion, if the language optimizes it: O(1) space. (CPython does not perform tail-call optimization, so the recursive version above stays O(n).)
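Since CPython does not optimize tail calls, the O(1)-space version is written with a loop; a sketch:

```python
def factorial_iter(n):
    # Same result as the recursive version, but a single loop variable:
    # O(1) auxiliary space instead of O(n) stack frames.
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result
```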

Data Structure Operations

| Structure | Access | Search | Insert | Delete |
|---|---|---|---|---|
| Array | O(1) | O(n) | O(n) | O(n) |
| Dynamic Array | O(1) | O(n) | O(1)* | O(n) |
| Linked List | O(n) | O(n) | O(1) | O(1) |
| Hash Table | N/A | O(1)* | O(1)* | O(1)* |
| BST (balanced) | O(log n) | O(log n) | O(log n) | O(log n) |
| Heap | O(n) | O(n) | O(log n) | O(log n) |

*Amortized or average case

Amortized Analysis

Amortized cost = average cost per operation over a sequence of operations.

Dynamic Array (ArrayList)

  • Doubling strategy: when full, allocate 2x the space and copy
  • A single insert can be O(n) (when resizing triggers a copy)
  • But n inserts cost O(n) total → O(1) amortized per insert

Proof: After n inserts, total copy operations = 1 + 2 + 4 + ... + n < 2n = O(n).
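The bound is easy to verify by simulation; a sketch that tracks copies under the doubling strategy (names are illustrative):

```python
def simulate_doubling_inserts(n):
    """Append n items to a capacity-doubling array; return total element copies."""
    capacity, size, copies = 1, 0, 0
    for _ in range(n):
        if size == capacity:
            copies += size      # copy every existing element into the new array
            capacity *= 2
        size += 1
    return copies

# Total copies stay below 2n, so each insert is O(1) amortized.
```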

Two-Pointer Patterns

left, right = 0, 0
while right < n:
    # expand window
    right += 1
    while invalid:  # pseudocode: window violates the constraint
        # shrink window
        left += 1
# Each pointer moves at most n times
# Total: O(n), not O(n^2)
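As a concrete instance of this pattern (the problem choice is illustrative): the longest window of non-negative numbers with sum at most k.

```python
def longest_subarray_at_most(A, k):
    """Longest window of non-negative A with sum <= k.
    O(n): each pointer only ever advances."""
    left = window_sum = best = 0
    for right, v in enumerate(A):
        window_sum += v                 # expand window
        while window_sum > k:           # shrink until valid again
            window_sum -= A[left]
            left += 1
        best = max(best, right - left + 1)
    return best
```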

Common Pitfalls

| Code | Appears | Actually |
|---|---|---|
| `str += char` in loop | O(n) | O(n^2) — strings are immutable |
| `list.insert(0, x)` in loop | O(n) | O(n^2) — shifts all elements |
| `x in list` in loop | O(n) | O(n^2) — use a set for O(1) lookup |
| Slicing `A[i:j]` | O(1) | O(j-i) — creates a copy |
| `list.sort()` / `sorted()` | O(1) space | O(n) space — Timsort uses auxiliary storage |
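For the first pitfall, the standard fix is str.join, which assembles the result in a single O(n) pass; a sketch:

```python
def build_quadratic(chars):
    s = ""
    for c in chars:
        s += c              # each += may copy the whole string: O(n^2) worst case
    return s

def build_linear(chars):
    return "".join(chars)   # single pass over all characters: O(n)
```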

Quick Reference

Identify the pattern:

| Pattern | Time |
|---|---|
| Fixed iterations | O(1) |
| Halving/doubling | O(log n) |
| Single pass | O(n) |
| Sort then process | O(n log n) |
| Nested dependent loops | O(n^2) |
| All pairs | O(n^2) |
| All subsets | O(2^n) |
| All permutations | O(n!) |
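For the O(2^n) row, enumerating subsets with bitmasks makes the count explicit; a sketch:

```python
def all_subsets(items):
    """Enumerate all 2^n subsets of items via bitmasks."""
    n = len(items)
    subsets = []
    for mask in range(1 << n):  # 2^n masks, one per subset
        subsets.append([items[i] for i in range(n) if mask & (1 << i)])
    return subsets
```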
