Time and Space Complexity

When we write an algorithm, we want it to be fast and use little memory. But “fast” is relative. Does it take 1 second or 1 hour? Does it run fast on a supercomputer but slow on a phone?

To compare algorithms objectively, we use Asymptotic Analysis, primarily represented by Big O Notation. This measures how the runtime or memory usage grows as the input size (n) increases.

1. Time Complexity

Time complexity isn’t about seconds or milliseconds. It’s about the number of operations an algorithm performs relative to the input size.

  • If an algorithm takes 10 steps for 10 items, and 20 steps for 20 items, it grows linearly.
  • If it takes 100 steps for 10 items, and 400 steps for 20 items, it grows quadratically (much faster!).

Common Time Complexities

Notation     Name           Example                    Growth Rate
O(1)         Constant       Accessing an array index   Fastest
O(log n)     Logarithmic    Binary Search              Very Fast
O(n)         Linear         Loop through a list        Fast
O(n log n)   Linearithmic   Merge Sort                 Decent
O(n²)        Quadratic      Nested loops               Slow
O(2ⁿ)        Exponential    Recursive Fibonacci        Very Slow
O(n!)        Factorial      Traveling Salesperson      Extremely Slow
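To make the O(log n) row concrete, here is a minimal binary search sketch in Java (the class and method names are illustrative). Each iteration halves the remaining search range, so at most about log₂(n) comparisons are needed:

```java
public class BinarySearchDemo {
    // Binary search on a sorted array: O(log n) time, O(1) space.
    static int binarySearch(int[] arr, int target) {
        int lo = 0, hi = arr.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;        // midpoint, written to avoid overflow
            if (arr[mid] == target) return mid;  // found it
            if (arr[mid] < target) lo = mid + 1; // discard the left half
            else hi = mid - 1;                   // discard the right half
        }
        return -1; // not found
    }

    public static void main(String[] args) {
        int[] sorted = {1, 3, 5, 7, 9, 11};
        System.out.println(binarySearch(sorted, 7)); // index 3
        System.out.println(binarySearch(sorted, 4)); // -1
    }
}
```

Doubling the array from 1,000 to 2,000 elements adds only one extra comparison, which is why O(log n) sits near the top of the table.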

2. Space Complexity

Space complexity measures the total memory space required by the algorithm, including both input space and auxiliary space (extra space used).

  • O(1) Space: Uses a fixed amount of memory regardless of input (e.g., swapping two variables).
  • O(n) Space: Uses memory proportional to input size (e.g., creating a copy of a list).
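The two bullets above can be put side by side in a short sketch (method names are illustrative):

```java
public class SpaceDemo {
    // O(1) auxiliary space: one temporary variable,
    // no matter how large the array is.
    static void swap(int[] arr, int i, int j) {
        int tmp = arr[i];
        arr[i] = arr[j];
        arr[j] = tmp;
    }

    // O(n) auxiliary space: the copy grows with the input size.
    static int[] copy(int[] arr) {
        int[] out = new int[arr.length];
        for (int i = 0; i < arr.length; i++) {
            out[i] = arr[i];
        }
        return out;
    }

    public static void main(String[] args) {
        int[] a = {1, 2, 3};
        swap(a, 0, 2);     // a is now {3, 2, 1}, no extra array allocated
        int[] b = copy(a); // a brand-new array of n elements
        System.out.println(a[0] + " " + b[0]);
    }
}
```

Note that `swap` stays O(1) even for a million-element array, while `copy` always allocates as many slots as the input has.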

3. Interactive Complexity Graph

Visualize how different complexities grow as the input size (N) increases. Notice how O(n²) and O(2ⁿ) explode quickly!


4. How to Determine Complexity

To find the time complexity, count the operations in the worst-case scenario.

Example 1: Simple Loop (Linear)

// O(n) Time
public void printAll(int[] arr) {
  for (int i = 0; i < arr.length; i++) { // Runs n times
    System.out.println(arr[i]);        // O(1) operation
  }
}

Example 2: Nested Loops (Quadratic)

// O(n^2) Time
package main

import "fmt"

func printPairs(arr []int) {
	n := len(arr)
	for i := 0; i < n; i++ { // Runs n times
		for j := 0; j < n; j++ { // Runs n times for EACH i
			fmt.Println(arr[i], arr[j])
		}
	}
}

[!TIP] Constants are dropped in Big O. O(2n) becomes O(n). O(500) becomes O(1). We care about the rate of growth, not the exact number of steps.
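For example, two sequential passes over the same array take roughly 2n steps, yet the whole method is still O(n); the constant factor 2 disappears in Big O (a sketch; names are illustrative):

```java
public class ConstantsDemo {
    // First loop: n steps. Second loop: n more steps.
    // Total ~2n operations, which Big O simplifies to O(n).
    static int sumThenMax(int[] arr) {
        int sum = 0;
        for (int x : arr) {
            sum += x;            // pass 1: n steps
        }
        int max = Integer.MIN_VALUE;
        for (int x : arr) {
            if (x > max) max = x; // pass 2: n more steps
        }
        return sum + max;
    }

    public static void main(String[] args) {
        System.out.println(sumThenMax(new int[]{1, 5, 3})); // 9 + 5 = 14
    }
}
```

Doubling the input doubles the work in both loops, so the growth rate is linear even though the step count is 2n rather than n.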

5. Trade-off: Time vs. Space

Often, you can make an algorithm faster by using more memory (Space-Time Tradeoff).

  • Caching/Memoization: Store results to avoid re-calculating (Uses O(n) space to get O(n) time instead of O(2ⁿ)).
  • Hash Maps: Use a map to check for existence in O(1) time, but it costs O(n) space.
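The memoization bullet can be sketched with Fibonacci: the naive recursion recomputes the same subproblems and takes O(2ⁿ) time, while caching each result in a map brings it down to O(n) time at the cost of O(n) extra space (class and method names are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class MemoDemo {
    private static final Map<Integer, Long> cache = new HashMap<>();

    // Naive recursion: branches twice per call -> O(2^n) time.
    static long fibSlow(int n) {
        if (n < 2) return n;
        return fibSlow(n - 1) + fibSlow(n - 2);
    }

    // Memoized: each value of n is computed exactly once
    // -> O(n) time, O(n) space for the cache.
    static long fibFast(int n) {
        if (n < 2) return n;
        Long cached = cache.get(n);
        if (cached != null) return cached;
        long result = fibFast(n - 1) + fibFast(n - 2);
        cache.put(n, result);
        return result;
    }

    public static void main(String[] args) {
        System.out.println(fibSlow(10)); // 55 (fine for small n)
        System.out.println(fibFast(50)); // instant; the naive version would take hours
    }
}
```

This is the space-time tradeoff in miniature: the O(n)-entry cache is the price paid for collapsing an exponential call tree into a linear one.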

Next Steps

Now that we have the vocabulary, let’s dive deeper into formal Asymptotic Analysis to understand Best, Average, and Worst cases.


6. Deep Dive Strategy Lab: Time and Space Complexity

Intuition Through Analogy

Think of this chapter like debugging a production outage timeline. The goal is not to memorize a fixed trick, but to repeatedly answer:

  1. What is the bottleneck?
  2. Which constraint dominates (time, memory, latency, correctness)?
  3. Which representation makes the bottleneck easier to eliminate?