Optimization Techniques

[!NOTE] This module explores the core principles of Optimization Techniques, deriving solutions from first principles and hardware constraints to build world-class, production-ready expertise.

1. The Hook: “Can you do better?”

You solved the problem. The complexity is O(N \log N). The interviewer nods and asks: “Can we improve the time complexity?” Or: “Can we do this with less memory?” This is the moment that separates L3 (Junior) from L4/L5 (Senior). You need a toolbox of Optimization Patterns.


2. Pattern 1: Space-Time Trade-off

Mantra: Implement a Cache. If you need to calculate something repeatedly, compute it once and store it.

  • Hash Maps: Turn O(N) lookup into O(1) average. Cost: O(N) memory.
  • Prefix Sums: Turn range-sum queries from O(N) into O(1). Cost: O(N) memory.
  • Tries: Store strings for fast prefix lookup in O(L) time (L = key length). Cost: extra nodes and pointers.
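The prefix-sum trade-off can be sketched in a few lines. This is a minimal illustration, not from the original; the class and method names (`PrefixSum`, `range_sum`) are chosen for clarity:

```python
class PrefixSum:
    """Answer range-sum queries in O(1) after O(N) setup; costs O(N) memory."""

    def __init__(self, nums):
        # prefix[i] holds the sum of nums[0..i-1]; the extra leading 0
        # avoids special-casing queries that start at index 0.
        self.prefix = [0]
        for x in nums:
            self.prefix.append(self.prefix[-1] + x)

    def range_sum(self, left, right):
        # Sum of nums[left..right], inclusive.
        return self.prefix[right + 1] - self.prefix[left]


ps = PrefixSum([3, 1, 4, 1, 5, 9])
print(ps.range_sum(1, 3))  # 1 + 4 + 1 = 6
```

A naive loop would pay O(N) per query; here every query after setup is two array reads and a subtraction.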

3. Pattern 2: Pre-Computation (The Pre-Processor)

If you have to answer Q queries, and the data is static: Sort the data first.

  • Unsorted: Q \cdot O(N) search. Total O(QN).
  • Sorted: O(N \log N) setup + Q \cdot O(\log N) binary search.
  • Win: If Q is large, the one-time sort pays for itself many times over.
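The sorted-setup pattern maps directly onto the standard library's `bisect` module. A minimal sketch (the data and helper name `contains_sorted` are illustrative):

```python
import bisect


def contains_sorted(sorted_data, target):
    """O(log N) membership test on pre-sorted data via binary search."""
    i = bisect.bisect_left(sorted_data, target)
    return i < len(sorted_data) and sorted_data[i] == target


data = [17, 3, 42, 8, 99, 23]
sorted_data = sorted(data)  # one-time O(N log N) setup
print([contains_sorted(sorted_data, q) for q in [8, 50, 99]])  # [True, False, True]
```

With Q queries, the totals are O(QN) for repeated linear scans versus O(N \log N + Q \log N) after sorting, so the setup cost amortizes quickly as Q grows.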

4. Pattern 3: Modify In-Place (Memory Optimization)

If the interviewer asks for O(1) extra space:

  • Can you reuse the input array?
  • Dutch National Flag: Sort 0s, 1s, 2s by swapping elements, using the array itself as buckets.
  • Linked List Cycle: Use Fast/Slow pointers (Floyd’s) instead of a HashSet.
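Both bullets can be sketched as short in-place routines; the code below is illustrative (function names `dutch_flag_sort` and `has_cycle` are mine, and `ListNode` is a minimal stand-in for a linked-list node):

```python
def dutch_flag_sort(nums):
    """Sort an array of 0s, 1s, 2s in-place: O(N) time, O(1) extra space.

    The array itself serves as the buckets: [0..low) holds 0s,
    [low..mid) holds 1s, (high..end] holds 2s.
    """
    low, mid, high = 0, 0, len(nums) - 1
    while mid <= high:
        if nums[mid] == 0:
            nums[low], nums[mid] = nums[mid], nums[low]
            low += 1
            mid += 1
        elif nums[mid] == 1:
            mid += 1
        else:  # nums[mid] == 2: throw it to the back, re-examine the swapped-in value
            nums[mid], nums[high] = nums[high], nums[mid]
            high -= 1
    return nums


class ListNode:
    def __init__(self, val):
        self.val = val
        self.next = None


def has_cycle(head):
    """Floyd's tortoise-and-hare: O(1) space instead of an O(N) HashSet."""
    slow = fast = head
    while fast and fast.next:
        slow = slow.next            # moves 1 step
        fast = fast.next.next       # moves 2 steps; meets slow iff a cycle exists
        if slow is fast:
            return True
    return False
```

Note that both achieve O(1) extra space by mutating or traversing the input structure itself rather than allocating auxiliary storage.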

5. Deep Dive Strategy Lab: Optimization

Intuition Through Analogy

The Chef’s Mise-en-place.

  • Naive: Chop an onion every time a customer orders Soup. (Slow, redundant).
  • Pre-computation: Chop 100 onions in the morning. (Setup cost, but fast service).
  • Space Trade-off: You need a big bowl (Memory) to hold chopped onions.

Mathematical Anchor: Amdahl's Law

Optimization has diminishing returns. Amdahl's Law: if a fraction p of the runtime is sped up by a factor s, the overall speedup is 1 / ((1 - p) + p/s). If your code takes 5% of runtime and DB queries take 95%, making the code 1000 \times faster yields an overall speedup of barely 1.05 \times (about 5%). Focus on the Bottleneck.
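The arithmetic is easy to check directly. A small sketch of Amdahl's Law (the helper name `overall_speedup` and the 2x comparison case are illustrative):

```python
def overall_speedup(p, s):
    """Amdahl's Law: overall speedup when a fraction p of runtime
    is accelerated by a factor s."""
    return 1 / ((1 - p) + p / s)


# Code is 5% of runtime, made 1000x faster: barely any overall gain.
print(round(overall_speedup(0.05, 1000), 3))  # 1.053

# Compare: the 95% bottleneck (DB queries) made just 2x faster.
print(round(overall_speedup(0.95, 2), 3))     # 1.905
```

A 1000x improvement on the 5% slice is worth less than a 2x improvement on the 95% slice, which is the whole argument for profiling before optimizing.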