Deep dive
Big-O is an asymptotic upper bound on how a resource (time or space) grows as input size increases.
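More formally, f(n) is O(g(n)) when some constant multiple of g(n) eventually dominates f(n). Here is a minimal Python sketch of that definition; the particular polynomial and the constants c and n0 are just an illustration:

```python
# Definition sketch: f(n) is O(g(n)) if there are constants c and n0 such that
# f(n) <= c * g(n) for every n >= n0.  Example: 3n^2 + 100n + 500 is O(n^2).
def f(n):
    return 3 * n * n + 100 * n + 500

c, n0 = 4, 105  # illustrative witnesses; any larger values also work
assert all(f(n) <= c * n * n for n in range(n0, 50_000))
print("3n^2 + 100n + 500 <= 4*n^2 for all tested n >= 105")
```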
Key nuances:
- It ignores constants and lower-order terms to focus on scaling.
- Specify which case you are analyzing: worst-case, average-case, or amortized (see the amortized-append sketch after this list).
- Big-O is an upper bound; Θ is a tight bound; Ω is a lower bound.
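As a small illustration of amortized cost (this sketch assumes CPython, whose lists over-allocate geometrically), count how often append actually triggers a reallocation:

```python
import sys

# Python's list.append is amortized O(1): most appends reuse spare capacity,
# and the occasional O(n) reallocation is rare enough to average out.
xs = []
last_size = sys.getsizeof(xs)
resizes = 0
for i in range(100_000):
    xs.append(i)
    size = sys.getsizeof(xs)
    if size != last_size:       # the underlying buffer was reallocated
        resizes += 1
        last_size = size
print(f"{len(xs):,} appends caused only {resizes} reallocations")
```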
Examples
- O(n log n) sorting scales much better than O(n^2) for large n.
- Two O(n) algorithms can still differ a lot in practice because of constant factors and memory access patterns; the timing sketch after this list illustrates both points.
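A minimal timing sketch (Python; insertion_sort, python_sum, and time_it are illustrative names, and exact numbers depend on your machine): doubling n roughly quadruples the O(n^2) sort's time while the O(n log n) built-in grows far more slowly, and two O(n) sums still differ sharply because of constant factors.

```python
import random
import time

def insertion_sort(a):
    """O(n^2) worst/average case: shifts each element left into place."""
    a = list(a)
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def python_sum(xs):
    """O(n), but with a pure-Python loop: a much larger constant than sum()."""
    total = 0.0
    for x in xs:
        total += x
    return total

def time_it(fn, data):
    start = time.perf_counter()
    fn(data)
    return time.perf_counter() - start

# O(n log n) vs O(n^2): watch how the ratio widens as n doubles.
for n in (1_000, 2_000, 4_000, 8_000):
    data = [random.random() for _ in range(n)]
    print(f"n={n:>5}  insertion={time_it(insertion_sort, data):.4f}s"
          f"  sorted={time_it(sorted, data):.5f}s")

# Same Big-O, different constants: both sums are O(n).
big = [random.random() for _ in range(1_000_000)]
print("builtin sum :", time_it(sum, big))
print("python loop :", time_it(python_sum, big))
```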
Common pitfalls
- Treating Big-O as exact runtime.
- Assuming the same Big-O means the same speed.
- Ignoring space complexity (often the real bottleneck); the memory sketch below makes this concrete.
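To make the space point concrete, a small sketch (function names are illustrative) compares two O(n)-time ways to compute the same sum; one allocates O(n) extra memory, the other O(1):

```python
import tracemalloc

def sum_with_list(n):
    """O(n) extra space: materializes every square before summing."""
    return sum([i * i for i in range(n)])

def sum_streaming(n):
    """O(1) extra space: produces each square lazily with a generator."""
    return sum(i * i for i in range(n))

for fn in (sum_with_list, sum_streaming):
    tracemalloc.start()
    fn(1_000_000)
    _, peak = tracemalloc.get_traced_memory()   # (current, peak) in bytes
    tracemalloc.stop()
    print(f"{fn.__name__:>16}: peak ~ {peak / 1e6:.1f} MB")
```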