Load factor is the ratio `items / buckets`. When it climbs too high, collisions pile up and lookups and inserts slow down. Resizing grows the bucket count and rehashes every entry, keeping average-case operations near O(1).
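As a concrete sketch of the mechanics, here is a minimal chained hash map; the `TinyMap` name, the initial 8 buckets, and the 0.75 threshold are illustrative assumptions rather than any particular library's API (though 0.75 is a common default, e.g., in Java's `HashMap`):

```js
// Minimal chained hash map (illustrative sketch, not production code).
class TinyMap {
  constructor() {
    this.buckets = new Array(8).fill(null).map(() => []); // assumed initial size
    this.size = 0;
  }
  loadFactor() {
    return this.size / this.buckets.length; // items / buckets
  }
  _index(key) {
    // Toy djb2-style string hash; real tables use stronger mixing.
    let h = 5381;
    for (const ch of String(key)) h = (h * 33 + ch.charCodeAt(0)) >>> 0;
    return h % this.buckets.length;
  }
  set(key, value) {
    const bucket = this.buckets[this._index(key)];
    const entry = bucket.find(e => e.key === key);
    if (entry) { entry.value = value; return; }
    bucket.push({ key, value });
    this.size += 1;
    // Grow once the load factor crosses the (assumed) 0.75 threshold.
    if (this.loadFactor() > 0.75) this._resize(this.buckets.length * 2);
  }
  get(key) {
    const entry = this.buckets[this._index(key)].find(e => e.key === key);
    return entry ? entry.value : undefined;
  }
  _resize(newCount) {
    const old = this.buckets;
    this.buckets = new Array(newCount).fill(null).map(() => []);
    // Rehash every entry: a bucket index depends on the bucket count.
    for (const bucket of old)
      for (const { key, value } of bucket)
        this.buckets[this._index(key)].push({ key, value });
  }
}

const m = new TinyMap();
for (let i = 0; i < 100; i++) m.set(`k${i}`, i);
console.log(m.get("k42"), m.loadFactor().toFixed(2)); // 42, and a load factor well under 0.75
```

Doubling on resize is what keeps inserts amortized O(1): every entry is rehashed during a resize, but doublings happen geometrically less often as the table grows.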
Deep dive
Expanding on the short answer, here is what usually matters in practice:
- Complexity: compare typical operations average-case vs worst-case. With chaining, lookups average O(1 + α) for load factor α but degrade to O(n) when keys pile into one bucket; see the simulation after this list.
- Invariants: what must always hold for correctness. Every entry must sit in the bucket its key currently hashes to, which is why a resize has to rehash everything.
- When the choice is wrong: production symptoms such as rising tail latency as tables fill, GC pressure from resize spikes, and cache misses from long chains.
- Explain the "why", not just the "what" (intuition plus consequences, not bare definitions).
- Trade-offs: what you gain and lose in time, memory, complexity, and risk. A lower resize threshold buys shorter chains at the cost of emptier buckets.
- Edge cases: empty inputs, very large inputs, invalid inputs, and concurrent access during a resize.
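To put numbers on the complexity and trade-off points above, here is a rough simulation of chain lengths at a few load factors; `Math.random()` stands in for a well-mixed hash, so treat the output as indicative rather than exact:

```js
// Rough simulation: chain lengths at different load factors (α = items / buckets).
function chainStats(items, buckets) {
  const counts = new Array(buckets).fill(0);
  for (let i = 0; i < items; i++) {
    counts[Math.floor(Math.random() * buckets)] += 1; // stand-in for a good hash
  }
  const nonEmpty = counts.filter(c => c > 0).length;
  return {
    loadFactor: +(items / buckets).toFixed(2),
    avgNonEmptyChain: +(items / nonEmpty).toFixed(2), // average over used buckets
    maxChain: Math.max(...counts),                    // worst lookup in this table
  };
}

for (const buckets of [2000, 1000, 250]) {
  console.log(chainStats(1000, buckets)); // α = 0.5, 1.0, 4.0
}
```

Under uniform hashing with chaining, the expected chain length across all buckets is α itself, so a higher load factor directly means more comparisons per lookup; that is why resize thresholds usually sit below 1.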
Examples
A tiny example (an explanation template):
```js
// Example: discuss trade-offs for "hash-table-load-factor-—-what-is-it-and-why-does"
function explain() {
  // Start from the core idea:
  // Load factor is the ratio items / buckets. When it climbs too high,
  // collisions pile up and operations slow down, so the table resizes.
}
```
Common pitfalls
- Too generic: no concrete trade-offs, numbers, or examples.
- Mixing up average-case and worst-case complexity, e.g., claiming hash lookups are "always O(1)"; the sketch below shows how the worst case diverges.
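To make the second pitfall tangible, here is a deliberately adversarial sketch (constant hash function, hypothetical `k…` keys) in which the average-case O(1) promise collapses into an O(n) scan:

```js
// Worst case: a constant "hash" sends every key to bucket 0, so one chain
// holds all n entries and the table degenerates into a linked list.
const hash = _key => 0;
const buckets = [[], [], [], []];
for (let i = 0; i < 10000; i++) {
  buckets[hash(`k${i}`)].push({ key: `k${i}`, value: i });
}

// A lookup now scans the whole chain: O(n), even though the table's
// average-case behavior would be O(1) with a decent hash.
let probes = 0;
for (const entry of buckets[hash("k9999")]) {
  probes += 1;
  if (entry.key === "k9999") break;
}
console.log(probes); // 10000
```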