Real-World DSA: Caching Systems (LRU, LFU, TTL)
📖 Concept
Caching is everywhere — browsers, databases, CDNs, APIs. Understanding cache data structures is critical for system design AND coding interviews.
LRU Cache (Least Recently Used):
- Evicts the least recently accessed item when full
- Implementation: Hash Map + Doubly Linked List
- get() and put() both O(1)
- Used in: CPU caches, database query caches, browser caches
LFU Cache (Least Frequently Used):
- Evicts the least frequently accessed item
- Implementation: Hash Map + Frequency Map + Doubly Linked Lists
- More complex to implement, but better when access frequency matters more than recency (e.g., a few hot keys amid many one-off reads)
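To make the Hash Map + Frequency Map idea concrete, here is a minimal sketch of an LFU cache with O(1) `get` and `put`. The class name `LFUCache` and the bucket layout are illustrative choices, not a standard API; within a frequency bucket, `Map` insertion order is used to break ties by least-recently-used.

```javascript
// Sketch: LFU cache with O(1) get/put.
// vals:    key -> value
// freqs:   key -> access count
// buckets: freq -> Map of keys (insertion order = LRU order within the bucket)
class LFUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.vals = new Map();
    this.freqs = new Map();
    this.buckets = new Map();
    this.minFreq = 0; // lowest frequency currently present
  }

  // Move key from its current frequency bucket to the next one up.
  _touch(key) {
    const f = this.freqs.get(key);
    this.buckets.get(f).delete(key);
    if (this.buckets.get(f).size === 0) {
      this.buckets.delete(f);
      if (this.minFreq === f) this.minFreq = f + 1;
    }
    this.freqs.set(key, f + 1);
    if (!this.buckets.has(f + 1)) this.buckets.set(f + 1, new Map());
    this.buckets.get(f + 1).set(key, true);
  }

  get(key) {
    if (!this.vals.has(key)) return -1;
    this._touch(key);
    return this.vals.get(key);
  }

  put(key, value) {
    if (this.capacity <= 0) return;
    if (this.vals.has(key)) {
      this.vals.set(key, value);
      this._touch(key);
      return;
    }
    if (this.vals.size >= this.capacity) {
      // Evict least-frequent key; ties broken by least-recent (first in bucket)
      const bucket = this.buckets.get(this.minFreq);
      const evictKey = bucket.keys().next().value;
      bucket.delete(evictKey);
      if (bucket.size === 0) this.buckets.delete(this.minFreq);
      this.vals.delete(evictKey);
      this.freqs.delete(evictKey);
    }
    this.vals.set(key, value);
    this.freqs.set(key, 1);
    if (!this.buckets.has(1)) this.buckets.set(1, new Map());
    this.buckets.get(1).set(key, true);
    this.minFreq = 1; // a brand-new key always has the lowest frequency
  }
}
```

The extra bookkeeping (`minFreq` plus per-frequency buckets) is exactly what makes LFU harder than LRU: eviction must find the least-frequent key without scanning.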
Cache with TTL (Time To Live):
- Each entry expires after a set duration
- Implementation: Hash Map + Min-Heap (keyed by expiration time)
- Used in: DNS caching, session tokens, API rate limiting
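The TTL example later in this section uses lazy expiration (entries are checked on read). A sketch of the min-heap variant described above is shown here, with the soonest-to-expire entry purged in O(log n); the names `MinHeap` and `HeapTTLCache` are illustrative, and the `now` parameter is only there to make time injectable for testing.

```javascript
// Sketch: binary min-heap ordered by expiration time.
class MinHeap {
  constructor() { this.a = []; }
  size() { return this.a.length; }
  peek() { return this.a[0]; }
  push(item) { // item: { expiresAt, key }
    this.a.push(item);
    let i = this.a.length - 1;
    while (i > 0) { // sift up
      const p = (i - 1) >> 1;
      if (this.a[p].expiresAt <= this.a[i].expiresAt) break;
      [this.a[p], this.a[i]] = [this.a[i], this.a[p]];
      i = p;
    }
  }
  pop() {
    const top = this.a[0];
    const last = this.a.pop();
    if (this.a.length) {
      this.a[0] = last;
      let i = 0;
      while (true) { // sift down
        const l = 2 * i + 1, r = l + 1;
        let s = i;
        if (l < this.a.length && this.a[l].expiresAt < this.a[s].expiresAt) s = l;
        if (r < this.a.length && this.a[r].expiresAt < this.a[s].expiresAt) s = r;
        if (s === i) break;
        [this.a[i], this.a[s]] = [this.a[s], this.a[i]];
        i = s;
      }
    }
    return top;
  }
}

// Sketch: TTL cache that purges expired entries via the heap.
class HeapTTLCache {
  constructor() { this.map = new Map(); this.heap = new MinHeap(); }
  set(key, value, ttlMs, now = Date.now()) {
    const expiresAt = now + ttlMs;
    this.map.set(key, { value, expiresAt });
    this.heap.push({ expiresAt, key });
  }
  purge(now = Date.now()) {
    while (this.heap.size() && this.heap.peek().expiresAt <= now) {
      const { key, expiresAt } = this.heap.pop();
      const entry = this.map.get(key);
      // Skip stale heap records left behind when a key was overwritten
      if (entry && entry.expiresAt === expiresAt) this.map.delete(key);
    }
  }
  get(key, now = Date.now()) {
    this.purge(now);
    const entry = this.map.get(key);
    return entry ? entry.value : undefined;
  }
}
```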
Cache replacement policies:
| Policy | Evicts | Best For |
|---|---|---|
| LRU | Least recently used | General purpose, temporal locality |
| LFU | Least frequently used | Frequency-based access patterns |
| FIFO | Oldest item | Simple, predictable |
| Random | Random item | Simple, surprisingly effective |
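The FIFO row in the table differs from LRU only in that reads never refresh an entry. A minimal sketch (hypothetical name `FIFOCache`) makes the contrast explicit:

```javascript
// Sketch: FIFO cache. Same Map trick as the LRU version below,
// except get() does NOT re-insert the key, so eviction order is
// purely insertion order.
class FIFOCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map();
  }
  get(key) {
    return this.map.has(key) ? this.map.get(key) : -1; // no reordering
  }
  put(key, value) {
    this.map.set(key, value); // existing keys keep their original position
    if (this.map.size > this.capacity) {
      this.map.delete(this.map.keys().next().value); // evict oldest insertion
    }
  }
}
```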
System Design implications:
- Cache hit ratio → directly impacts latency
- Cache invalidation is famously one of the two hard problems in computer science (the other being naming things)
- Write-through vs write-back strategies
- Distributed caching (Redis, Memcached)
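The write-through vs write-back distinction above can be sketched in a few lines. This is an illustrative comparison, assuming a hypothetical `backing` store with `get`/`set` (a `Map` stands in for a database here); it is not a production pattern.

```javascript
// Sketch: write-through — every write goes to cache AND backing store,
// so the store is never stale, at the cost of write latency.
class WriteThroughCache {
  constructor(backing) { this.backing = backing; this.cache = new Map(); }
  set(key, value) {
    this.cache.set(key, value);
    this.backing.set(key, value); // store is always up to date
  }
  get(key) {
    if (!this.cache.has(key)) this.cache.set(key, this.backing.get(key));
    return this.cache.get(key);
  }
}

// Sketch: write-back — writes only mark the entry dirty; the store is
// stale until flush(). Faster writes, but data loss risk before a flush.
class WriteBackCache {
  constructor(backing) {
    this.backing = backing;
    this.cache = new Map();
    this.dirty = new Set();
  }
  set(key, value) {
    this.cache.set(key, value);
    this.dirty.add(key);
  }
  get(key) {
    if (!this.cache.has(key)) this.cache.set(key, this.backing.get(key));
    return this.cache.get(key);
  }
  flush() {
    for (const key of this.dirty) this.backing.set(key, this.cache.get(key));
    this.dirty.clear();
  }
}
```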
🏠 Real-world analogy: LRU is like your browser's "Recent" bookmarks — pages you haven't visited recently fall off. LFU is like a library keeping popular books on the front shelf — rarely borrowed books go to storage.
💻 Code Example
```javascript
// LRU CACHE — O(1) get and put
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map(); // Preserves insertion order in JS!
  }

  get(key) {
    if (!this.map.has(key)) return -1;
    const val = this.map.get(key);
    this.map.delete(key);   // Remove
    this.map.set(key, val); // Re-insert (moves to end = most recent)
    return val;
  }

  put(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // Delete oldest (first key in Map)
      const oldest = this.map.keys().next().value;
      this.map.delete(oldest);
    }
  }
}

// LRU CACHE — Classic DLL + HashMap implementation
class DLLNode {
  constructor(key, val) {
    this.key = key;
    this.val = val;
    this.prev = null;
    this.next = null;
  }
}

class LRUCacheClassic {
  constructor(capacity) {
    this.cap = capacity;
    this.map = new Map();
    this.head = new DLLNode(0, 0); // Dummy head (most recent)
    this.tail = new DLLNode(0, 0); // Dummy tail (oldest)
    this.head.next = this.tail;
    this.tail.prev = this.head;
  }

  _remove(node) {
    node.prev.next = node.next;
    node.next.prev = node.prev;
  }

  _addToFront(node) {
    node.next = this.head.next;
    node.prev = this.head;
    this.head.next.prev = node;
    this.head.next = node;
  }

  get(key) {
    if (!this.map.has(key)) return -1;
    const node = this.map.get(key);
    this._remove(node);
    this._addToFront(node);
    return node.val;
  }

  put(key, value) {
    if (this.map.has(key)) {
      this._remove(this.map.get(key));
    }
    const node = new DLLNode(key, value);
    this._addToFront(node);
    this.map.set(key, node);
    if (this.map.size > this.cap) {
      const lru = this.tail.prev;
      this._remove(lru);
      this.map.delete(lru.key);
    }
  }
}

// CACHE WITH TTL (lazy expiration: entries are checked on read)
class TTLCache {
  constructor() { this.cache = new Map(); }

  set(key, value, ttlMs) {
    const expiresAt = Date.now() + ttlMs;
    this.cache.set(key, { value, expiresAt });
  }

  get(key) {
    const entry = this.cache.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.cache.delete(key);
      return undefined;
    }
    return entry.value;
  }
}
```
🏋️ Practice Exercise
Practice Problems & Design Questions:
- Implement LRU Cache with O(1) get and put
- Implement LFU Cache with O(1) get and put
- Add TTL (time-to-live) to your LRU cache
- Design a multi-level cache (L1 memory, L2 Redis, L3 database)
- How would you handle cache invalidation in a distributed system?
- Implement a write-through cache vs write-back cache
- Design a CDN caching strategy for static assets
- Calculate cache hit ratio given access patterns
- When would you choose LFU over LRU?
- Design a rate limiter using a sliding window cache
⚠️ Common Mistakes
- Using only a hash map for LRU — need O(1) removal of oldest, which requires a linked list
- Not handling put() for existing keys — must update AND move to most recent
- Forgetting cache stampede — multiple threads see cache miss and all hit the database
- Not considering cache invalidation strategy — stale data can cause serious bugs
- Over-caching — caching rarely accessed data wastes memory; monitor hit ratios
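The stampede mistake above has a standard single-process fix: request coalescing, where concurrent misses for the same key share one in-flight promise instead of each hitting the backend. A minimal sketch, assuming a hypothetical async `loader` function (class name `CoalescingCache` is illustrative):

```javascript
// Sketch: cache-stampede protection via request coalescing.
class CoalescingCache {
  constructor(loader) {
    this.loader = loader;      // assumed: async (key) => value
    this.values = new Map();   // key -> resolved value
    this.inFlight = new Map(); // key -> pending Promise
  }

  async get(key) {
    if (this.values.has(key)) return this.values.get(key);
    // Join an existing fetch instead of starting another one
    if (this.inFlight.has(key)) return this.inFlight.get(key);
    const p = this.loader(key).then((value) => {
      this.values.set(key, value);
      this.inFlight.delete(key);
      return value;
    });
    this.inFlight.set(key, p);
    return p;
  }
}
```

In a distributed setting the same idea needs a shared lock or lease (e.g., in Redis), since in-process promises only dedupe within one server.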
💼 Interview Questions
🎤 Mock Interview
Practice a live interview for Real-World DSA: Caching Systems (LRU, LFU, TTL)