Real-World DSA: Caching Systems (LRU, LFU, TTL)


📖 Concept

Caching is everywhere — browsers, databases, CDNs, APIs. Understanding cache data structures is critical for system design AND coding interviews.

LRU Cache (Least Recently Used):

  • Evicts the least recently accessed item when full
  • Implementation: Hash Map + Doubly Linked List
  • get() and put() both O(1)
  • Used in: CPU caches, database query caches, browser caches

LFU Cache (Least Frequently Used):

  • Evicts the least frequently accessed item
  • Implementation: Hash Map + Frequency Map + Doubly Linked Lists
  • More complex, but better when a few keys stay hot over long periods (frequency matters more than recency)
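The bullets above describe the LFU structure at a high level; here is a hedged sketch of an O(1) LFU cache (the `LFUCache` name and the bucket layout are illustrative choices, not prescribed by the text). It keeps a map of frequency buckets, where each bucket is an insertion-ordered `Map` so that ties within a frequency are broken LRU-style:

```javascript
// LFU CACHE sketch — O(1) get and put.
// buckets: frequency -> insertion-ordered Map of keys (order = recency within that frequency)
class LFUCache {
  constructor(capacity) {
    this.cap = capacity;
    this.vals = new Map();    // key -> value
    this.freqs = new Map();   // key -> use count
    this.buckets = new Map(); // freq -> Map of keys
    this.minFreq = 0;         // smallest frequency currently present
  }

  // Move a key from its current frequency bucket to the next one up
  _touch(key) {
    const f = this.freqs.get(key);
    this.buckets.get(f).delete(key);
    if (this.buckets.get(f).size === 0) {
      this.buckets.delete(f);
      if (this.minFreq === f) this.minFreq = f + 1;
    }
    this.freqs.set(key, f + 1);
    if (!this.buckets.has(f + 1)) this.buckets.set(f + 1, new Map());
    this.buckets.get(f + 1).set(key, true);
  }

  get(key) {
    if (!this.vals.has(key)) return -1;
    this._touch(key);
    return this.vals.get(key);
  }

  put(key, value) {
    if (this.cap <= 0) return;
    if (this.vals.has(key)) {
      this.vals.set(key, value);
      this._touch(key);
      return;
    }
    if (this.vals.size >= this.cap) {
      // Evict least-frequent key; ties broken by least-recent (first key in bucket)
      const bucket = this.buckets.get(this.minFreq);
      const evict = bucket.keys().next().value;
      bucket.delete(evict);
      if (bucket.size === 0) this.buckets.delete(this.minFreq);
      this.vals.delete(evict);
      this.freqs.delete(evict);
    }
    this.vals.set(key, value);
    this.freqs.set(key, 1);
    if (!this.buckets.has(1)) this.buckets.set(1, new Map());
    this.buckets.get(1).set(key, true);
    this.minFreq = 1; // a brand-new key always has the lowest frequency
  }
}
```

All operations touch only hash maps, so both `get` and `put` stay O(1); the cost relative to LRU is the extra bookkeeping for frequency buckets and `minFreq`.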

Cache with TTL (Time To Live):

  • Each entry expires after a set duration
  • Implementation: Hash Map + Min-Heap (keyed by expiration time)
  • Used in: DNS caching, session tokens, API rate limiting
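As a sketch of the min-heap idea above (`ExpiryHeap`, `HeapTTLCache`, and `purgeExpired` are illustrative names, not from the text), entries can be purged proactively in expiration order rather than only lazily on read; a re-`set` key is handled by checking that a popped heap entry still matches the live map entry:

```javascript
// Min-heap keyed by expiration time — sketch for proactive TTL expiry.
class ExpiryHeap {
  constructor() { this.heap = []; } // entries: { key, expiresAt }
  push(entry) {
    this.heap.push(entry);
    let i = this.heap.length - 1;
    while (i > 0) { // sift up toward the earliest deadline
      const p = (i - 1) >> 1;
      if (this.heap[p].expiresAt <= this.heap[i].expiresAt) break;
      [this.heap[p], this.heap[i]] = [this.heap[i], this.heap[p]];
      i = p;
    }
  }
  pop() {
    const top = this.heap[0];
    const last = this.heap.pop();
    if (this.heap.length) {
      this.heap[0] = last;
      let i = 0; // sift down
      for (;;) {
        const l = 2 * i + 1, r = 2 * i + 2;
        let s = i;
        if (l < this.heap.length && this.heap[l].expiresAt < this.heap[s].expiresAt) s = l;
        if (r < this.heap.length && this.heap[r].expiresAt < this.heap[s].expiresAt) s = r;
        if (s === i) break;
        [this.heap[s], this.heap[i]] = [this.heap[i], this.heap[s]];
        i = s;
      }
    }
    return top;
  }
  peek() { return this.heap[0]; }
  get size() { return this.heap.length; }
}

class HeapTTLCache {
  constructor() { this.map = new Map(); this.heap = new ExpiryHeap(); }
  set(key, value, ttlMs) {
    const expiresAt = Date.now() + ttlMs;
    this.map.set(key, { value, expiresAt });
    this.heap.push({ key, expiresAt }); // stale heap entries are filtered at purge time
  }
  purgeExpired(now = Date.now()) {
    while (this.heap.size && this.heap.peek().expiresAt <= now) {
      const { key, expiresAt } = this.heap.pop();
      const entry = this.map.get(key);
      // Only delete if this heap entry is still current (not superseded by a later set)
      if (entry && entry.expiresAt === expiresAt) this.map.delete(key);
    }
  }
  get(key) {
    this.purgeExpired();
    const entry = this.map.get(key);
    return entry && entry.expiresAt > Date.now() ? entry.value : undefined;
  }
}
```

The heap makes "what expires next?" an O(1) peek and each expiry an O(log n) pop, which is what you want when a background sweeper (not just readers) is responsible for eviction.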

Cache replacement policies:

| Policy | Evicts                | Best For                           |
|--------|-----------------------|------------------------------------|
| LRU    | Least recently used   | General purpose, temporal locality |
| LFU    | Least frequently used | Frequency-based access patterns    |
| FIFO   | Oldest item           | Simple, predictable                |
| Random | Random item           | Simple, surprisingly effective     |

System Design implications:

  • Cache hit ratio → directly impacts latency
  • Cache invalidation is one of the two hardest problems in CS
  • Write-through vs write-back strategies
  • Distributed caching (Redis, Memcached)
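The write-through vs write-back distinction listed above can be sketched minimally (the class and method names are hypothetical, and `store` stands in for a database): write-through updates cache and store together, trading write latency for consistency; write-back defers the store update, trading durability for speed.

```javascript
// Write-through vs write-back — sketch over an in-memory "backing store".
const store = new Map(); // stands in for a database

class WriteThroughCache {
  constructor() { this.cache = new Map(); }
  write(key, value) {
    this.cache.set(key, value);
    store.set(key, value); // synchronous write to the backing store: always consistent
  }
}

class WriteBackCache {
  constructor() { this.cache = new Map(); this.dirty = new Set(); }
  write(key, value) {
    this.cache.set(key, value);
    this.dirty.add(key); // deferred: the store is stale until flush()
  }
  flush() {
    // Persist all dirty entries in one batch; data is lost if we crash before this runs
    for (const key of this.dirty) store.set(key, this.cache.get(key));
    this.dirty.clear();
  }
}
```

Write-back shines for write-heavy workloads where batching matters (e.g. counters), but the dirty set is exactly the data at risk on a crash.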

🏠 Real-world analogy: LRU is like your browser's "Recent" bookmarks — pages you haven't visited recently fall off. LFU is like a library keeping popular books on the front shelf — rarely borrowed books go to storage.

💻 Code Example

// LRU CACHE — O(1) get and put
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map(); // Preserves insertion order in JS!
  }

  get(key) {
    if (!this.map.has(key)) return -1;
    const val = this.map.get(key);
    this.map.delete(key);   // Remove
    this.map.set(key, val); // Re-insert (moves to end = most recent)
    return val;
  }

  put(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // Delete oldest (first key in Map)
      const oldest = this.map.keys().next().value;
      this.map.delete(oldest);
    }
  }
}

// LRU CACHE — Classic DLL + HashMap implementation
class DLLNode {
  constructor(key, val) {
    this.key = key;
    this.val = val;
    this.prev = null;
    this.next = null;
  }
}

class LRUCacheClassic {
  constructor(capacity) {
    this.cap = capacity;
    this.map = new Map();
    this.head = new DLLNode(0, 0); // Dummy head (most recent)
    this.tail = new DLLNode(0, 0); // Dummy tail (oldest)
    this.head.next = this.tail;
    this.tail.prev = this.head;
  }

  _remove(node) {
    node.prev.next = node.next;
    node.next.prev = node.prev;
  }

  _addToFront(node) {
    node.next = this.head.next;
    node.prev = this.head;
    this.head.next.prev = node;
    this.head.next = node;
  }

  get(key) {
    if (!this.map.has(key)) return -1;
    const node = this.map.get(key);
    this._remove(node);
    this._addToFront(node);
    return node.val;
  }

  put(key, value) {
    if (this.map.has(key)) {
      this._remove(this.map.get(key));
    }
    const node = new DLLNode(key, value);
    this._addToFront(node);
    this.map.set(key, node);
    if (this.map.size > this.cap) {
      const lru = this.tail.prev;
      this._remove(lru);
      this.map.delete(lru.key);
    }
  }
}

// CACHE WITH TTL (lazy expiry: entries are checked and evicted on read)
class TTLCache {
  constructor() { this.cache = new Map(); }

  set(key, value, ttlMs) {
    const expiresAt = Date.now() + ttlMs;
    this.cache.set(key, { value, expiresAt });
  }

  get(key) {
    const entry = this.cache.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.cache.delete(key);
      return undefined;
    }
    return entry.value;
  }
}

🏋️ Practice Exercise

Practice Problems & Design Questions:

  1. Implement LRU Cache with O(1) get and put
  2. Implement LFU Cache with O(1) get and put
  3. Add TTL (time-to-live) to your LRU cache
  4. Design a multi-level cache (L1 memory, L2 Redis, L3 database)
  5. How would you handle cache invalidation in a distributed system?
  6. Implement a write-through cache vs write-back cache
  7. Design a CDN caching strategy for static assets
  8. Calculate cache hit ratio given access patterns
  9. When would you choose LFU over LRU?
  10. Design a rate limiter using a sliding window cache

⚠️ Common Mistakes

  • Using only a hash map for LRU — need O(1) removal of oldest, which requires a linked list

  • Not handling put() for existing keys — must update AND move to most recent

  • Forgetting cache stampede — multiple threads see cache miss and all hit the database

  • Not considering cache invalidation strategy — stale data can cause serious bugs

  • Over-caching — caching rarely accessed data wastes memory; monitor hit ratios
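One common guard against the cache-stampede mistake above is "single-flight" loading: concurrent misses for the same key share one in-flight promise instead of each hitting the database. A hedged sketch, with a hypothetical `loader` function standing in for the database call:

```javascript
// Cache-stampede protection — single-flight loading sketch.
class SingleFlightCache {
  constructor(loader) {
    this.cache = new Map();    // key -> resolved value
    this.inFlight = new Map(); // key -> pending Promise
    this.loader = loader;      // async function fetching from the source of truth
  }

  async get(key) {
    if (this.cache.has(key)) return this.cache.get(key);
    if (this.inFlight.has(key)) return this.inFlight.get(key); // join the existing load
    const p = this.loader(key).then((val) => {
      this.cache.set(key, val);
      this.inFlight.delete(key);
      return val;
    });
    this.inFlight.set(key, p);
    return p;
  }
}
```

In a single-process Node service this collapses N concurrent misses into one backend call; across processes you would need a distributed lock or a lease (e.g. in Redis) to get the same effect.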
