Caching Fundamentals & Patterns


📖 Concept

Caching stores frequently accessed data in a fast storage layer (typically in-memory) so future requests are served faster.

Why Cache?

  • Without cache: every request hits the database (50-500ms)
  • With cache: hot data served from memory (1-5ms)
  • Database load reduced by 80-90%

Cache Patterns

1. Cache-Aside (Lazy Loading) — Most Common

Read → Check cache → Miss → Query DB → Store in cache → Return
Pros: Only caches data that is actually requested. Cons: First request for each key is slow.

2. Write-Through

Write → Update cache + Update DB simultaneously
Pros: Cache is always consistent with the database. Cons: Higher write latency.
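As a sketch, write-through can look like the following. The `makeStore` stubs and the `saveUser`/`getUser` API are illustrative assumptions standing in for a real Redis client and database, not a fixed interface:

```javascript
// Write-through sketch: every write hits cache and database together,
// so a later read always finds fresh data in the cache.
// makeStore() is an illustrative Map-backed stub for Redis / a SQL table.
const makeStore = () => {
  const m = new Map();
  return { get: async (k) => m.get(k), set: async (k, v) => m.set(k, v) };
};

class WriteThroughService {
  constructor(cache, database) {
    this.cache = cache;
    this.database = database;
  }

  async saveUser(userId, user) {
    // Both writes sit on the request path; that is the extra latency
    // the pattern trades for consistency.
    await this.database.set(`users:${userId}`, user);
    await this.cache.set(`user:${userId}`, JSON.stringify(user));
  }

  async getUser(userId) {
    const cached = await this.cache.get(`user:${userId}`);
    return cached ? JSON.parse(cached) : null;
  }
}

(async () => {
  const svc = new WriteThroughService(makeStore(), makeStore());
  await svc.saveUser(42, { name: 'Ada' });
  console.log((await svc.getUser(42)).name); // prints "Ada"
})();
```

In production you would also handle partial failure, e.g. invalidate the cache entry if the database write succeeds but the cache write does not.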

3. Write-Behind (Write-Back)

Write → Update cache → Return → (async) Write to DB
Pros: Very fast writes. Cons: Risk of data loss if the cache crashes before flushing to the database.

4. Read-Through

Cache sits between app and DB. On miss, cache loads from DB itself.
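A read-through cache owns its own loader: the application only ever talks to the cache. A minimal in-process sketch, assuming a `loader` callback supplied at construction (production read-through is usually a feature of a cache library or proxy, not hand-rolled):

```javascript
// Read-through sketch: on a miss, the cache itself calls its loader
// (e.g. a DB query) and stores the result; callers never query the DB.
class ReadThroughCache {
  constructor(loader, ttlMs = 60_000) {
    this.loader = loader;   // async (key) => value, e.g. a DB lookup
    this.ttlMs = ttlMs;
    this.store = new Map(); // key -> { value, expiresAt }
  }

  async get(key) {
    const entry = this.store.get(key);
    if (entry && entry.expiresAt > Date.now()) return entry.value; // hit
    const value = await this.loader(key);                          // miss: cache loads
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
    return value;
  }
}

// Usage: the caller never sees the database.
const rtCache = new ReadThroughCache(async (key) => `row-for-${key}`);
rtCache.get('user:1').then(console.log); // prints "row-for-user:1"
```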

Cache Eviction Policies

Policy | Description                  | Best For
LRU    | Evict least recently used    | General purpose
LFU    | Evict least frequently used  | Clear hot/cold split in the data
TTL    | Evict after a time period    | Known staleness window
FIFO   | Evict oldest entry first     | Uniform access patterns

Rule of thumb: Use LRU + TTL in production — handles 95% of use cases.
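Combined, the two policies can be sketched as one in-process class that evicts on both capacity and age. This is a simplified illustration; in Redis the same combination comes from an LRU `maxmemory-policy` (e.g. `allkeys-lru`) plus per-key `EXPIRE`:

```javascript
// LRU + TTL: capacity bounds memory use, TTL bounds staleness.
// Map iteration order doubles as recency order (oldest first).
class LruTtlCache {
  constructor(capacity, ttlMs) {
    this.capacity = capacity;
    this.ttlMs = ttlMs;
    this.store = new Map(); // key -> { value, expiresAt }
  }

  get(key) {
    const entry = this.store.get(key);
    if (!entry) return null;
    if (entry.expiresAt <= Date.now()) { // TTL eviction
      this.store.delete(key);
      return null;
    }
    this.store.delete(key);              // refresh recency
    this.store.set(key, entry);
    return entry.value;
  }

  set(key, value) {
    if (this.store.has(key)) this.store.delete(key);
    if (this.store.size >= this.capacity) {
      this.store.delete(this.store.keys().next().value); // LRU eviction
    }
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }
}
```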

💻 Code Example

// ============================================
// Caching Patterns — Implementation
// ============================================

// ---------- Cache-Aside Pattern ----------
class CacheAsideService {
  constructor(cache, database) {
    this.cache = cache;
    this.database = database;
    this.defaultTTL = 3600; // seconds
  }

  async getUser(userId) {
    const cacheKey = `user:${userId}`;
    const cached = await this.cache.get(cacheKey);
    if (cached) {
      console.log(`Cache HIT for ${cacheKey}`);
      return JSON.parse(cached);
    }

    console.log(`Cache MISS for ${cacheKey}`);
    const user = await this.database.query('SELECT * FROM users WHERE id = $1', [userId]);
    if (!user) return null;

    await this.cache.set(cacheKey, JSON.stringify(user), 'EX', this.defaultTTL);
    return user;
  }

  async updateUser(userId, data) {
    await this.database.query('UPDATE users SET name = $1 WHERE id = $2', [data.name, userId]);
    // Invalidate instead of updating the cache; the next read repopulates it.
    await this.cache.del(`user:${userId}`);
  }
}

// ---------- Write-Behind (Async DB Write) ----------
class WriteBehindService {
  constructor(cache, queue) {
    this.cache = cache;
    this.queue = queue;
  }

  async incrementViewCount(postId) {
    const newCount = await this.cache.incr(`views:${postId}`);
    // Batch DB writes: flush only every 100th increment.
    if (newCount % 100 === 0) {
      await this.queue.publish('db.write', { table: 'posts', set: { view_count: newCount }, where: { id: postId } });
    }
    return newCount;
  }
}

// ---------- LRU Cache Implementation ----------
// A Map preserves insertion order, so re-inserting a key on each access
// keeps the least recently used key first in iteration order.
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.cache = new Map();
  }

  get(key) {
    if (!this.cache.has(key)) return null;
    const value = this.cache.get(key);
    this.cache.delete(key);
    this.cache.set(key, value); // move to most-recently-used position
    return value;
  }

  put(key, value) {
    if (this.cache.has(key)) this.cache.delete(key);
    if (this.cache.size >= this.capacity) {
      const lruKey = this.cache.keys().next().value; // oldest key
      this.cache.delete(lruKey);
    }
    this.cache.set(key, value);
  }
}

// ---------- Multi-Layer Cache ----------
class MultiLayerCache {
  constructor(redisClient, database) {
    this.l1 = new LRUCache(100); // In-process (~0.001ms)
    this.l2 = redisClient;       // Redis (~0.5-1ms)
    this.l3 = database;          // Database (~5-50ms)
  }

  async get(key) {
    let value = this.l1.get(key);
    if (value) return value;

    const cached = await this.l2.get(key);
    if (cached) {
      value = JSON.parse(cached);
      this.l1.put(key, value); // promote to L1
      return value;
    }

    value = await this.l3.query('SELECT * FROM data WHERE key = $1', [key]);
    if (value) {
      await this.l2.set(key, JSON.stringify(value), 'EX', 3600);
      this.l1.put(key, value);
    }
    return value;
  }
}

const lru = new LRUCache(3);
lru.put('a', 1); lru.put('b', 2); lru.put('c', 3);
lru.get('a');    // 'a' is now most recently used
lru.put('d', 4); // capacity exceeded: evicts 'b'
console.log('b evicted:', lru.get('b')); // null

🏋️ Practice Exercise

  1. Pattern Selection: Choose the best caching pattern for: (a) User profile page, (b) Page view counter, (c) Shopping cart, (d) Stock price ticker. Justify each.

  2. LRU Implementation: Implement LRU cache with O(1) get/put using HashMap + Doubly Linked List.

  3. Cache Size Calculation: 10M DAU, 20 posts/page at 2KB each, 5 pages/day. How much Redis memory using 80/20 rule?

  4. Multi-Layer Cache: Design 3-layer caching (browser, CDN, Redis) for e-commerce. Define TTLs and invalidation per layer.

  5. Cache Warming: Design a strategy to pre-populate cache with hot data after application restart.

⚠️ Common Mistakes

  • Cache stampede — popular key expires, hundreds of requests hit DB simultaneously. Use mutex locks or stale-while-revalidate.

  • Caching without TTL — data without expiration becomes stale indefinitely. Always set a TTL.

  • Not monitoring cache hit rate — a rate below 80% suggests the cache is too small or the access patterns aren't cache-friendly.

  • Using cache as primary data store — if Redis crashes, data is lost. Database is the source of truth.
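The stampede fix in the first bullet can be sketched as request coalescing: concurrent misses for one hot key share a single in-flight load. This is an in-process simplification; across multiple servers a distributed lock (e.g. a Redis `SET` with the `NX` option) plays the same role:

```javascript
// Stampede protection via request coalescing: when a hot key misses,
// only the first caller queries the DB; concurrent callers await the
// same in-flight promise instead of piling onto the database.
class CoalescingCache {
  constructor(loader) {
    this.loader = loader;      // async (key) => value (e.g. a DB query)
    this.values = new Map();   // resolved cache entries
    this.inFlight = new Map(); // key -> pending Promise
  }

  async get(key) {
    if (this.values.has(key)) return this.values.get(key);
    if (this.inFlight.has(key)) return this.inFlight.get(key); // join existing load

    const promise = this.loader(key)
      .then((value) => {
        this.values.set(key, value);
        return value;
      })
      .finally(() => this.inFlight.delete(key));
    this.inFlight.set(key, promise);
    return promise;
  }
}

(async () => {
  let dbCalls = 0;
  const cache = new CoalescingCache(async (key) => {
    dbCalls++;
    return `data-for-${key}`;
  });
  await Promise.all([cache.get('hot'), cache.get('hot'), cache.get('hot')]);
  console.log(dbCalls); // 1: three concurrent misses, one DB query
})();
```

Stale-while-revalidate is the complementary approach: serve the expired value immediately and refresh it in the background.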
