Redis & Caching Strategies

📖 Concept

Redis is an in-memory data structure store used as a cache, message broker, and session store. It's one of the most popular caching solutions for Node.js applications.

Why Redis?

  • Sub-millisecond latency — data is in memory
  • Rich data structures — strings, hashes, lists, sets, sorted sets, streams
  • Persistence options — RDB snapshots, AOF (append-only file)
  • Pub/Sub — real-time messaging between services
  • TTL (Time-To-Live) — automatic key expiration

Common caching patterns:

| Pattern | Description | Use Case |
| --- | --- | --- |
| Cache-Aside | App checks cache first, loads from DB on miss | Most common, simple |
| Write-Through | Write to cache AND DB simultaneously | Strong consistency |
| Write-Behind | Write to cache, async write to DB | High write throughput |
| Read-Through | Cache handles DB loading on miss | Transparent to app |
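Cache-Aside is covered in the code example below; Write-Through can be sketched in a few lines. The `db` and `cache` objects here are assumed interfaces (injected so the pattern is easy to test), not part of any specific library:

```javascript
// Write-Through sketch: every write goes to the database AND the cache
// together, so reads never see stale data (at the cost of write latency).
// `db` and `cache` are hypothetical interfaces, injected for testability.
async function writeThrough(cache, db, key, value, ttlSeconds) {
  await db.save(key, value);                               // source of truth first
  await cache.set(key, JSON.stringify(value), ttlSeconds); // keep cache in sync
  return value;
}
```

With a real ioredis client, the `cache.set` call would become `redis.set(key, json, "EX", ttlSeconds)`.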

Cache invalidation strategies:

  • TTL-based — keys expire automatically (simplest)
  • Event-based — invalidate on write/update events
  • Version-based — include version in cache key, change version to invalidate
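The version-based strategy can be sketched with a pure key builder; the namespace and version numbering here are illustrative, not a library convention:

```javascript
// Version-based invalidation: the version number is baked into the key.
// To invalidate everything at once, bump the version; old keys are simply
// never read again and expire on their own via their TTL.
function versionedKey(namespace, version, id) {
  return `${namespace}:v${version}:${id}`;
}

// e.g. after a bulk import, increment a stored version counter with INCR
// instead of deleting thousands of individual keys.
```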

🏠 Real-world analogy: Redis is like a whiteboard next to your desk. When someone asks a question (query), you check the whiteboard first (cache). If the answer is there, instant response (cache hit). If not, you look it up in the filing cabinet (database), answer the question, and write it on the whiteboard for next time (cache set).

💻 Code Example

// Redis & Caching — Production Patterns

const Redis = require("ioredis");

// 1. Redis connection with retry
const redis = new Redis({
  host: process.env.REDIS_HOST || "localhost",
  port: parseInt(process.env.REDIS_PORT || "6379", 10),
  password: process.env.REDIS_PASSWORD,
  maxRetriesPerRequest: 3,
  retryStrategy(times) {
    // Back off linearly, capped at 2 seconds
    const delay = Math.min(times * 50, 2000);
    return delay;
  },
  lazyConnect: true,
});

redis.on("connect", () => console.log("Redis connected"));
redis.on("error", (err) => console.error("Redis error:", err.message));

// 2. Cache-Aside pattern (most common)
async function getUserWithCache(userId) {
  const cacheKey = `user:${userId}`;

  // Check cache
  const cached = await redis.get(cacheKey);
  if (cached) {
    console.log("Cache HIT");
    return JSON.parse(cached);
  }

  // Cache miss — fetch from database
  console.log("Cache MISS");
  // const user = await db.user.findById(userId);
  const user = { id: userId, name: "Alice", email: "alice@example.com" }; // Mock

  // Store in cache with TTL (5 minutes)
  await redis.set(cacheKey, JSON.stringify(user), "EX", 300);

  return user;
}

// 3. Cache middleware for Express
function cacheMiddleware(ttlSeconds = 300) {
  return async (req, res, next) => {
    // Only cache GET requests
    if (req.method !== "GET") return next();

    const cacheKey = `cache:${req.originalUrl}`;

    try {
      const cached = await redis.get(cacheKey);
      if (cached) {
        return res.json(JSON.parse(cached));
      }
    } catch (err) {
      // A cache failure should not fail the request
      console.error("Cache read error:", err.message);
    }

    // Intercept res.json to cache the response on the way out
    const originalJson = res.json.bind(res);
    res.json = (data) => {
      redis.set(cacheKey, JSON.stringify(data), "EX", ttlSeconds).catch(console.error);
      return originalJson(data);
    };

    next();
  };
}

// 4. Cache invalidation on write
async function updateUser(userId, data) {
  // Update database
  // const user = await db.user.update(userId, data);
  const user = { id: userId, ...data };

  // Invalidate cache
  await redis.del(`user:${userId}`);

  // Also invalidate list caches.
  // NOTE: KEYS blocks Redis while it scans every key; acceptable in
  // development, but prefer SCAN in production (see Common Mistakes below).
  const keys = await redis.keys("cache:/api/users*");
  if (keys.length > 0) await redis.del(...keys);

  return user;
}

// 5. Rate limiting with Redis (fixed window)
async function checkRateLimit(clientId, maxRequests = 100, windowSeconds = 60) {
  const key = `ratelimit:${clientId}`;
  const current = await redis.incr(key);

  // First request in the window: start the expiry clock.
  // (For strict atomicity, run INCR + EXPIRE inside a MULTI.)
  if (current === 1) {
    await redis.expire(key, windowSeconds);
  }

  return {
    allowed: current <= maxRequests,
    remaining: Math.max(0, maxRequests - current),
    resetIn: await redis.ttl(key),
  };
}

// 6. Session storage with Redis
async function sessionExample() {
  const sessionId = "sess_abc123";
  const sessionData = {
    userId: 1,
    role: "admin",
    loginTime: Date.now(),
  };

  // Store session (24 hour TTL)
  await redis.set(`session:${sessionId}`, JSON.stringify(sessionData), "EX", 86400);

  // Retrieve session
  const session = JSON.parse(await redis.get(`session:${sessionId}`));

  // Extend session on activity
  await redis.expire(`session:${sessionId}`, 86400);

  // Destroy session (logout)
  await redis.del(`session:${sessionId}`);
}

// 7. Pub/Sub for real-time events
async function pubSubExample() {
  // A connection in subscriber mode cannot run other commands,
  // so use a dedicated duplicate connection for subscribing.
  const subscriber = redis.duplicate();

  subscriber.subscribe("notifications", (err) => {
    if (err) console.error("Subscribe error:", err.message);
  });

  subscriber.on("message", (channel, message) => {
    console.log(`[${channel}] ${message}`);
  });

  // Publish from another part of the app
  await redis.publish("notifications", JSON.stringify({
    type: "order_placed",
    orderId: 123,
  }));
}

module.exports = { redis, cacheMiddleware, checkRateLimit };
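One thing the example above leaves implicit: cache failures should degrade gracefully rather than fail the request. A minimal sketch, with the client and DB loader injected for testability (the names here are illustrative):

```javascript
// Graceful degradation: if Redis is down, fall back to the loader (DB)
// instead of failing the request. Cache errors are logged, never thrown.
async function getWithFallback(client, key, ttlSeconds, loader) {
  try {
    const cached = await client.get(key);
    if (cached) return JSON.parse(cached);
  } catch (err) {
    console.error("Cache read failed, falling back:", err.message);
  }
  const value = await loader(); // source of truth still serves the request
  try {
    await client.set(key, JSON.stringify(value), "EX", ttlSeconds);
  } catch (err) {
    console.error("Cache write failed (ignored):", err.message);
  }
  return value;
}
```

The request path only ever fails if the database itself fails; Redis outages just cost latency.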

🏋️ Practice Exercise

Exercises:

  1. Implement Cache-Aside pattern for a REST API — cache GET responses, invalidate on POST/PUT/DELETE
  2. Build Express middleware that caches API responses in Redis with configurable TTL
  3. Implement rate limiting with Redis — sliding window counter per IP address
  4. Create a session management system using Redis with automatic expiration
  5. Build a leaderboard using Redis sorted sets (ZADD, ZRANGE, ZRANK)
  6. Implement Pub/Sub for a real-time notification system across multiple server instances

⚠️ Common Mistakes

  • Caching everything without a TTL — stale data accumulates and Redis runs out of memory; always set expiration

  • Not handling Redis connection failures gracefully — the app should work (slower) even if Redis is down; wrap cache operations in try/catch

  • Using KEYS command in production — it blocks Redis and scans all keys; use SCAN for production-safe iteration

  • Storing large objects in Redis — Redis is for fast, small data; store large blobs in the filesystem or object storage

  • Not using connection pooling — create one Redis client instance and share it across the app; don't create one per request
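The KEYS warning above can be sketched as a SCAN-based delete helper. With ioredis, `scan(cursor, "MATCH", pattern, "COUNT", n)` resolves to `[nextCursor, keys]`; the client is injected here so the cursor loop is testable:

```javascript
// Production-safe pattern deletion: iterate with SCAN (non-blocking)
// instead of KEYS (which blocks the whole server while scanning every key).
async function deleteByPattern(client, pattern) {
  let cursor = "0";
  let deleted = 0;
  do {
    // SCAN returns [nextCursor, batchOfKeys]; a cursor of "0" means done.
    const [next, keys] = await client.scan(cursor, "MATCH", pattern, "COUNT", 100);
    cursor = next;
    if (keys.length > 0) deleted += await client.del(...keys);
  } while (cursor !== "0");
  return deleted;
}
```

SCAN may return a key more than once across iterations, so treat the count as approximate; the deletions themselves are idempotent.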
