Caching Strategies for Android


📖 Concept

Caching is one of the most impactful optimizations in mobile apps. A well-designed cache strategy reduces network calls, improves perceived performance, and enables offline functionality.

Caching layers in a typical Android app:

1. In-memory cache (fastest, volatile)
   └── LruCache, HashMap, Compose remember
2. Disk cache (persistent, slower)
   └── Room database, DataStore, OkHttp cache, DiskLruCache
3. Network cache (HTTP caching)
   └── Cache-Control headers, ETag, Last-Modified
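Layer 3 can be enabled with OkHttp's built-in HTTP cache. This is a minimal sketch assuming the `okhttp3` dependency; `buildCachedClient` is an illustrative helper, and the 10 MB size is arbitrary (on Android you would pass `context.cacheDir`):

```kotlin
import okhttp3.Cache
import okhttp3.OkHttpClient
import java.io.File

// Once a Cache is installed, OkHttp honors Cache-Control, ETag, and
// Last-Modified automatically, including 304-validated responses from disk.
fun buildCachedClient(cacheDir: File): OkHttpClient =
    OkHttpClient.Builder()
        .cache(Cache(File(cacheDir, "http_cache"), 10L * 1024 * 1024)) // 10 MB
        .build()
```

A Retrofit instance built on this client inherits the HTTP cache with no further configuration.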

Cache invalidation strategies:

  1. Time-based (TTL): Data expires after a fixed duration. Simple but may serve stale data.
  2. Event-based: Server pushes invalidation events. Real-time but requires infrastructure.
  3. Version-based (ETag): Server returns ETag, client sends If-None-Match. 304 = use cache.
  4. Write-through: Write to cache AND source simultaneously. Always consistent.
  5. Write-behind: Write to cache first, batch-sync to source later. Fast writes but risk data loss.
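Strategy 1 (TTL) fits in a few lines. This is a sketch; `CacheEntry` and `isStale` are illustrative names, not from any library:

```kotlin
// TTL-based invalidation: an entry is stale once its age exceeds the TTL.
data class CacheEntry<T>(val value: T, val cachedAtMillis: Long)

fun CacheEntry<*>.isStale(
    ttlMillis: Long,
    nowMillis: Long = System.currentTimeMillis()
): Boolean = nowMillis - cachedAtMillis > ttlMillis
```

A Room entity can carry the same timestamp column so the staleness check survives process death.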

The "stale-while-revalidate" pattern:

  1. Return cached data immediately (fast UX)
  2. Fetch fresh data from network in background
  3. Update cache and notify observers (Flow/LiveData emits new data)
  4. UI updates seamlessly

This is the most common pattern in production Android apps.

💻 Code Example

// Multi-layer caching implementation

// In-memory LRU cache
class InMemoryCache<K, V>(maxSize: Int) {
    private val cache = object : LruCache<K, V>(maxSize) {
        override fun sizeOf(key: K, value: V): Int = 1
    }

    fun get(key: K): V? = cache.get(key)
    fun put(key: K, value: V) = cache.put(key, value)
    fun evict(key: K) = cache.remove(key)
    fun clear() = cache.evictAll()
}

// Repository with stale-while-revalidate pattern
class ArticleRepository @Inject constructor(
    private val api: ArticleApi,
    private val dao: ArticleDao,
    private val memoryCache: InMemoryCache<String, List<Article>>
) {
    fun getArticles(): Flow<List<Article>> = flow {
        // Layer 1: Memory cache (instant)
        memoryCache.get("articles")?.let { emit(it) }

        // Layer 2: Database (fast, persistent)
        val dbArticles = dao.getAll().first().map { it.toDomain() }
        if (dbArticles.isNotEmpty()) {
            emit(dbArticles)
            memoryCache.put("articles", dbArticles)
        }

        // Layer 3: Network (fresh data)
        try {
            val networkArticles = api.getArticles()
            val entities = networkArticles.map { it.toEntity() }
            dao.upsertAll(entities)
            val domainArticles = entities.map { it.toDomain() }
            memoryCache.put("articles", domainArticles)
            emit(domainArticles)
        } catch (e: Exception) {
            if (dbArticles.isEmpty()) throw e
            // Stale data is better than no data
        }
    }.distinctUntilChanged()

    // Cache-aware single item fetch (single items are stored as one-element lists)
    suspend fun getArticle(id: String): Article {
        // Check memory cache first
        memoryCache.get("article_$id")?.firstOrNull()?.let { return it }

        // Then database
        val entity = dao.getById(id)
        if (entity != null && !entity.isStale()) {
            return entity.toDomain()
        }

        // Finally network; repopulate both cache layers
        val fresh = api.getArticle(id).toEntity()
        dao.upsert(fresh)
        val article = fresh.toDomain()
        memoryCache.put("article_$id", listOf(article))
        return article
    }
}

// Image caching with Coil (built-in multi-layer cache)
@Composable
fun CachedImage(url: String, modifier: Modifier = Modifier) {
    AsyncImage(
        model = ImageRequest.Builder(LocalContext.current)
            .data(url)
            .memoryCacheKey(url) // In-memory LRU
            .diskCacheKey(url)   // Disk LRU
            .crossfade(true)
            .placeholder(R.drawable.placeholder)
            .error(R.drawable.error)
            .build(),
        contentDescription = null,
        modifier = modifier
    )
}

🏋️ Practice Exercise

Practice:

  1. Implement a 3-layer cache (memory → disk → network) for an API endpoint
  2. Add TTL-based cache invalidation to your Room entities
  3. Configure OkHttp HTTP caching with Cache-Control headers
  4. Implement the stale-while-revalidate pattern with Flow
  5. Design a cache strategy for an image gallery that handles 10,000+ images

⚠️ Common Mistakes

  • Not caching at all — every screen load hits the network, wasting bandwidth and battery

  • Caching without a size limit — unbounded caches consume all available memory/disk

  • Serving infinitely stale data — cache must have expiration, either time-based or event-based

  • Using in-memory cache without a disk fallback — data lost on process death

  • Not cache-busting on user-triggered refresh — pull-to-refresh should bypass cache
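The last mistake can be avoided with a refresh flag that bypasses both cache layers. This is a sketch with hypothetical names; the maps stand in for real memory/disk layers:

```kotlin
// forceRefresh = true skips both cache layers so pull-to-refresh always
// reaches the network; the fresh result still repopulates the caches.
class CachedFetcher(
    private val network: (String) -> String,
    private val memory: MutableMap<String, String> = mutableMapOf(),
    private val disk: MutableMap<String, String> = mutableMapOf(),
) {
    fun get(key: String, forceRefresh: Boolean = false): String {
        if (!forceRefresh) {
            memory[key]?.let { return it }                 // layer 1: memory
            disk[key]?.let { memory[key] = it; return it } // layer 2: disk
        }
        val fresh = network(key)                           // layer 3: network
        disk[key] = fresh
        memory[key] = fresh
        return fresh
    }
}
```

In a real repository the flag would typically be plumbed from the pull-to-refresh gesture down to the data layer.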
