Database Migrations & Best Practices

📖 Concept

Database migrations are version-controlled changes to your database schema. They ensure every environment (development, staging, production) has the same database structure.

Why migrations matter:

  • Track schema changes in version control (Git)
  • Apply changes consistently across environments
  • Enable rollbacks when deployments fail
  • Document the evolution of your data model

Migration workflow:

1. Modify schema definition (Prisma) or create migration file
2. Generate migration: npx prisma migrate dev --name add_user_bio
3. Review the generated SQL
4. Apply to local database: automatic in dev, manual in production
5. Deploy: npx prisma migrate deploy (production)

Prisma migration commands:

Command                  Purpose                        Environment
-----------------------  -----------------------------  --------------
prisma migrate dev       Create + apply a migration     Development
prisma migrate deploy    Apply pending migrations       Production/CI
prisma migrate reset     Reset database + reapply all   Development
prisma migrate status    Show migration status          Any
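
Prisma names each migration folder with a UTC timestamp prefix plus the value passed to --name (e.g. 20240115120000_add_user_bio_and_role), so folders sort chronologically. A small parser like the one below can be handy in audit or tooling scripts — it is our own illustration, not part of the Prisma API:

```javascript
// Parse a Prisma migration folder name such as
// "20240115120000_add_user_bio_and_role" into its parts.
// Illustrative helper only -- Prisma does not expose this function.
function parseMigrationName(folder) {
  const match = /^(\d{14})_(.+)$/.exec(folder);
  if (!match) return null; // not a Prisma-style migration folder
  const [, timestamp, name] = match;
  return {
    timestamp, // YYYYMMDDHHMMSS -- lexicographic order == chronological order
    name,      // the value passed to --name
  };
}
```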

Best practices:

  1. Never edit applied migrations — create new ones that fix issues
  2. Small, focused migrations — one concern per migration
  3. Test rollbacks — can you undo this migration safely?
  4. Data migrations — separate from schema migrations; run as scripts
  5. Zero-downtime migrations — add nullable columns, backfill data, then add constraints
  6. Backup before production migrations — always have a rollback plan
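
Best practice 5 can be sketched as three separate migrations, each shipped as its own deploy. The SQL strings below are illustrative PostgreSQL; the "users"/"role" names are hypothetical:

```javascript
// Zero-downtime column addition in three phases (illustrative SQL strings).
// Each phase is a separate migration, deployed and verified before the next.
const zeroDowntimePhases = [
  // Phase 1: add the column as nullable -- instant in PostgreSQL, no table rewrite
  `ALTER TABLE users ADD COLUMN role TEXT;`,
  // Phase 2: backfill existing rows (batched in practice to avoid long locks)
  `UPDATE users SET role = 'USER' WHERE role IS NULL;`,
  // Phase 3: only after every row has a value, enforce the constraint
  `ALTER TABLE users ALTER COLUMN role SET NOT NULL;`,
];
```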

🏠 Real-world analogy: Database migrations are like renovation blueprints for a building. Each blueprint (migration) documents a specific change. You apply them in order, and if something goes wrong, you have the old blueprints to revert. Never tear down a wall (drop a column) without a new blueprint.

💻 Code Example

// Database Migrations & Best Practices

// === Prisma Migration Workflow ===

// Step 1: Modify schema.prisma
// model User {
//   id    Int     @id @default(autoincrement())
//   email String  @unique
//   name  String
//   bio   String? // ← NEW: Added bio field
//   role  Role    @default(USER) // ← NEW: Added role enum
// }

// Step 2: Create migration
// $ npx prisma migrate dev --name add_user_bio_and_role
// Creates: prisma/migrations/20240115120000_add_user_bio_and_role/migration.sql

// === Generated SQL (for reference) ===
// ALTER TABLE "users" ADD COLUMN "bio" TEXT;
// CREATE TYPE "Role" AS ENUM ('USER', 'ADMIN', 'MODERATOR');
// ALTER TABLE "users" ADD COLUMN "role" "Role" NOT NULL DEFAULT 'USER';

// === Zero-Downtime Migration Strategy ===
// Phase 1: Add column (nullable, no constraint change)
// Phase 2: Backfill data in batches
// Phase 3: Add constraint / set NOT NULL

// Data migration script (run separately from the schema migration;
// assumes `role` is still nullable at this phase)
async function backfillUserRoles(prisma) {
  const BATCH_SIZE = 1000;
  let processed = 0;

  while (true) {
    const users = await prisma.user.findMany({
      where: { role: null },
      take: BATCH_SIZE,
      select: { id: true },
    });

    if (users.length === 0) break;

    await prisma.user.updateMany({
      where: { id: { in: users.map((u) => u.id) } },
      data: { role: "USER" },
    });

    processed += users.length;
    console.log(`Backfilled ${processed} users`);
  }

  console.log(`Backfill complete: ${processed} total`);
}

// === Knex.js Migration Example (alternative to Prisma) ===

// Migration file: 20240115_create_users.js
const knexMigration = {
  up: async function (knex) {
    // Create table
    await knex.schema.createTable("users", (table) => {
      table.increments("id").primary();
      table.string("email").notNullable().unique();
      table.string("name").notNullable();
      table.string("password").notNullable();
      table.enum("role", ["user", "admin", "moderator"]).defaultTo("user");
      table.text("bio");
      table.boolean("is_active").defaultTo(true);
      table.timestamps(true, true); // created_at, updated_at
    });

    // Add indexes
    await knex.schema.alterTable("users", (table) => {
      table.index(["email"]);
      table.index(["is_active", "created_at"]);
    });
  },

  down: async function (knex) {
    await knex.schema.dropTableIfExists("users");
  },
};

// Seed file: seed_users.js
async function seedUsers(knex) {
  await knex("users").del(); // Clear existing rows

  await knex("users").insert([
    { name: "Alice Admin", email: "alice@example.com", password: "hashed", role: "admin" },
    { name: "Bob User", email: "bob@example.com", password: "hashed", role: "user" },
  ]);
}

// === Connection String Patterns ===
const connectionPatterns = {
  // Local development
  development: "postgresql://postgres:password@localhost:5432/myapp_dev",

  // Docker Compose
  docker: "postgresql://postgres:password@db:5432/myapp",

  // Production (with SSL + connection pooling)
  production: "postgresql://user:pass@host:5432/myapp?sslmode=require&connection_limit=10",

  // MongoDB
  mongodb: "mongodb://user:pass@host:27017/myapp?retryWrites=true&w=majority",

  // Redis
  redis: "redis://:password@host:6379/0",
};

module.exports = { backfillUserRoles, knexMigration, seedUsers, connectionPatterns };
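
Because these connection strings follow standard URL syntax, Node's built-in WHATWG URL class can pull them apart — useful for logging the host and database without leaking credentials. The summary shape below is our own illustration:

```javascript
// Parse a database URL into a credential-free summary for logs.
// Uses Node's built-in URL class; no database driver required.
function describeConnection(connectionString) {
  const url = new URL(connectionString);
  return {
    protocol: url.protocol.replace(/:$/, ""),      // e.g. "postgresql"
    host: url.hostname,
    port: url.port,
    database: url.pathname.replace(/^\//, ""),     // strip the leading slash
    options: Object.fromEntries(url.searchParams), // e.g. { sslmode: "require" }
  };
}
```

Note that the password is deliberately left out of the returned object, so the result is safe to log.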

🏋️ Practice Exercise

Exercises:

  1. Create a Prisma migration that adds a new table with relationships to existing tables
  2. Write a data migration script that backfills a new column in batches of 1000 records
  3. Implement a zero-downtime migration: add a nullable column, backfill, then set NOT NULL
  4. Create a seed script that populates the database with realistic test data
  5. Simulate a failed migration and practice rolling back safely
  6. Set up a migration pipeline: run migrations in CI before deploying the application

⚠️ Common Mistakes

  • Running destructive migrations (DROP COLUMN, DROP TABLE) without backup — always have a rollback plan and database backup

  • Editing already-applied migrations — this causes drift between environments; create new migrations to fix issues

  • Running large data migrations in a single transaction — this locks the table; batch updates in chunks

  • Not testing migrations with production-like data volumes — a migration that takes 2 seconds on 100 rows may take 2 hours on 10M rows

  • Renaming or dropping columns in a single step (non-backward-compatible changes) — the currently running code still references the old name and breaks; use expand/contract instead: add the new column, backfill it, deploy code that reads it, then drop the old column
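
The data-volume warning above can be made concrete with a back-of-the-envelope extrapolation. The helper below assumes linear scaling, which is optimistic — index rebuilds and lock contention usually make large tables slower per row — so treat the result as a lower bound (the function is our own illustration):

```javascript
// Rough lower-bound estimate of migration duration at production scale,
// assuming duration scales linearly with row count (real migrations often do worse).
function estimateSeconds(sampleRows, sampleSeconds, targetRows) {
  return (sampleSeconds / sampleRows) * targetRows;
}

// 2 seconds on 100 rows extrapolates to 200,000 s (~55 hours) on 10M rows --
// far longer than any maintenance window, so batch the work and measure first.
```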
