Database Migrations & Best Practices
📖 Concept
Database migrations are version-controlled changes to your database schema. They ensure every environment (development, staging, production) has the same database structure.
Why migrations matter:
- Track schema changes in version control (Git)
- Apply changes consistently across environments
- Enable rollbacks when deployments fail
- Document the evolution of your data model
Migration workflow:
1. Modify schema definition (Prisma) or create migration file
2. Generate migration: npx prisma migrate dev --name add_user_bio
3. Review the generated SQL
4. Apply to local database: automatic in dev, manual in production
5. Deploy: npx prisma migrate deploy (production)
Prisma migration commands:
| Command | Purpose | Environment |
|---|---|---|
| prisma migrate dev | Create + apply migration | Development |
| prisma migrate deploy | Apply pending migrations | Production/CI |
| prisma migrate reset | Reset database + apply all | Development |
| prisma migrate status | Show migration status | Any |
Best practices:
- Never edit applied migrations — create new ones that fix issues
- Small, focused migrations — one concern per migration
- Test rollbacks — can you undo this migration safely?
- Data migrations — separate from schema migrations; run as scripts
- Zero-downtime migrations — add nullable columns, backfill data, then add constraints
- Backup before production migrations — always have a rollback plan
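The batched-backfill idea behind the zero-downtime bullet above can be sketched in plain JavaScript, using an in-memory array in place of a real table (the `chunk` and `backfillInBatches` names are illustrative helpers, not part of any migration tool):

```javascript
// Split an array of IDs into fixed-size chunks, mirroring the
// "backfill data in batches" phase of a zero-downtime migration.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Simulated backfill: set a default role on every row that lacks one,
// one batch at a time, so a real database never holds a long lock.
function backfillInBatches(rows, batchSize) {
  const missing = rows.filter((r) => r.role == null).map((r) => r.id);
  for (const batch of chunk(missing, batchSize)) {
    for (const id of batch) {
      rows.find((r) => r.id === id).role = "USER";
    }
  }
  return rows;
}

// Usage: three rows, batch size 2 → rows without a role are filled in.
const demoRows = [
  { id: 1, role: null },
  { id: 2, role: "ADMIN" },
  { id: 3, role: null },
];
backfillInBatches(demoRows, 2);
console.log(demoRows.every((r) => r.role !== null)); // true
```

Only after every batch succeeds would the final migration set the column NOT NULL.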
🏠 Real-world analogy: Database migrations are like renovation blueprints for a building. Each blueprint (migration) documents a specific change. You apply them in order, and if something goes wrong, you have the old blueprints to revert. Never tear down a wall (drop a column) without a new blueprint.
💻 Code Example
```javascript
// Database Migrations & Best Practices

// === Prisma Migration Workflow ===

// Step 1: Modify schema.prisma
// model User {
//   id    Int     @id @default(autoincrement())
//   email String  @unique
//   name  String
//   bio   String? // ← NEW: Added bio field
//   role  Role    @default(USER) // ← NEW: Added role enum
// }

// Step 2: Create migration
// $ npx prisma migrate dev --name add_user_bio_and_role
// Creates: prisma/migrations/20240115120000_add_user_bio_and_role/migration.sql

// === Generated SQL (for reference) ===
// ALTER TABLE "users" ADD COLUMN "bio" TEXT;
// CREATE TYPE "Role" AS ENUM ('USER', 'ADMIN', 'MODERATOR');
// ALTER TABLE "users" ADD COLUMN "role" "Role" DEFAULT 'USER';

// === Zero-Downtime Migration Strategy ===
// Phase 1: Add column (nullable, no constraint change)
// Phase 2: Backfill data in batches
// Phase 3: Add constraint / set NOT NULL

// Data migration script (run separately from schema migration)
async function backfillUserRoles(prisma) {
  const BATCH_SIZE = 1000;
  let processed = 0;

  while (true) {
    const users = await prisma.user.findMany({
      where: { role: null },
      take: BATCH_SIZE,
      select: { id: true },
    });

    if (users.length === 0) break;

    await prisma.user.updateMany({
      where: { id: { in: users.map((u) => u.id) } },
      data: { role: "USER" },
    });

    processed += users.length;
    console.log(`Backfilled ${processed} users`);
  }

  console.log(`Backfill complete: ${processed} total`);
}

// === Knex.js Migration Example (alternative to Prisma) ===

// Migration file: 20240115_create_users.js
const knexMigration = {
  up: async function (knex) {
    // Create table
    await knex.schema.createTable("users", (table) => {
      table.increments("id").primary();
      table.string("email").notNullable().unique();
      table.string("name").notNullable();
      table.string("password").notNullable();
      table.enum("role", ["user", "admin", "moderator"]).defaultTo("user");
      table.text("bio");
      table.boolean("is_active").defaultTo(true);
      table.timestamps(true, true); // created_at, updated_at
    });

    // Add indexes
    await knex.schema.alterTable("users", (table) => {
      table.index(["email"]);
      table.index(["is_active", "created_at"]);
    });
  },

  down: async function (knex) {
    await knex.schema.dropTableIfExists("users");
  },
};

// Seed file: seed_users.js
async function seedUsers(knex) {
  await knex("users").del(); // Clear existing

  await knex("users").insert([
    { name: "Alice Admin", email: "alice@example.com", password: "hashed", role: "admin" },
    { name: "Bob User", email: "bob@example.com", password: "hashed", role: "user" },
  ]);
}

// === Connection String Patterns ===
const connectionPatterns = {
  // Local development
  development: "postgresql://postgres:password@localhost:5432/myapp_dev",

  // Docker Compose
  docker: "postgresql://postgres:password@db:5432/myapp",

  // Production (with SSL + connection pooling)
  production: "postgresql://user:pass@host:5432/myapp?sslmode=require&connection_limit=10",

  // MongoDB
  mongodb: "mongodb://user:pass@host:27017/myapp?retryWrites=true&w=majority",

  // Redis
  redis: "redis://:password@host:6379/0",
};

module.exports = { backfillUserRoles, connectionPatterns };
```
🏋️ Practice Exercise
Exercises:
- Create a Prisma migration that adds a new table with relationships to existing tables
- Write a data migration script that backfills a new column in batches of 1000 records
- Implement a zero-downtime migration: add a nullable column, backfill, then set NOT NULL
- Create a seed script that populates the database with realistic test data
- Simulate a failed migration and practice rolling back safely
- Set up a migration pipeline: run migrations in CI before deploying the application
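For the last exercise, one possible shape of the pipeline is sketched below, assuming GitHub Actions and a `DATABASE_URL` repository secret (the job name and deploy script are illustrative placeholders):

```yaml
# Run pending migrations before deploying the application.
jobs:
  migrate-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm ci
      # Apply pending migrations (never `migrate dev` in CI)
      - run: npx prisma migrate deploy
        env:
          DATABASE_URL: ${{ secrets.DATABASE_URL }}
      # Deploy only if migrations succeeded
      - run: npm run deploy
```

Ordering migrations before the deploy step means a failed migration halts the pipeline before any new application code ships.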
⚠️ Common Mistakes
- Running destructive migrations (DROP COLUMN, DROP TABLE) without backup — always have a rollback plan and a database backup
- Editing already-applied migrations — this causes drift between environments; create new migrations to fix issues
- Running large data migrations in a single transaction — this locks the table; batch updates in chunks
- Not testing migrations with production-like data volumes — a migration that takes 2 seconds on 100 rows may take 2 hours on 10M rows
- Making non-backward-compatible changes in one step — if you rename a column outright, either the old or the new code breaks; use expand-and-contract: add the new column, deploy code that handles both, backfill, then drop the old column
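The first mistake above can be caught mechanically. A minimal sketch of a pre-flight check that scans generated SQL for destructive statements before applying it (the `isDestructive` helper is illustrative, not part of Prisma or Knex):

```javascript
// Flag SQL that drops tables/columns or truncates data, so destructive
// migrations can be blocked in CI unless explicitly acknowledged.
const DESTRUCTIVE_PATTERNS = [
  /\bDROP\s+TABLE\b/i,
  /\bDROP\s+COLUMN\b/i,
  /\bTRUNCATE\b/i,
];

function isDestructive(sql) {
  return DESTRUCTIVE_PATTERNS.some((pattern) => pattern.test(sql));
}

// Usage: additive changes pass, destructive ones are flagged.
console.log(isDestructive('ALTER TABLE "users" ADD COLUMN "bio" TEXT;')); // false
console.log(isDestructive('ALTER TABLE "users" DROP COLUMN "bio";'));     // true
```

Such a check is a safety net, not a substitute for reviewing the generated SQL and taking a backup.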