Building a Database Migration Agent: AI-Powered Schema Evolution
Learn to build an AI agent that generates safe database migrations from natural language requirements. Covers schema analysis, migration generation, safety checks, rollback planning, and testing strategies.
Why Database Migrations Are Dangerous
Database migrations are among the riskiest operations in software development. A bad migration can cause data loss, extended downtime, or cascading failures across services. Unlike code deployments, you cannot simply roll back a database change if data has already been transformed.
An AI-powered migration agent reduces this risk by analyzing the current schema, generating safe migration SQL, producing rollback scripts, and validating everything against a test database before it touches production.
The Migration Agent Architecture
The agent takes a natural language description of the desired schema change and the current database schema, then produces a complete migration package: the forward migration, a rollback script, and a test plan.
from dataclasses import dataclass
from openai import OpenAI
import psycopg2

client = OpenAI()

@dataclass
class MigrationPlan:
    description: str
    up_sql: str
    down_sql: str
    is_destructive: bool
    estimated_lock_time: str
    warnings: list[str]
class DatabaseMigrationAgent:
    def __init__(self, connection_string: str, model: str = "gpt-4o"):
        self.connection_string = connection_string
        self.model = model

    def get_current_schema(self) -> str:
        conn = psycopg2.connect(self.connection_string)
        cursor = conn.cursor()
        cursor.execute("""
            SELECT table_name, column_name, data_type,
                   is_nullable, column_default
            FROM information_schema.columns
            WHERE table_schema = 'public'
            ORDER BY table_name, ordinal_position
        """)
        rows = cursor.fetchall()
        conn.close()

        # Render the rows as a compact text summary the model can read.
        schema_text = ""
        current_table = ""
        for table, col, dtype, nullable, default in rows:
            if table != current_table:
                schema_text += f"\nTABLE {table}:\n"
                current_table = table
            null_str = "NULL" if nullable == "YES" else "NOT NULL"
            default_str = f" DEFAULT {default}" if default else ""
            schema_text += f"  {col} {dtype} {null_str}{default_str}\n"
        return schema_text
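To see what the model actually receives, here is the same formatting loop applied to a few hand-written rows shaped like `information_schema.columns` results (the `posts` table and its columns are made up for illustration):

```python
# Hypothetical rows in information_schema.columns order:
# (table_name, column_name, data_type, is_nullable, column_default)
rows = [
    ("posts", "id", "integer", "NO", "nextval('posts_id_seq')"),
    ("posts", "title", "text", "NO", None),
    ("posts", "created_at", "timestamp", "YES", "now()"),
]

schema_text = ""
current_table = ""
for table, col, dtype, nullable, default in rows:
    if table != current_table:
        schema_text += f"\nTABLE {table}:\n"
        current_table = table
    null_str = "NULL" if nullable == "YES" else "NOT NULL"
    default_str = f" DEFAULT {default}" if default else ""
    schema_text += f"  {col} {dtype} {null_str}{default_str}\n"

print(schema_text)
# TABLE posts:
#   id integer NOT NULL DEFAULT nextval('posts_id_seq')
#   title text NOT NULL
#   created_at timestamp NULL DEFAULT now()
```

This plain-text format is deliberate: it is far cheaper in tokens than raw JSON metadata and easy for the model to scan table by table.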
Generating Safe Migrations
The core generation step includes safety constraints that prevent common migration pitfalls like dropping columns with data or adding NOT NULL columns without defaults.
def generate_migration(self, requirement: str) -> MigrationPlan:
    import json

    schema = self.get_current_schema()
    system_prompt = """You are a senior database engineer. Generate a
PostgreSQL migration based on the requirement and current schema.

SAFETY RULES:
- NEVER drop a column or table without explicit confirmation
- Adding NOT NULL columns MUST include a DEFAULT value
- Large table alterations should use CREATE INDEX CONCURRENTLY
- Prefer ADD COLUMN over recreating tables
- Include appropriate locks and transaction handling

Return a JSON object with these fields:
- "description": what the migration does
- "up_sql": the forward migration SQL
- "down_sql": the rollback SQL
- "is_destructive": boolean
- "estimated_lock_time": human-readable estimate
- "warnings": list of risk factors

Output ONLY valid JSON."""

    response = client.chat.completions.create(
        model=self.model,
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": (
                f"Current schema:\n{schema}\n\n"
                f"Requirement: {requirement}"
            )},
        ],
        temperature=0,
        response_format={"type": "json_object"},
    )
    data = json.loads(response.choices[0].message.content)
    return MigrationPlan(**data)
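To make the expected shape concrete, here is a hand-written example of the JSON the prompt asks for, parsed into the dataclass the same way the method does (the dataclass is repeated so the snippet runs on its own; the `email_verified` migration is illustrative, not real model output):

```python
import json
from dataclasses import dataclass

@dataclass
class MigrationPlan:
    description: str
    up_sql: str
    down_sql: str
    is_destructive: bool
    estimated_lock_time: str
    warnings: list[str]

# Hand-written stand-in for a model response.
raw = """{
  "description": "Add an email_verified flag to users",
  "up_sql": "ALTER TABLE users ADD COLUMN email_verified boolean NOT NULL DEFAULT false;",
  "down_sql": "ALTER TABLE users DROP COLUMN email_verified;",
  "is_destructive": false,
  "estimated_lock_time": "under 1 second (metadata-only change)",
  "warnings": ["down migration drops the column and its data"]
}"""

plan = MigrationPlan(**json.loads(raw))
```

Because the field names in the prompt match the dataclass exactly, `MigrationPlan(**data)` works without any mapping code. The trade-off: if the model adds or renames a field, construction raises a `TypeError`, which is a useful early failure.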
Safety Validation
Before executing any migration, the agent runs it against a test database and checks for common problems.
def validate_migration(self, plan: MigrationPlan) -> dict:
    issues = []
    up_sql_lower = plan.up_sql.lower()

    if "drop table" in up_sql_lower:
        issues.append("CRITICAL: Migration drops a table")
    if "drop column" in up_sql_lower:
        issues.append("WARNING: Migration drops a column")
    if "not null" in up_sql_lower and "default" not in up_sql_lower:
        issues.append("CRITICAL: NOT NULL column without DEFAULT")
    if "alter type" in up_sql_lower:
        issues.append("WARNING: Column type change may lock table")

    test_result = self._test_on_clone(plan)
    if not test_result["success"]:
        issues.append(f"EXECUTION ERROR: {test_result['error']}")

    return {
        "valid": len([i for i in issues if "CRITICAL" in i]) == 0,
        "issues": issues,
        "test_result": test_result,
    }
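The string checks can be exercised without a database. Here is a hypothetical destructive migration run through the same lint logic, tripping two of the four rules:

```python
# Hypothetical up_sql that should fail review: it drops a column
# and adds a NOT NULL column with no DEFAULT.
up_sql = (
    "ALTER TABLE users DROP COLUMN legacy_flag;\n"
    "ALTER TABLE users ADD COLUMN notes text NOT NULL;"
)

issues = []
s = up_sql.lower()
if "drop table" in s:
    issues.append("CRITICAL: Migration drops a table")
if "drop column" in s:
    issues.append("WARNING: Migration drops a column")
if "not null" in s and "default" not in s:
    issues.append("CRITICAL: NOT NULL column without DEFAULT")
if "alter type" in s:
    issues.append("WARNING: Column type change may lock table")

print(issues)
# ['WARNING: Migration drops a column', 'CRITICAL: NOT NULL column without DEFAULT']
```

These substring checks are intentionally coarse: they can false-positive (a `DEFAULT` anywhere in the script silences the NOT NULL rule), so treat them as a first filter, not a verdict.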
def _test_on_clone(self, plan: MigrationPlan) -> dict:
    conn = None
    try:
        conn = psycopg2.connect(self.connection_string)
        conn.autocommit = False
        cursor = conn.cursor()
        cursor.execute(plan.up_sql)    # forward migration
        cursor.execute(plan.down_sql)  # rollback script
        conn.rollback()  # discard everything the test did
        return {"success": True, "error": None}
    except Exception as e:
        return {"success": False, "error": str(e)}
    finally:
        if conn is not None:
            conn.close()  # close even when execution fails
The _test_on_clone method runs the forward and rollback migrations inside a single transaction that is always rolled back. This works because PostgreSQL supports transactional DDL: even ALTER TABLE and CREATE INDEX can be undone by a rollback. It verifies that both scripts execute without errors and that the rollback actually reverses the forward migration. Point connection_string at a test database or clone rather than production, since even a rolled-back transaction briefly holds the same locks as a real one.
Putting It All Together
agent = DatabaseMigrationAgent("postgresql://user:pass@localhost/mydb")

plan = agent.generate_migration(
    "Add a tags column to the posts table as a text array, "
    "and create a GIN index for fast tag searches"
)
validation = agent.validate_migration(plan)

print(f"Destructive: {plan.is_destructive}")
print(f"Lock time: {plan.estimated_lock_time}")
print(f"Valid: {validation['valid']}")
for issue in validation["issues"]:
    print(f"  - {issue}")
print(f"\nUP:\n{plan.up_sql}")
print(f"\nDOWN:\n{plan.down_sql}")
FAQ
How does the agent handle migrations on tables with millions of rows?
The agent is prompted to use non-blocking operations like CREATE INDEX CONCURRENTLY and to avoid ALTER TABLE ... ADD COLUMN ... NOT NULL without defaults on large tables. For data backfills, it generates batched update scripts instead of single statements that would lock the entire table.
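A batched backfill can be as simple as generating keyset-windowed UPDATE statements. A minimal sketch (the `posts` table, `tags` column, and id range are hypothetical; in practice you would commit and pause between batches):

```python
def batched_backfill_sql(table: str, set_clause: str, max_id: int,
                         pk: str = "id", batch_size: int = 10_000):
    """Yield UPDATE statements that each touch at most batch_size rows,
    so no single statement holds row locks across the whole table."""
    for start in range(0, max_id, batch_size):
        yield (
            f"UPDATE {table} SET {set_clause} "
            f"WHERE {pk} > {start} AND {pk} <= {start + batch_size};"
        )

# '{}' is PostgreSQL's empty-array literal for a text[] column.
stmts = list(batched_backfill_sql("posts", "tags = '{}'", max_id=25_000))
# 3 statements, covering id windows (0,10000], (10000,20000], (20000,30000]
```

Walking the primary key keeps each statement's lock footprint small and lets the backfill resume from the last completed window after an interruption.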
Should I trust AI-generated migrations in production?
Never run AI-generated migrations directly in production without human review. Use the agent to generate a first draft and validate it automatically, then have a senior engineer review the output before applying. The agent eliminates the blank-page problem and catches obvious mistakes, but human judgment is still essential.
Can this work with ORMs like SQLAlchemy or Prisma?
Yes. Instead of generating raw SQL, you can prompt the agent to generate Alembic migration files for SQLAlchemy or Prisma schema changes. Feed it the current ORM schema definition instead of raw SQL metadata. The validation step would then run the ORM migration tool in dry-run mode.
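For the SQLAlchemy case, Alembic's offline mode provides the dry run: `alembic upgrade head --sql` renders the SQL a migration would execute without touching the database. A sketch of wrapping that for the validation step (assumes an alembic.ini in the working directory):

```python
import subprocess

def alembic_dry_run_cmd(revision: str = "head") -> list[str]:
    # --sql switches Alembic to offline mode: SQL is printed, not executed.
    return ["alembic", "upgrade", revision, "--sql"]

def alembic_dry_run(revision: str = "head") -> str:
    result = subprocess.run(
        alembic_dry_run_cmd(revision),
        capture_output=True, text=True, check=True,
    )
    return result.stdout  # the rendered migration SQL

cmd = alembic_dry_run_cmd()
```

The rendered SQL can then be fed through the same string checks as validate_migration before anyone runs `alembic upgrade` for real.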
#DatabaseMigrations #AIAgents #Python #PostgreSQL #SchemaManagement #AgenticAI #LearnAI #AIEngineering
CallSphere Team