
Building an MCP Server in Python: Exposing Database Tools to AI Agents

Build a production-ready MCP server in Python using FastMCP that exposes database query and mutation tools to any AI agent, complete with input validation, error handling, and async database access.

From Concept to Running Server

The fastest way to build an MCP server in Python is with the mcp package and its FastMCP class. FastMCP handles all the JSON-RPC plumbing, transport setup, and schema generation automatically. You define Python functions, decorate them as tools, and FastMCP exposes them over the MCP protocol.

In this tutorial, we will build an MCP server that gives AI agents the ability to query a SQLite database, insert records, and list tables — a practical foundation you can extend to PostgreSQL, MySQL, or any other database.

Setting Up the Project

Install the dependencies:

# requirements.txt
mcp>=1.0.0
aiosqlite>=0.20.0

Create the server file:

# db_server.py
from mcp.server.fastmcp import FastMCP
import aiosqlite
import json

DATABASE_PATH = "app.db"

mcp_server = FastMCP(
    name="DatabaseServer",
    instructions="Query and manage the application database. "
    "Use list_tables to discover schema, query_db for reads, "
    "and insert_record for writes.",
)

The instructions parameter tells connected AI agents how to use this server. It appears in the server metadata during capability negotiation.
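If you don't already have an app.db to work against, a one-off seeding script is handy for testing. This is just a sketch: it uses the synchronous sqlite3 module (fine for a setup script that runs once), and the users table with its columns is an invented example, not part of the server above.

```python
# seed_db.py -- one-off setup script to create a sample database.
# The "users" table is only an example; adapt it to your own schema.
import sqlite3


def seed(path: str = "app.db") -> None:
    conn = sqlite3.connect(path)
    try:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS users ("
            "id INTEGER PRIMARY KEY, "
            "name TEXT NOT NULL, "
            "email TEXT)"
        )
        conn.execute(
            "INSERT INTO users (name, email) VALUES (?, ?)",
            ("Ada", "ada@example.com"),
        )
        conn.commit()
    finally:
        conn.close()


if __name__ == "__main__":
    seed()
```

Run it once with python seed_db.py before starting the server.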

Defining Database Tools

Each tool is an async Python function decorated with @mcp_server.tool(). FastMCP inspects the function signature and docstring to auto-generate the JSON schema that agents use to understand the tool:

@mcp_server.tool()
async def list_tables() -> str:
    """List all tables in the database with their column names and types."""
    async with aiosqlite.connect(DATABASE_PATH) as db:
        cursor = await db.execute(
            "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
        )
        tables = await cursor.fetchall()

        result = {}
        for (table_name,) in tables:
            # table_name comes from sqlite_master above, not from user
            # input, so interpolating it into the PRAGMA is safe here.
            col_cursor = await db.execute(
                f"PRAGMA table_info({table_name})"
            )
            columns = await col_cursor.fetchall()
            result[table_name] = [
                {"name": col[1], "type": col[2], "nullable": not col[3]}
                for col in columns
            ]

        return json.dumps(result, indent=2)


@mcp_server.tool()
async def query_db(sql: str, params: list[str] | None = None) -> str:
    """Execute a read-only SQL query and return results as JSON.

    Args:
        sql: A SELECT SQL query. Only read operations are allowed.
        params: Optional list of query parameters for parameterized queries.
    """
    normalized = sql.strip().upper()
    if not normalized.startswith("SELECT"):
        return json.dumps({"error": "Only SELECT queries are allowed"})

    async with aiosqlite.connect(DATABASE_PATH) as db:
        db.row_factory = aiosqlite.Row
        try:
            cursor = await db.execute(sql, params or [])
            rows = await cursor.fetchall()
            columns = [desc[0] for desc in cursor.description]
            results = [dict(zip(columns, row)) for row in rows]
            return json.dumps({
                "columns": columns,
                "row_count": len(results),
                "rows": results[:100],  # Cap at 100 rows
            }, indent=2, default=str)
        except Exception as e:
            return json.dumps({"error": str(e)})

Notice the query_db tool validates that only SELECT statements are allowed. This is a critical safety measure — you do not want an AI agent running DROP TABLE through a read tool.
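The startswith check is deliberately simple, and it has gaps: it does not reject SQL comments, and it would happily pass a string containing a second statement after a semicolon (although sqlite3's execute() refuses multi-statement strings anyway). One slightly stricter heuristic, written as a standalone helper you could call from query_db, might look like this. It is still not a SQL parser; in production, pair any string check with a read-only connection or database user.

```python
def is_read_only(sql: str) -> bool:
    """Heuristic guard: accept a single plain SELECT statement.

    Not a SQL parser. In production, combine this with a read-only
    database connection or user as the real line of defense.
    """
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # more than one statement smuggled in
        return False
    if "--" in stripped or "/*" in stripped:  # comments can hide intent
        return False
    return stripped.upper().startswith("SELECT")
```

Note that this still rejects WITH ... SELECT common table expressions; that is intentional in SQLite, where a WITH prefix can also front INSERT, UPDATE, and DELETE statements.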


Adding Write Operations with Validation

For mutations, create a separate tool with explicit parameter validation:

@mcp_server.tool()
async def insert_record(
    table: str,
    data: dict[str, str | int | float | None],
) -> str:
    """Insert a single record into a table.

    Args:
        table: The table name to insert into.
        data: A dictionary of column names to values.
    """
    # Validate the table and column names to prevent injection --
    # identifiers cannot be passed as query parameters, so they must
    # be checked before interpolation into the SQL string.
    if not table.isidentifier():
        return json.dumps({"error": "Invalid table name"})
    if not all(col.isidentifier() for col in data):
        return json.dumps({"error": "Invalid column name"})

    columns = list(data.keys())
    placeholders = ", ".join(["?"] * len(columns))
    col_names = ", ".join(columns)
    values = list(data.values())

    async with aiosqlite.connect(DATABASE_PATH) as db:
        try:
            cursor = await db.execute(
                f"INSERT INTO {table} ({col_names}) VALUES ({placeholders})",
                values,
            )
            await db.commit()
            return json.dumps({
                "success": True,
                "rowid": cursor.lastrowid,
            })
        except Exception as e:
            return json.dumps({"error": str(e)})

Running the Server

FastMCP servers can run over stdio (for local agent integrations) or HTTP (for remote access):

# Run over stdio (default for local tools)
if __name__ == "__main__":
    mcp_server.run(transport="stdio")

For HTTP transport, switch to streamable HTTP. Note that host and port are settings on the FastMCP constructor (for example FastMCP(name="DatabaseServer", host="0.0.0.0", port=8001)), not arguments to run():

if __name__ == "__main__":
    mcp_server.run(transport="streamable-http")

Connecting from an AI Agent

With the OpenAI Agents SDK, connecting to this server takes just a few lines:

from agents.mcp import MCPServerStdio

db_server = MCPServerStdio(
    name="Database",
    params={
        "command": "python",
        "args": ["db_server.py"],
    },
    cache_tools_list=True,
)

The agent can now call list_tables, query_db, and insert_record as naturally as calling any other function. The MCP protocol handles all serialization, validation, and transport.

Production Hardening Checklist

Before deploying a database MCP server to production, ensure you address these concerns:

1. Use a connection pool instead of opening a new connection per tool call.
2. Add query timeouts so long-running queries cannot block the server.
3. Implement row-level security or restrict which tables the agent can access.
4. Log every tool invocation with the query text and caller identity for auditing.
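The connection-pool item can be sketched with the standard library alone: a fixed set of sqlite3 connections handed out through an asyncio.Queue, with blocking calls pushed onto worker threads via asyncio.to_thread. This is an illustrative minimum, not a production pool; real deployments would add timeouts, health checks, and graceful shutdown, and the SQLitePool name and size default are my own invention.

```python
import asyncio
import sqlite3


class SQLitePool:
    """Minimal async connection pool sketch for SQLite.

    Connections are created up front and borrowed through an
    asyncio.Queue; sqlite3 calls run in worker threads so the event
    loop is never blocked. Illustrative only.
    """

    def __init__(self, path: str, size: int = 4):
        self._queue: asyncio.Queue[sqlite3.Connection] = asyncio.Queue()
        for _ in range(size):
            # check_same_thread=False because to_thread may use any
            # worker thread to touch the connection.
            conn = sqlite3.connect(path, check_same_thread=False)
            self._queue.put_nowait(conn)

    async def fetchall(self, sql: str, params=()) -> list[tuple]:
        conn = await self._queue.get()  # borrow a connection
        try:
            return await asyncio.to_thread(
                lambda: conn.execute(sql, params).fetchall()
            )
        finally:
            self._queue.put_nowait(conn)  # return it to the pool
```

A tool body would then call await pool.fetchall(sql, params) instead of opening its own connection.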

FAQ

Can I use PostgreSQL or MySQL instead of SQLite?

Absolutely. Replace aiosqlite with asyncpg for PostgreSQL or aiomysql for MySQL. The tool function signatures stay the same — only the internal connection and query logic changes. The MCP protocol does not care what database backs your tools.

How does FastMCP generate the tool schema?

FastMCP inspects the function's type annotations and docstring. The parameter names, types, and descriptions from the docstring become the JSON Schema that agents see during tools/list. If you use Annotated types with Field metadata, FastMCP includes those constraints (min, max, pattern) in the schema.
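As a rough stdlib-only illustration of the idea (this is a toy, not FastMCP's actual implementation, which builds on pydantic and covers far more types), you can see how much of a JSON Schema is recoverable from a signature alone:

```python
import inspect
import typing

# Toy mapping from Python annotations to JSON Schema type names.
PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}


def toy_schema(fn) -> dict:
    """Derive a minimal JSON Schema from a function's annotations.

    Parameters without defaults become required; unknown annotation
    types fall back to "string". Illustration only.
    """
    hints = typing.get_type_hints(fn)
    sig = inspect.signature(fn)
    props, required = {}, []
    for name, param in sig.parameters.items():
        props[name] = {"type": PY_TO_JSON.get(hints.get(name), "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {"type": "object", "properties": props, "required": required}


# A hypothetical tool signature to derive a schema from:
def example_tool(table: str, count: int = 1) -> str: ...
```

For example_tool, the derived schema marks table as a required string and count as an optional integer.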

What happens if the agent sends invalid parameters?

FastMCP validates the incoming parameters against the generated JSON Schema before calling your function. If the parameters are invalid, it returns a JSON-RPC error response with code -32602 (Invalid params) — your function never executes. This is one of the key safety benefits of the MCP protocol.
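For reference, the error response follows the standard JSON-RPC 2.0 shape; the id echoes the request and the message text will vary, so treat both values below as placeholders:

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "error": {
    "code": -32602,
    "message": "Invalid params"
  }
}
```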


#MCP #Python #FastMCP #Database #AIAgents #AgenticAI #LearnAI #AIEngineering

CallSphere Team

Expert insights on AI voice agents and customer communication automation.
