
Building an Agent with Mastra Framework: TypeScript-First Agent Development

Learn how to build AI agents using the Mastra framework. This guide covers project setup, agent definition with typed tools, persistent memory, workflow orchestration, and deployment strategies for TypeScript-first agent applications.

What Is Mastra

Mastra is an open-source TypeScript framework designed specifically for building AI agents, workflows, and RAG pipelines. Unlike general-purpose libraries that bolt agent capabilities onto existing chat abstractions, Mastra treats agents as first-class primitives with built-in support for tools, memory, structured outputs, and multi-step workflows.

The framework follows a "TypeScript-first" philosophy — every component is fully typed, schemas are defined with Zod, and the developer experience prioritizes IDE autocompletion and compile-time safety.

Project Setup

Scaffold a new Mastra project using the CLI:

npx create-mastra@latest my-agent-app
cd my-agent-app

The CLI prompts you for your preferred LLM provider and generates a project structure:

my-agent-app/
  src/
    mastra/
      agents/
        index.ts       # Agent definitions
      tools/
        index.ts       # Tool definitions
      index.ts         # Mastra instance
  .env
  package.json

Install dependencies and set your API key:

npm install
echo "OPENAI_API_KEY=sk-proj-your-key" > .env

Defining Tools

Tools give your agent capabilities beyond text generation. Each tool has a typed input schema, a description for the LLM, and an execute function:

// src/mastra/tools/index.ts
import { createTool } from "@mastra/core";
import { z } from "zod";

export const searchDocsTool = createTool({
  id: "search_docs",
  description: "Search the documentation for relevant articles",
  inputSchema: z.object({
    query: z.string().describe("The search query"),
    limit: z.number().default(5).describe("Max results to return"),
  }),
  outputSchema: z.object({
    results: z.array(
      z.object({
        title: z.string(),
        snippet: z.string(),
        url: z.string(),
      })
    ),
  }),
  execute: async ({ context }) => {
    const { query, limit } = context;
    // searchKnowledgeBase is a placeholder for your own retrieval
    // function (e.g. a vector search over indexed docs).
    const results = await searchKnowledgeBase(query, limit);
    return { results };
  },
});

export const createTicketTool = createTool({
  id: "create_support_ticket",
  description: "Create a support ticket for unresolved issues",
  inputSchema: z.object({
    title: z.string(),
    description: z.string(),
    priority: z.enum(["low", "medium", "high"]),
  }),
  execute: async ({ context }) => {
    // ticketService is a placeholder for your ticketing backend client.
    const ticket = await ticketService.create(context);
    return { ticketId: ticket.id, status: "created" };
  },
});

The inputSchema serves a dual purpose: it generates the JSON Schema sent to the LLM for function calling, and it validates the arguments at runtime before execute runs.
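To make that dual role concrete, here is a framework-free sketch of the two artifacts a tool schema stands in for. This is illustrative only; Mastra derives both from the Zod schema for you, and `validateSearchArgs` is a hypothetical helper, not part of the framework:

```typescript
// 1) The JSON Schema the LLM sees for function calling
//    (what Mastra generates from the Zod inputSchema):
const searchDocsJsonSchema = {
  type: "object",
  properties: {
    query: { type: "string", description: "The search query" },
    limit: { type: "number", description: "Max results to return", default: 5 },
  },
  required: ["query"],
};

// 2) Runtime validation before execute runs
//    (what the Zod schema enforces on the model's arguments):
function validateSearchArgs(args: unknown): { query: string; limit: number } {
  const a = args as Record<string, unknown>;
  if (typeof a?.query !== "string") {
    throw new Error("query must be a string");
  }
  const limit = typeof a.limit === "number" ? a.limit : 5; // apply default
  return { query: a.query, limit };
}

console.log(validateSearchArgs({ query: "reset password" }));
// → { query: 'reset password', limit: 5 }
```

If the model hallucinates a malformed argument payload, validation fails before your execute function ever runs, so tool bodies can trust their inputs.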

Defining an Agent

Agents combine a model, system instructions, and tools into a coherent unit:

// src/mastra/agents/index.ts
import { Agent } from "@mastra/core";
import { searchDocsTool, createTicketTool } from "../tools";

export const supportAgent = new Agent({
  name: "Support Agent",
  instructions: `You are a customer support agent for a SaaS platform.
Your primary task is to answer user questions by searching documentation.
If you cannot resolve an issue after searching, create a support ticket.
Always be concise and reference specific documentation links.`,
  model: {
    provider: "OPEN_AI",
    name: "gpt-4o",
    toolChoice: "auto",
  },
  tools: {
    search_docs: searchDocsTool,
    create_support_ticket: createTicketTool,
  },
});

Registering with the Mastra Instance

The Mastra instance is the central registry for all agents, tools, and workflows:

// src/mastra/index.ts
import { Mastra } from "@mastra/core";
import { supportAgent } from "./agents";

export const mastra = new Mastra({
  agents: { supportAgent },
});

Running the Agent

Execute the agent programmatically or through the built-in dev server:

import { mastra } from "./mastra";

async function main() {
  const agent = mastra.getAgent("supportAgent");

  const response = await agent.generate(
    "How do I reset my password? I've tried the forgot password link but it's not sending emails."
  );

  console.log(response.text);
}

main();

For development, Mastra provides a playground:

npx mastra dev

This launches a local web interface where you can interact with your agents, inspect tool calls, and debug conversation flows.

Adding Memory

Mastra supports persistent memory so agents remember context across conversations:

import { Agent } from "@mastra/core";
import { PostgresMemory } from "@mastra/memory";

const memory = new PostgresMemory({
  connectionString: process.env.DATABASE_URL!,
});

export const supportAgent = new Agent({
  name: "Support Agent",
  instructions: "...",
  model: { provider: "OPEN_AI", name: "gpt-4o" },
  tools: { /* ... */ },
  memory,
});

With memory enabled, calling agent.generate() with a threadId parameter automatically loads and saves conversation history.
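In practice you usually want a stable thread per user so repeat callers resume the same history. A minimal sketch — the `threadIdFor` helper is hypothetical, not part of Mastra, and the option names in the commented call follow Mastra's memory documentation:

```typescript
// Hypothetical helper: derive a stable thread id per user so repeated
// calls with the same user resume the same conversation history.
function threadIdFor(userId: string): string {
  return `support-${userId}`;
}

// Usage with the agent defined earlier (assumes supportAgent is in
// scope, memory is configured, and an API key is set):
// const res = await supportAgent.generate("Any update on my ticket?", {
//   threadId: threadIdFor("user_42"),
// });

console.log(threadIdFor("user_42")); // → support-user_42
```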

Workflows for Multi-Step Processes

For complex operations that go beyond a single agent loop, Mastra provides typed workflows:

import { Workflow, Step } from "@mastra/core";
import { z } from "zod";

const onboardingWorkflow = new Workflow({
  name: "user-onboarding",
  triggerSchema: z.object({
    userId: z.string(),
    plan: z.enum(["free", "pro", "enterprise"]),
  }),
});

onboardingWorkflow
  .step(new Step({
    id: "create-workspace",
    execute: async ({ context }) => {
      // The trigger payload is available on context.triggerData.
      const { userId } = context.triggerData;
      return { workspaceId: await createWorkspace(userId) };
    },
  }))
  .then(new Step({
    id: "send-welcome",
    execute: async ({ context }) => {
      // Read the previous step's output by its step id.
      const { workspaceId } = context.getStepResult("create-workspace");
      await sendWelcomeEmail(context.triggerData.userId, workspaceId);
      return { emailSent: true };
    },
  }))
  .commit(); // finalize the step graph before the workflow can run
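Triggering the workflow then means supplying a payload that satisfies the triggerSchema. A hedged sketch — `isPlan` is a hypothetical guard for illustration, and the commented run API follows Mastra's workflow documentation, so exact method names may vary by version:

```typescript
// The trigger payload must satisfy the triggerSchema defined above.
type Plan = "free" | "pro" | "enterprise";

// Hypothetical guard mirroring the z.enum in the triggerSchema:
function isPlan(value: string): value is Plan {
  return ["free", "pro", "enterprise"].includes(value);
}

const plan = "pro";
if (!isPlan(plan)) throw new Error(`unknown plan: ${plan}`);

const triggerData = { userId: "user_123", plan };

// Hedged usage — method names follow Mastra's workflow docs and may
// differ across versions:
// const { start } = onboardingWorkflow.createRun();
// const result = await start({ triggerData });
```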

FAQ

How does Mastra compare to LangChain.js?

Mastra is more opinionated and TypeScript-native. LangChain.js offers broader integrations and a larger community, but Mastra provides tighter type safety, a built-in dev playground, and a cleaner API surface. Mastra is a good choice if you want a batteries-included framework specifically for agent applications rather than a general-purpose LLM toolkit.

Can I use Mastra with providers other than OpenAI?

Yes. Mastra supports Anthropic, Google Gemini, and Groq out of the box. Specify the provider in the agent's model configuration. The tool calling interface remains identical regardless of the underlying model provider.
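Swapping providers is then a one-line change in the model block. A sketch in the same configuration style as the earlier OPEN_AI example — the provider identifier and model name here are illustrative, so check your Mastra version's provider list for the exact strings:

```typescript
// Same agent, different provider — only the model block changes.
// Provider string and model name are assumptions for illustration:
const anthropicModel = {
  provider: "ANTHROPIC",
  name: "claude-3-5-sonnet-20241022",
  toolChoice: "auto",
};

console.log(anthropicModel.provider); // → ANTHROPIC
```

Because the tool-calling interface is provider-agnostic, the tools and instructions defined earlier need no changes.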

Is Mastra suitable for production deployments?

Mastra is designed for production use. It supports deployment to Vercel, Cloudflare Workers, and any Node.js server. The framework includes built-in observability hooks, error handling, and structured logging for production monitoring.


#Mastra #TypeScript #AIAgents #Framework #ToolCalling #AgentMemory #AgenticAI #LearnAI #AIEngineering

CallSphere Team

Expert insights on AI voice agents and customer communication automation.