Model Context Protocol (MCP)

The standardized protocol for AI models to communicate with tools and services. Build powerful AI applications with OpenAI, Anthropic, and other LLM providers using a unified interface.

MCP Server

const client = await experimental_createMCPClient({
  transport: {
    type: 'sse',
    url: 'https://your-mcp-server.com/api',
  },
});

Tool Execution

const tools = await client.tools();
const response = await generateText({
  model: openai('gpt-4o'),
  tools,
  prompt: 'Query the data',
});

What is MCP?

The Model Context Protocol (MCP) is a standardized way for AI models to communicate with tools and services, enabling powerful AI applications with consistent tool integration across different LLM providers.

Standardized Communication
Consistent interface for AI models

With MCP, developers can create tools that work seamlessly with multiple AI models from providers like OpenAI and Anthropic, using a single protocol and JSON Schema tool definitions.

Tool Integration
Connect AI models to external services

MCP allows AI models to call external tools during generation, enabling them to access real-time data, perform calculations, and interact with external systems through a unified interface.

AI SDK Integration
Seamless integration with the AI SDK

The AI SDK provides a lightweight client that exposes a `tools` method for retrieving tools from an MCP server, making integration simple and efficient for any LLM-powered application.

Multiple Transport Options
Flexible communication methods

MCP supports various transport methods including Server-Sent Events (SSE) and stdio, allowing for different deployment scenarios and use cases in both web and local environments.
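As a rough sketch, the two transports differ mainly in how the client reaches the server: SSE takes a URL, while stdio spawns a local server process. The object shapes below are illustrative configuration fragments (the server URL, command, and file name are placeholders, and the stdio transport class name is an assumption about the AI SDK's API):

```javascript
// SSE transport: connect to a remote MCP server over HTTP
const sseConfig = {
  transport: {
    type: 'sse',
    url: 'https://your-mcp-server.com/api',
  },
};

// stdio transport: spawn a local MCP server as a child process.
// In the AI SDK this is typically done with a stdio transport class
// (e.g. Experimental_StdioMCPTransport) configured with:
const stdioConfig = {
  command: 'node',          // executable that runs the MCP server
  args: ['mcp-server.js'],  // arguments passed to that process
};
```

SSE suits hosted, multi-client deployments; stdio suits local development or tools that must run on the user's machine.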

Example MCP Tool Flow

MCP Server registers tools
AI SDK client connects to MCP server
AI model calls tools during generation
Tool results are returned to the model

How MCP Works

MCP provides a standardized way for AI models to discover, call, and receive results from external tools through a consistent protocol and schema definition.

  1. Create an MCP server that exposes tools via a standardized interface with JSON Schema definitions.

  2. Connect to the MCP server using the AI SDK's `experimental_createMCPClient` function.

  3. Retrieve the available tools using the `client.tools()` method for use with any LLM.

  4. Pass the tools to AI SDK functions like `generateText` or `streamText` for model integration.

  5. The AI model can now call these tools during generation to access external functionality through a unified interface.
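On the wire, the steps above map to JSON-RPC 2.0 messages defined by the MCP specification: tool discovery uses the `tools/list` method and tool invocation uses `tools/call`. The sketch below shows the shape of those two requests; the tool name and SQL query are hypothetical:

```javascript
// Tool discovery: the client asks the server which tools it exposes
const listRequest = {
  jsonrpc: '2.0',
  id: 1,
  method: 'tools/list',
};

// Tool invocation: when the model decides to call a tool, the client
// sends the tool name plus arguments matching its JSON Schema
const callRequest = {
  jsonrpc: '2.0',
  id: 2,
  method: 'tools/call',
  params: {
    name: 'database_query',                        // hypothetical tool name
    arguments: { query: 'SELECT * FROM orders' },  // hypothetical arguments
  },
};
```

The AI SDK client builds and sends these messages for you; seeing the raw shape mainly helps when debugging a server or implementing one from scratch.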

Key MCP Features

Discover the powerful capabilities that make MCP the ideal solution for AI tool integration

JSON Schema Tool Definitions
Standardized tool interfaces

MCP uses JSON Schema to define tool interfaces, ensuring consistent tool definitions across different AI models and providers.
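For a concrete sense of what such a definition looks like, here is a sketch of how a hypothetical `database_query` tool might be described (the MCP specification places the JSON Schema for a tool's parameters under an `inputSchema` field):

```javascript
// Hypothetical MCP tool definition with a JSON Schema interface
const databaseQueryTool = {
  name: 'database_query',
  description: 'Query the database for information',
  inputSchema: {
    type: 'object',
    properties: {
      query: {
        type: 'string',
        description: 'The SQL query to execute',
      },
    },
    required: ['query'],
  },
};
```

Because the schema is plain JSON, any provider that understands JSON Schema can validate and generate arguments for the tool without provider-specific glue code.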

Learn about tool schemas →
Multi-Provider Compatibility
Works with multiple LLM providers

MCP tools work seamlessly with OpenAI, Anthropic, and other LLM providers through a unified interface, reducing integration complexity.

View supported providers →
AI SDK Integration
Seamless AI SDK compatibility

MCP integrates perfectly with the AI SDK, allowing for easy tool discovery and usage in any AI-powered application.

Explore AI SDK integration →

Code Examples

Get started quickly with these code examples for integrating MCP in your applications

Creating an MCP Client
import { experimental_createMCPClient } from 'ai';

// Connect to an SSE MCP server
const client = await experimental_createMCPClient({
  transport: {
    type: 'sse',
    url: 'https://your-mcp-server.com/api',
  },
});

// Get available tools
const tools = await client.tools();

// Don't forget to close the client when done
await client.close();
Using MCP Tools with AI SDK
import { generateText, experimental_createMCPClient } from 'ai';
import { openai } from '@ai-sdk/openai';

// Connect to MCP server
const client = await experimental_createMCPClient({
  transport: {
    type: 'sse',
    url: 'https://your-mcp-server.com/api',
  },
});

try {
  // Get tools from the server
  const tools = await client.tools();

  // Use tools with AI model
  const response = await generateText({
    model: openai('gpt-4o'),
    tools,
    prompt: 'Query the database for recent orders',
  });

  console.log(response.text);
} finally {
  // Always close the client to release the connection
  await client.close();
}
Creating a Custom MCP Tool
import { tool } from 'ai';
import { z } from 'zod';

// Define a custom database query tool
const databaseQueryTool = tool({
  description: 'Query the database for information',
  parameters: z.object({
    query: z.string().describe('The SQL query to execute'),
  }),
  execute: async ({ query }) => {
    // Execute the query (simplified example)
    const results = await executeQuery(query);
    return { results };
  },
});

// The tool's name is its key when passed to the model:
// generateText({ ..., tools: { database_query: databaseQueryTool } })
Setting Up an MCP Server
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
import { z } from 'zod';

// Create an MCP server with the official MCP TypeScript SDK
const server = new McpServer({ name: 'example-server', version: '1.0.0' });

// Register a tool with a zod parameter schema
server.tool('database_query', { query: z.string() }, async ({ query }) => {
  const results = await executeQuery(query);
  return { content: [{ type: 'text', text: JSON.stringify(results) }] };
});

// Expose the server over stdio
await server.connect(new StdioServerTransport());

Popular Use Cases

Discover how developers are using MCP to build powerful AI applications

AI-Powered Database Assistants

Create natural language interfaces for database queries, allowing users to ask questions in plain English and get data from your database.

View database query example →
API-Connected Chatbots

Build chatbots that can access external APIs to provide real-time information like weather, stocks, or sports scores during conversations.

View API integration example →
Multi-Tool AI Assistants

Create powerful AI assistants that combine multiple tools to perform complex tasks like scheduling, data analysis, and content generation.

View multi-tool example →

Ready to Get Started?

Build powerful AI applications with MCP and the AI SDK today