core (ai/mcp): update experimental MCP client documentation for Streamable HTTP transport usage #5972

Merged · 24 commits · Apr 30, 2025

27 changes: 20 additions & 7 deletions content/cookbook/01-next/73-mcp-tools.mdx
@@ -12,10 +12,15 @@ The AI SDK supports Model Context Protocol (MCP) tools by offering a lightweight

Let's create a route handler for `/api/completion` that will generate text based on the input prompt and MCP tools that can be called at any time during a generation. The route will call the `streamText` function from the `ai` module and stream the generated text to the client.

To use the `StreamableHTTPClientTransport`, you will need to install the official TypeScript SDK for the Model Context Protocol:

<Snippet text="pnpm install @modelcontextprotocol/sdk" />

```ts filename="app/api/completion/route.ts"
import { experimental_createMCPClient, streamText } from 'ai';
import { Experimental_StdioMCPTransport } from 'ai/mcp-stdio';
import { openai } from '@ai-sdk/openai';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp';

export async function POST(req: Request) {
const { prompt }: { prompt: string } = await req.json();
@@ -38,17 +43,17 @@ export async function POST(req: Request) {
},
});

// Similarly to the stdio example, you can pass in your own custom transport as long as it implements the `MCPTransport` interface:
const transport = new MyCustomTransport({
// ...
});
const customTransportClient = await experimental_createMCPClient({
// Similarly to the stdio example, you can pass in your own custom transport as long as it implements the `MCPTransport` interface (e.g. `StreamableHTTPClientTransport`):
const transport = new StreamableHTTPClientTransport(
new URL('http://localhost:3000/mcp'),
);
const customClient = await experimental_createMCPClient({
transport,
});

const toolSetOne = await stdioClient.tools();
const toolSetTwo = await sseClient.tools();
const toolSetThree = await customTransportClient.tools();
const toolSetThree = await customClient.tools();
const tools = {
...toolSetOne,
...toolSetTwo,
@@ -63,7 +68,15 @@ export async function POST(req: Request) {
onFinish: async () => {
await stdioClient.close();
await sseClient.close();
await customTransportClient.close();
await customClient.close();
},
// Closing clients in onError is optional:
// - Closing: immediately frees resources and prevents hanging connections
// - Not closing: keeps the connection open for retries
onError: async error => {
await stdioClient.close();
await sseClient.close();
await customClient.close();
},
});

22 changes: 17 additions & 5 deletions content/docs/03-ai-sdk-core/15-tools-and-tool-calling.mdx
@@ -679,7 +679,7 @@ Create an MCP client using either:

- `SSE` (Server-Sent Events): Uses HTTP-based real-time communication, better suited for remote servers that need to send data over the network
- `stdio`: Uses standard input and output streams for communication, ideal for local tool servers running on the same machine (like CLI tools or local services)
- Custom transport: Bring your own transport by implementing the `MCPTransport` interface
- Custom transport: Bring your own transport by implementing the `MCPTransport` interface, ideal when using transports from MCP's official TypeScript SDK (e.g. `StreamableHTTPClientTransport`); see the sketch after this list
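
For quick orientation, a minimal sketch of the first two options, using the built-in SSE transport config and the `Experimental_StdioMCPTransport` from `ai/mcp-stdio`; the server URL and command are illustrative placeholders, and the custom-transport case is covered in the Custom Transport section below:

```typescript
import { experimental_createMCPClient as createMCPClient } from 'ai';
import { Experimental_StdioMCPTransport } from 'ai/mcp-stdio';

// SSE: connect to a remote MCP server over Server-Sent Events.
const sseClient = await createMCPClient({
  transport: {
    type: 'sse',
    url: 'https://my-server.example.com/sse', // placeholder URL
  },
});

// stdio: spawn a local MCP server and communicate over stdin/stdout.
const stdioClient = await createMCPClient({
  transport: new Experimental_StdioMCPTransport({
    command: 'node',
    args: ['src/stdio/dist/server.js'], // placeholder command and args
  }),
});

// Both clients expose their tools the same way:
const tools = {
  ...(await sseClient.tools()),
  ...(await stdioClient.tools()),
};
```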

#### SSE Transport

@@ -719,18 +719,30 @@ const mcpClient = await createMCPClient({

#### Custom Transport

You can also bring your own transport by implementing the `MCPTransport` interface:
You can also bring your own transport, as long as it implements the `MCPTransport` interface. Below is an example of using the new `StreamableHTTPClientTransport` from MCP's official TypeScript SDK:

```typescript
import { MCPTransport, createMCPClient } from 'ai';
import {
MCPTransport,
experimental_createMCPClient as createMCPClient,
} from 'ai';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp';

const url = new URL('http://localhost:3000/mcp');
const mcpClient = await createMCPClient({
transport: new MyCustomTransport({
// ...
transport: new StreamableHTTPClientTransport(url, {
sessionId: 'session_123',
}),
});
```

<Note>
The client returned by the `experimental_createMCPClient` function is a
lightweight client intended for use in tool conversion. It currently does not
support all features of the full MCP client, such as authorization, session
management, resumable streams, and receiving notifications.
</Note>

#### Closing the MCP Client

After initialization, you should close the MCP client based on your usage pattern:
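
A minimal sketch of the two patterns, assuming a Streamable HTTP MCP server at `http://localhost:3000/mcp` as in the examples in this PR (the model and prompts are illustrative): for one-off `generateText` calls, close the client once the call has settled; for `streamText`, close it in `onFinish` (and optionally in `onError`):

```typescript
import { openai } from '@ai-sdk/openai';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp';
import {
  experimental_createMCPClient as createMCPClient,
  generateText,
  streamText,
} from 'ai';

const url = new URL('http://localhost:3000/mcp');

// Pattern 1 (one-off generation): close the client once the call has settled.
const oneOffClient = await createMCPClient({
  transport: new StreamableHTTPClientTransport(url),
});

try {
  const { text } = await generateText({
    model: openai('gpt-4o-mini'),
    tools: await oneOffClient.tools(),
    prompt: 'Look up information about user with the ID foo_123',
  });
  console.log(text);
} finally {
  await oneOffClient.close();
}

// Pattern 2 (streaming): close the client when the stream finishes, and optionally on error.
const streamingClient = await createMCPClient({
  transport: new StreamableHTTPClientTransport(url),
});

const result = streamText({
  model: openai('gpt-4o-mini'),
  tools: await streamingClient.tools(),
  prompt: 'Look up information about user with the ID foo_123',
  onFinish: async () => {
    await streamingClient.close();
  },
  onError: async () => {
    await streamingClient.close();
  },
});

// Consuming the stream triggers onFinish above.
console.log(await result.text);
```
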
16 changes: 15 additions & 1 deletion examples/mcp/README.md
@@ -18,6 +18,20 @@ pnpm install
pnpm build
```

## Streamable HTTP Transport (Stateless)

Start server

```sh
pnpm http:server
```

Run example:

```sh
pnpm http:client
```

## Stdio Transport

Build
@@ -32,7 +46,7 @@ Run example:
pnpm stdio:client
```

## SSE Transport
## SSE Transport (Legacy)

Start server

6 changes: 4 additions & 2 deletions examples/mcp/package.json
@@ -7,13 +7,15 @@
"sse:client": "tsx src/sse/client.ts",
"stdio:build": "tsc src/stdio/server.ts --outDir src/stdio/dist --target es2023 --module nodenext",
"stdio:client": "tsx src/stdio/client.ts",
"http:server": "tsx src/http/server.ts",
"http:client": "tsx src/http/client.ts",
"custom-transport:build": "tsc src/custom-transport/server.ts --outDir src/custom-transport/dist --target es2023 --module nodenext",
"custom-transport:client": "tsx src/custom-transport/client.ts",
"type-check": "tsc --noEmit"
},
"dependencies": {
"@modelcontextprotocol/sdk": "1.10.2",
"@ai-sdk/openai": "1.3.20",
"@modelcontextprotocol/sdk": "^1.7.0",
"ai": "4.3.11",
"dotenv": "16.4.5",
"express": "5.0.1",
@@ -25,4 +27,4 @@
"tsx": "4.19.2",
"typescript": "5.6.3"
}
}
}
37 changes: 37 additions & 0 deletions examples/mcp/src/http/client.ts
@@ -0,0 +1,37 @@
import { openai } from '@ai-sdk/openai';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp';
import { experimental_createMCPClient, generateText } from 'ai';
import 'dotenv/config';

async function main() {
const transport = new StreamableHTTPClientTransport(
new URL('http://localhost:3000/mcp'),
);

const mcpClient = await experimental_createMCPClient({
transport,
});

try {
const tools = await mcpClient.tools();

const { text: answer } = await generateText({
model: openai('gpt-4o-mini'),
tools,
maxSteps: 10,
onStepFinish: async ({ toolResults }) => {
console.log(`STEP RESULTS: ${JSON.stringify(toolResults, null, 2)}`);
},
system: 'You are a helpful chatbot',
prompt: 'Look up information about user with the ID foo_123',
});

console.log(`FINAL ANSWER: ${answer}`);
} catch (error) {
console.error('Error:', error);
} finally {
await mcpClient.close();
}
}

main();
105 changes: 105 additions & 0 deletions examples/mcp/src/http/server.ts
@@ -0,0 +1,105 @@
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp.js';
import express from 'express';
import { z } from 'zod';

// Stateless Mode: see https://github.com/modelcontextprotocol/typescript-sdk/tree/main/src/examples#stateless-mode for more details

const app = express();
app.use(express.json());

app.post('/mcp', async (req, res) => {
const server = new McpServer({
name: 'example-http-server',
version: '1.0.0',
});

server.tool(
'get-user-info',
'Get user info',
{
userId: z.string(),
},
async ({ userId }) => {
return {
content: [
{
type: 'text',
text: `Here is information about user ${userId}:`,
},
{
type: 'text',
text: `Name: John Doe`,
},
{
type: 'text',
text: `Email: [email protected]`,
},
{
type: 'text',
text: `Age: 30`,
},
],
};
},
);

try {
const transport = new StreamableHTTPServerTransport({
sessionIdGenerator: undefined,
});
await server.connect(transport);
await transport.handleRequest(req, res, req.body);
res.on('close', () => {
transport.close();
server.close();
});
} catch (error) {
console.error('Error handling MCP request:', error);
if (!res.headersSent) {
res.status(500).json({
jsonrpc: '2.0',
error: {
code: -32603,
message: 'Internal server error',
},
id: null,
});
}
}
});

app.get('/mcp', async (_req, res) => {
console.log('Received GET MCP request');
res.writeHead(405).end(
JSON.stringify({
jsonrpc: '2.0',
error: {
code: -32000,
message: 'Method not allowed.',
},
id: null,
}),
);
});

app.delete('/mcp', async (_req, res) => {
console.log('Received DELETE MCP request');
res.writeHead(405).end(
JSON.stringify({
jsonrpc: '2.0',
error: {
code: -32000,
message: 'Method not allowed.',
},
id: null,
}),
);
});

app.listen(3000);

process.on('SIGINT', async () => {
console.log('Shutting down server...');
process.exit(0);
});
35 changes: 0 additions & 35 deletions examples/next-openai/app/api/mcp/route.ts

This file was deleted.

42 changes: 42 additions & 0 deletions examples/next-openai/app/mcp/chat/route.ts
@@ -0,0 +1,42 @@
import { openai } from '@ai-sdk/openai';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp';
import { experimental_createMCPClient, streamText } from 'ai';

export async function POST(req: Request) {
const url = new URL('http://localhost:3000/mcp/server');
const transport = new StreamableHTTPClientTransport(url);

const [client, { messages }] = await Promise.all([
experimental_createMCPClient({
transport,
}),
req.json(),
]);

try {
const tools = await client.tools();

const result = streamText({
model: openai('gpt-4o-mini'),
tools,
maxSteps: 5,
onStepFinish: async ({ toolResults }) => {
console.log(`STEP RESULTS: ${JSON.stringify(toolResults, null, 2)}`);
},
system: 'You are a helpful chatbot capable of basic arithmetic problems',
messages,
onFinish: async () => {
await client.close();
},
// Optional: enables immediate cleanup of resources, but the connection will not be retained for retries:
// onError: async error => {
// await client.close();
// },
});

return result.toDataStreamResponse();
} catch (error) {
console.error(error);
return Response.json({ error: 'Unexpected error' }, { status: 500 });
}
}