
feat: configure MCP servers in settings page #896


Open · wants to merge 2 commits into main

Conversation


@tobiasbueschel tobiasbueschel commented Apr 1, 2025

This is a super early (and very rough) implementation of MCP support for the ai-chatbot template. I only had 30 minutes and wanted to see how far I could get using the new Gemini 2.5 and Sonnet 3.7 models, so here is the result. This resolves the following issue: #575

The PR adds the following:

  • Allows users to add/remove/edit MCP servers in the settings page of the app
  • Servers can use either the stdio transport (when running locally) or sse (we might want to disable stdio altogether for a deployed version); see the sketch after this list
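
For reference, here is a rough sketch of how the two transport types could be wired up, assuming the AI SDK's experimental MCP client (`experimental_createMCPClient`). The commands and URLs are placeholders, not the actual settings schema used in this PR:

```ts
// Minimal sketch, assuming the AI SDK's experimental MCP client; the server
// command/URL values are placeholders standing in for what the user saves
// in the settings page.
import { experimental_createMCPClient } from 'ai';
import { Experimental_StdioMCPTransport } from 'ai/mcp-stdio';

export async function loadMcpTools() {
  // SSE server: works both locally and when deployed.
  const sseClient = await experimental_createMCPClient({
    transport: {
      type: 'sse',
      url: 'https://example.com/mcp/sse', // placeholder, taken from settings
    },
  });

  // stdio server: spawns a local process, so it is only viable when running locally.
  const stdioClient = await experimental_createMCPClient({
    transport: new Experimental_StdioMCPTransport({
      command: 'npx',
      args: ['-y', 'some-mcp-server'], // placeholder, taken from settings
    }),
  });

  // Each client exposes its server's tools in the AI SDK tool format,
  // so they can be merged into a single tool set.
  const tools = {
    ...(await sseClient.tools()),
    ...(await stdioClient.tools()),
  };

  return { tools, clients: [sseClient, stdioClient] };
}
```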

Here are some screenshots of what it looks like at the moment:

Chat page
(screenshot)

Server logs showing output of the MCP call
(screenshot)

Settings page
(screenshot)

Adding new MCPs
(screenshot)

Remaining tasks

  • clean up code
  • add test cases, especially ones ensuring secure handling of MCP servers
  • add documentation on how to use it
  • add an example MCP
  • do we want to add an MCP store or a link to one so that users can easily find available MCPs?
  • fetching of the MCP tool schemas is not working yet (see the sketch after this list for how the tools could be passed to the chat route)
  • UI/UX needs to be improved (e.g., better loading messages for MCP tool executions + show results in the chat)
  • Investigate whether the system prompt needs to be updated to better support MCPs
  • what else?
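
On the tool-wiring point above, the rough idea is that the merged MCP tools get passed to `streamText` in the chat route like any other AI SDK tools. A sketch, assuming the hypothetical `loadMcpTools` helper from the earlier snippet and a stand-in model provider (not necessarily the one this template is configured with):

```ts
import { openai } from '@ai-sdk/openai';
import { convertToCoreMessages, streamText, type Message } from 'ai';
import { loadMcpTools } from '@/lib/ai/mcp'; // hypothetical module holding the earlier sketch

export async function POST(request: Request) {
  const { messages }: { messages: Message[] } = await request.json();

  // Tools from every MCP server configured in the settings page.
  const { tools, clients } = await loadMcpTools();

  const result = streamText({
    model: openai('gpt-4o'), // stand-in; use whichever provider the template is configured with
    messages: convertToCoreMessages(messages),
    tools, // MCP tools are exposed to the model like any other AI SDK tool
    onFinish: async () => {
      // MCP clients hold open connections/processes, so close them once the response is done.
      await Promise.all(clients.map((client) => client.close()));
    },
  });

  return result.toDataStreamResponse();
}
```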

⚠️ I'm surprised how far this implementation got using "vibe coding", but of course this PR is far from ready to be merged! I'm not yet happy with the UI/UX and the code is mainly AI-generated and needs to be fully checked, optimized and so on.

Hoping to find some time to hand-craft this properly over the weekend, but raising this PR for early feedback and for anyone who would like to help contribute to this!

Cheers




vercel bot commented Apr 1, 2025

@tobiasbueschel is attempting to deploy a commit to the Vercel Team on Vercel.

A member of the Team first needs to authorize it.


socket-security bot commented Apr 1, 2025

New, updated, and removed dependencies detected. Learn more about Socket for GitHub ↗︎

| Package | New capabilities | Transitives | Size | Publisher |
| --- | --- | --- | --- | --- |
| npm/@ai-sdk/[email protected]1.2.7 | Transitive: environment, network | +5 | 4.77 MB | vercel-release-bot |
| npm/@ai-sdk/[email protected] | network, Transitive: environment | +8 | 5.45 MB | vercel-release-bot |
| npm/@ai-sdk/[email protected]1.2.9 | Transitive: environment, network | +6 | 5.01 MB | vercel-release-bot |
| npm/@codemirror/[email protected]6.36.5 | None | +2 | 1.2 MB | adrianheine, marijn |
| npm/@playwright/[email protected]1.51.1 | None | +2 | 0 B | |
| npm/@radix-ui/[email protected] | None | +12 | 299 kB | chancestrickland |
| npm/@radix-ui/[email protected] | None | +9 | 141 kB | andy-hook, benoitgrelard, chancestrickland, ...3 more |
| npm/@types/[email protected]22.14.0 | None | +1 | 2.42 MB | types |
| npm/@types/[email protected]1.1.5 | None | 0 | 3.88 kB | types |
| npm/@types/[email protected]18.3.6 | None | 0 | 38.2 kB | types |
| npm/@types/[email protected]18.3.20 | None | +2 | 1.69 MB | types |
| npm/[email protected]3.10.0 | Transitive: environment, filesystem, unsafe | +12 | 496 kB | alexgorbatchev, bradzacher, jounqin |
| npm/[email protected], 5.1.35.1.5 | None | 0 | 12.1 kB | ai |
| npm/[email protected]1.5.0 | None | +1 | 364 kB | marijn |
| npm/[email protected]1.13.2 | Transitive: environment, filesystem | +10 | 1.75 MB | marijn |
| npm/[email protected]1.25.0 | None | 0 | 524 kB | marijn |
| npm/[email protected]1.2.4 | None | 0 | 30 kB | marijn |
| npm/[email protected]1.39.1 | None | +1 | 1.19 MB | marijn |
| npm/[email protected]5.8.3 | None | 0 | 22.9 MB | typescript-bot |

View full report↗︎


yamz8 commented Apr 1, 2025

let's gooo!

@pmarquees

This is sick, we need this!

(merge commit resolving conflicts in lib/db/queries.ts and pnpm-lock.yaml)
@cgoinglove

While I do hope this PR gets accepted, I believe it may be difficult due to some practical limitations.

Why it might not be a good fit for the template:
• Vercel uses a serverless architecture, which doesn’t support long-running processes like stdio-based MCP servers (see the sketch after this list).
• Many MCP tools require access to the file system or a persistent environment, which conflicts with Vercel’s stateless and cold-start behavior.
• The template is designed specifically for Vercel-first deployment, and integrating MCP introduces structural complexity and potential compatibility issues.
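
For illustration, if MCP support did land in the template, the settings layer would likely need to filter out stdio entries when deployed. A rough sketch of such a guard (the config shape here is made up, not the one used in this PR):

```ts
// Hypothetical config shape for a saved MCP server entry (not the PR's actual schema).
type McpServerConfig =
  | { name: string; transport: 'sse'; url: string }
  | { name: string; transport: 'stdio'; command: string; args: string[] };

// Vercel sets VERCEL=1 in deployed environments, so stdio servers (which need
// a long-running local process) can be filtered out there.
export function supportedServers(servers: McpServerConfig[]): McpServerConfig[] {
  const isDeployed = process.env.VERCEL === '1';
  return servers.filter((server) => server.transport === 'sse' || !isDeployed);
}
```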

That’s why I decided to build a dedicated MCP client project based on what I learned here.

👉 MCP Client Chatbot
• Built with Next.js and Vercel AI SDK
• File-based MCP configuration
• Supports multiple providers like OpenAI, Anthropic, Google, Ollama, etc.

Hope it’s helpful—and if you have feedback or ideas, I’d love to hear them. Thanks!

https://github.com/cgoinglove/mcp-client-chatbot
