@haixuanTao

This is a quick rewrite of our existing OpenAI realtime script as an MCP server that can be configured with:

{
  "servers": {
    "reachy": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/reachy_mini_conversation_demo",
        "run",
        "reachy-mini-mcp"
      ]
    }
  }
}
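
(For reference: the top-level "servers" key is the format used by VS Code's .vscode/mcp.json; some other MCP clients, e.g. Claude Desktop, expect "mcpServers" instead.)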

And then let the LLM call Reachy Mini through MCP tool calls. :)
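
For anyone curious what the server side boils down to, it is essentially a handful of decorated tool functions. Here is a minimal sketch using the official MCP Python SDK (FastMCP); the tool names, signatures, and robot calls are hypothetical placeholders, not necessarily what this PR ships:

# Minimal sketch of an MCP server exposing a couple of robot tools.
# NOTE: the tool names/signatures below are illustrative placeholders,
# not the actual tools in this PR; only the FastMCP API itself is real.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("reachy-mini")

@mcp.tool()
def look_at(x: float, y: float, z: float) -> str:
    """Point Reachy Mini's head at a 3D target (meters, robot frame)."""
    ...  # hypothetical: forward the target to the robot's motion API here
    return f"looking at ({x}, {y}, {z})"

@mcp.tool()
def play_emotion(name: str) -> str:
    """Play a named emotion animation, e.g. 'happy' or 'curious'."""
    ...  # hypothetical: trigger the animation on the robot here
    return f"playing {name}"

if __name__ == "__main__":
    # stdio transport, matching the command/args launch in the config above
    mcp.run()

Each @mcp.tool() function is advertised to the client with its docstring and type hints, so the LLM can pick and invoke tools on its own.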

I have reduced the number of tools to keep latency low with local LLMs, but we can add more in the future!
