⚠️ Reachy Mini is still in beta. Expect bugs; some of them won't be fixed right away if they are not a priority.
Reachy Mini is an expressive, open-source robot designed for human-robot interaction, creative coding, and AI experimentation. We made it to be affordable, easy to use, hackable and cute, so that you can focus on building cool AI applications!
Reachy Mini's hardware comes in two flavors:
- Reachy Mini lite: the robot is connected directly to your computer via USB, and the code that controls it (the daemon) runs on your computer.
- Reachy Mini wireless: a Raspberry Pi is embedded in the robot, and the code that controls it (the daemon) runs on the Raspberry Pi. You can connect to it over Wi-Fi from your computer. (TODO: add link to section on how to set it up)
There is also a simulated version of Reachy Mini in MuJoCo that you can use to prototype your applications before deploying them on the real robot. It behaves like the lite version where the daemon runs on your computer.
Follow our step-by-step Assembly Guide. Most builders finish in about 3 hours; our current speed record is 43 minutes. The guide walks you through every step with clear visuals so you can assemble Reachy Mini confidently from start to finish. Enjoy the build!
This repository provides everything you need to control Reachy Mini, both in simulation and on the real robot. It consists of two main parts:
- The 😈 Daemon 😈: a background service that manages communication with the robot's motors and sensors, or with the simulation environment. It must be running before you can control the robot, and it can run either for the simulation (MuJoCo) or for the real robot.
- 🐍 SDK & 🕸️ API: to control the robot's main features (head, antennas, camera, speakers, microphone, etc.) and connect them to your AI experiments. Depending on your preferences and needs, there is a Python SDK and an HTTP REST API.

Using the Python SDK, making your robot move requires only a few lines of code, as illustrated in the example below:
```python
from reachy_mini import ReachyMini
from reachy_mini.utils import create_head_pose

with ReachyMini() as reachy_mini:
    # Move the head up (10mm on z-axis) and roll it 15 degrees
    pose = create_head_pose(z=10, roll=15, degrees=True, mm=True)
    reachy_mini.goto_target(head=pose, duration=2.0)

    # Reset to default pose
    pose = create_head_pose()
    reachy_mini.goto_target(head=pose, duration=2.0)
```

Using the REST API, you can read the current state of the robot:

```bash
curl 'http://localhost:8000/api/state/full'
```

Those two examples assume that the daemon is already running locally (either in simulation or on the real robot).
As mentioned above, before being able to use the robot, you need to run the daemon that will handle the communication with the motors.
We support and test on Linux and macOS. It also works on Windows, but is less tested at the moment. Don't hesitate to open an issue if you encounter any problems.
The daemon is built in Python, so you need to have Python installed on your computer (versions from 3.10 to 3.13 are supported). We recommend using a virtual environment to avoid dependency conflicts with your other Python projects.
You can install Reachy Mini from the source code or from PyPI.
First, make sure git-lfs is installed on your system:
- On Linux: `sudo apt install git-lfs`
- On macOS: `brew install git-lfs`
- On Windows: follow the instructions here
From PyPI, you can install the package with:

```bash
pip install reachy-mini
```

From the source code, you can install the package with:

```bash
git clone https://github.com/pollen-robotics/reachy_mini
pip install -e ./reachy_mini
```

The same package provides both the daemon and the Python SDK.
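To quickly check that the installation succeeded, you can query the installed package version from Python. This is a minimal sketch using only the standard library; the only assumption is the PyPI package name `reachy-mini` shown above:

```python
from importlib.metadata import PackageNotFoundError, version

# Print the installed version of the reachy-mini package,
# or report that it is missing from this environment.
try:
    print("reachy-mini version:", version("reachy-mini"))
except PackageNotFoundError:
    print("reachy-mini is not installed in this environment")
```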
Before you can use the robot, you need to run the daemon, which handles the communication with the motors. The daemon can run either in simulation (MuJoCo) or on the real robot. You can launch it with:
```bash
reachy-mini-daemon
```

or run it via the Python module:

```bash
python -m reachy_mini.daemon.app.main
```

Additional argument for both simulation and real robot:

- `--localhost-only` (default behavior): the server will only accept connections from localhost.
- `--no-localhost-only`: if set, the server will accept connections from any machine on the local network.
In simulation (MuJoCo), you first have to install the optional dependency `mujoco`:

```bash
pip install reachy-mini[mujoco]
```

Then run the daemon with the `--sim` argument:

```bash
reachy-mini-daemon --sim
```

Additional arguments:

- `--scene <empty|minimal>` (default `empty`): choose between a basic empty scene, or a scene with a table and some objects.

Note: on macOS, in order to run MuJoCo you need to use `mjpython` (see here), so you should run the daemon with:

```bash
mjpython -m reachy_mini.daemon.app.main --sim
```

On the real robot, the daemon should automatically detect the robot's serial port. If it does not, you can specify it manually with the `-p` option:
```bash
reachy-mini-daemon -p <serial_port>
```
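If you are not sure which serial port the robot is on, you can list candidate ports from Python with pyserial. This is a hedged sketch: whether pyserial is already pulled in by `reachy-mini` is an assumption, so you may need to `pip install pyserial` first:

```python
# List available serial ports to help identify the robot's port.
# Assumes pyserial is available (pip install pyserial).
from serial.tools import list_ports

for port in list_ports.comports():
    print(port.device, "-", port.description)
```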
For more information about the daemon and its options, you can run:

```bash
reachy-mini-daemon --help
```

You can access a simple dashboard to monitor the robot's status at http://localhost:8000/ when the daemon is running. This lets you turn your robot on and off, run some basic movements, and browse spaces for Reachy Mini!
A conversational demo for the Reachy Mini robot, combining realtime LLM APIs, vision pipelines, and choreographed motion libraries: reachy_mini_conversation_demo.
You can find more awesome apps and demos for Reachy Mini on Hugging Face spaces!
The API is designed to be simple and intuitive. You can control the robot's features such as the head, antennas, camera, speakers, and microphone. For instance, to move the head of the robot, you can use the goto_target method as shown in the example below:
```python
from reachy_mini import ReachyMini
from reachy_mini.utils import create_head_pose

with ReachyMini() as reachy_mini:
    # Move the head up (10mm on z-axis) and roll it 15 degrees
    pose = create_head_pose(z=10, roll=15, degrees=True, mm=True)
    reachy_mini.goto_target(head=pose, duration=2.0)

    # Reset to default pose
    pose = create_head_pose()
    reachy_mini.goto_target(head=pose, duration=2.0)
```

For a full description of the SDK, please refer to the Python SDK documentation.
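Building on the example above, here is a short sketch that chains `goto_target` calls into a simple side-to-side head wobble. It uses only the two functions already shown; the roll angles and durations are arbitrary values chosen for illustration:

```python
from reachy_mini import ReachyMini
from reachy_mini.utils import create_head_pose

with ReachyMini() as reachy_mini:
    # Roll the head left and right a couple of times, then re-center.
    for roll_deg in (15, -15, 15, -15, 0):
        pose = create_head_pose(roll=roll_deg, degrees=True)
        reachy_mini.goto_target(head=pose, duration=0.5)
```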
The daemon also provides a REST API (built with FastAPI) that you can use to control the robot and get its state. The API is accessible over HTTP and WebSocket.
By default, the API server runs on http://localhost:8000. The API is documented using OpenAPI, and you can access the documentation at http://localhost:8000/docs when the daemon is running.
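The same state endpoint used in the curl example above can also be called from Python. This is a minimal sketch using the `requests` library, which is not shipped with `reachy-mini` (install it with `pip install requests` if needed), and it assumes the daemon is running locally on the default port:

```python
import requests

# Read the robot's full state from the daemon's HTTP API
# (daemon assumed to be running on localhost:8000).
response = requests.get("http://localhost:8000/api/state/full", timeout=5.0)
response.raise_for_status()
print(response.json())
```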
More information about the API can be found in the HTTP API documentation.
This project is actively developed and maintained by the Pollen Robotics team and the Hugging Face team.
We welcome contributions from the community! If you want to report a bug or request a feature, please open an issue on GitHub. If you want to contribute code, please fork the repository and submit a pull request.
TODO
Development tools are available in the optional dependencies:

```bash
pip install -e .[dev]
pre-commit install
```

Your files will be checked before any commit. Checks may also be manually run with:

```bash
pre-commit run --all-files
```

Checks are performed by Ruff. You may want to configure your IDE to support it.
This project is licensed under the Apache 2.0 License - see the LICENSE file for details.
The robot design files are licensed under the TODO license.


