Control two robotic arms using your hands! This project combines MediaPipe hand tracking with MoveIt Servo for smooth, intuitive dual-arm Panda robot control.
Dual_Arm_Demo.mp4
The video above shows dual-arm hand tracking and robot control in action. MediaPipe detects hand poses, which are mapped to robot arm movements using MoveIt Servo. Both arms and grippers respond to natural hand gestures.
- Real-time hand detection (MediaPipe)
- Simultaneous dual-arm control in 3D space
- Gesture-based gripper control
- Smooth, responsive motion (MoveIt Servo)
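MediaPipe reports hand landmarks in normalized image coordinates (0–1 across the frame), so turning a detected hand into a 3D arm target requires mapping those coordinates into the robot's workspace. Below is a minimal sketch of one way to do this; the workspace bounds, axis conventions, and the `map_hand_to_workspace` helper are illustrative assumptions, not the package's actual values:

```python
# Map a normalized MediaPipe hand position (u, v in [0, 1]) to a target
# point in the robot's workspace. The bounds below are illustrative
# placeholders, not the package's real limits.
WORKSPACE = {
    "x": (0.3, 0.7),   # forward reach in meters (assumed)
    "y": (-0.4, 0.4),  # lateral range in meters (assumed)
    "z": (0.2, 0.8),   # height range in meters (assumed)
}

def map_hand_to_workspace(u: float, v: float, depth: float = 0.5) -> tuple:
    """Linearly map image coords (u right, v down) into workspace coords."""
    def lerp(lo: float, hi: float, t: float) -> float:
        # Clamp t to [0, 1] so off-screen hands never leave the workspace.
        return lo + (hi - lo) * max(0.0, min(1.0, t))
    x = lerp(*WORKSPACE["x"], depth)    # depth proxy drives the forward axis
    y = lerp(*WORKSPACE["y"], 1.0 - u)  # mirror image x onto robot y
    z = lerp(*WORKSPACE["z"], 1.0 - v)  # image y grows downward
    return (x, y, z)
```

A hand centered in the frame then maps to the middle of the assumed workspace, and the clamping keeps targets inside the bounds even when tracking briefly glitches.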
You can run this project either natively on Ubuntu or inside a Docker container. Choose the method that best fits your needs:
**Native (Ubuntu)**

1. Install MoveIt 2, controllers, and Panda description dependencies:

   ```bash
   sudo apt install \
     ros-humble-moveit \
     ros-humble-moveit-servo \
     ros-humble-moveit-visual-tools \
     ros-humble-ros2-control \
     ros-humble-ros2-controllers \
     ros-humble-moveit-resources-panda-description
   ```

2. Create and source a ROS 2 workspace (if you don't have one):

   ```bash
   mkdir -p ~/ros2_ws/src
   cd ~/ros2_ws
   source /opt/ros/humble/setup.bash
   ```

3. Clone this repo into the `src` folder and build:

   ```bash
   cd ~/ros2_ws/src
   git clone https://github.com/Nabil-Miri/mediapipe_dual_arm_control.git
   cd ..
   colcon build
   ```

4. Install Python dependencies:

   ```bash
   pip3 install -r src/mediapipe_dual_arm_control/requirements.txt
   ```

5. Launch the system:

   ```bash
   # Terminal 1: Source ROS and launch the robot system
   source /opt/ros/humble/setup.bash
   source install/setup.bash
   ros2 launch mediapipe_dual_arm_control dual_arm_teleop.launch.py

   # Terminal 2: Source ROS and start hand tracking
   source /opt/ros/humble/setup.bash
   source install/setup.bash
   python3 src/mediapipe_dual_arm_control/scripts/hand_pose_publisher_node.py
   ```
**Docker**

1. Build the Docker image (from the project root):

   ```bash
   docker build -f docker/Dockerfile -t dual-arm-teleop:humble .
   ```

2. Run the Docker container:

   ```bash
   xhost +local:root
   docker run -it --rm \
     --name dual-arm-teleop-container \
     --net=host \
     --env="DISPLAY" \
     --env="QT_X11_NO_MITSHM=1" \
     --volume="/tmp/.X11-unix:/tmp/.X11-unix:rw" \
     --device=/dev/video0 \
     dual-arm-teleop:humble
   ```

   Adjust `--device=/dev/video0` if your camera is on a different device.

3. Open a second terminal in the running container:

   ```bash
   docker exec -it dual-arm-teleop-container bash
   ```

4. Launch the system inside the container:

   ```bash
   # Terminal 1: Launch the robot system
   source /opt/ros/humble/setup.bash
   source install/setup.bash
   ros2 launch mediapipe_dual_arm_control dual_arm_teleop.launch.py

   # Terminal 2: Start hand tracking
   source /opt/ros/humble/setup.bash
   source install/setup.bash
   python3 src/mediapipe_dual_arm_control/scripts/hand_pose_publisher_node.py
   ```
- Python node: Tracks hands, counts fingers, publishes target poses and gripper commands for each arm
- C++ nodes: Subscribe to hand poses, validate workspace, control each arm and gripper independently using MoveIt Servo
- ROS 2 topics: `/left_hand_target_pose`, `/right_hand_target_pose`, `/left_hand_gripper_control`, `/right_hand_gripper_control`
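The workspace validation performed by the C++ nodes can be sketched in plain Python. The real nodes operate on `geometry_msgs` poses inside MoveIt Servo; the bounds and helper names below are assumptions for illustration only:

```python
# Reject or clamp target poses that fall outside the arm's safe workspace.
# Bounds are illustrative placeholders, not the package's real limits.
BOUNDS = {"x": (0.25, 0.75), "y": (-0.45, 0.45), "z": (0.15, 0.85)}

def in_workspace(pos: tuple) -> bool:
    """True if (x, y, z) lies inside all axis bounds."""
    return all(lo <= p <= hi for p, (lo, hi) in zip(pos, BOUNDS.values()))

def clamp_to_workspace(pos: tuple) -> tuple:
    """Clamp each coordinate into its bound so Servo never gets an unsafe goal."""
    return tuple(min(max(p, lo), hi) for p, (lo, hi) in zip(pos, BOUNDS.values()))
```

Clamping rather than rejecting keeps the arms responsive when a hand drifts to the edge of the camera frame, at the cost of the target "sticking" to the workspace boundary.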
- ROS 2 Humble (Ubuntu 22.04)
- Python 3.8+
- Camera
mediapipe_dual_arm_control/ # Main ROS2 package
├── scripts/
│ └── mediapipe_dual_arm_coordinator.py # Python hand tracking node
├── src/
│ └── dual_arm_servo_node.cpp # C++ dual-arm servo node
├── launch/
│ └── dual_arm_teleop.launch.py # Main launch file
├── config/
│ ├── robot/ # Robot URDF, SRDF, controllers, kinematics, etc.
│ ├── planning/ # Planning configs
│ ├── pose_tracking/ # Pose tracking configs
│ ├── rviz/ # RViz config
│ └── hand/ # Hand xacro
├── requirements.txt # Python dependencies
├── docker/
│ └── Dockerfile # Docker setup for reproducibility
└── ... # Other package files
- Camera not detected: Check permissions/connections
- Robot not moving: Verify controllers are loaded
- Build fails: Try `colcon build --parallel-workers 1` and ensure enough RAM is available
- ROS 2 not sourced: Run `source /opt/ros/humble/setup.bash`
| Topic | Description |
|---|---|
| `/left_hand_target_pose` | Target position for the left arm |
| `/right_hand_target_pose` | Target position for the right arm |
| `/left_hand_gripper_control` | Open/close the left gripper |
| `/right_hand_gripper_control` | Open/close the right gripper |
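The gripper topics carry open/close commands derived from the finger count. A rough sketch of that mapping; the threshold and the string command encoding are assumptions, not the package's actual message format:

```python
# Turn a finger count into a gripper command. The open/close threshold
# and the string commands are illustrative assumptions.
def gripper_command(extended_fingers: int, threshold: int = 3) -> str:
    """An open hand (many extended fingers) opens the gripper; a fist closes it."""
    return "open" if extended_fingers >= threshold else "close"
```

A threshold in the middle of the 0–5 range leaves slack for MediaPipe occasionally miscounting a finger, so the gripper does not chatter between states.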
- Dual-Arm Robot: Use a purpose-built dual-arm robot such as the FR3 DUO rather than two separate manipulators.
- Gesture Start/Stop: Use hand gestures to start or stop the system.
- Better Depth: Add monocular depth estimation for more accurate hand positions.
- Smoother Motion: Improve filtering for steadier robot pose targets.
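On the "Smoother Motion" point, a simple exponential moving average over incoming targets is one way to reduce hand-tracking jitter before it reaches MoveIt Servo. A minimal sketch; the `PoseSmoother` class and the `alpha` value are assumptions to tune, not part of the package:

```python
class PoseSmoother:
    """Exponentially smooth noisy (x, y, z) targets before sending them to Servo."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha   # 0 < alpha <= 1; smaller = smoother but laggier
        self._state = None   # last smoothed position

    def update(self, pos: tuple) -> tuple:
        if self._state is None:
            self._state = pos  # first sample passes through unchanged
        else:
            self._state = tuple(
                self.alpha * new + (1.0 - self.alpha) * old
                for new, old in zip(pos, self._state)
            )
        return self._state
```

The trade-off is latency: lowering `alpha` steadies the arms but makes them follow the hand more slowly, so it is worth tuning against the camera frame rate.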
Feel free to fork, modify, and submit PRs! Suggestions and improvements are welcome.