This repository contains the software to control the simulated and real RePAIR robot.
- catkin
- xacro
- Xbot2
- Softhand-plugin
- roboticsgroup_upatras_gazebo_plugins
- gazebo2rviz
- pysdf
- [realsense](https://github.com/issaiass/realsense2_description)
- moveit
- install ddynamic_reconfigure
sudo apt-get install ros-noetic-ddynamic-reconfigure
- realsense2_camera is available as a Debian package in the ROS distribution. It can be installed by typing:
sudo apt-get install ros-$ROS_DISTRO-realsense2-camera
- realsense_gazebo_plugin
- First, clone the repository and its submodules:
mkdir -p ~/repair_robot_ws/src && cd ~/repair_robot_ws/src
git clone --recurse-submodules -j8 https://github.com/RePAIRProject/repair_ros_robot.git
git clone https://github.com/RePAIRProject/repair_motion_controller
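If the repository was cloned without the --recurse-submodules flag, the submodules can still be fetched afterwards:
cd ~/repair_robot_ws/src/repair_ros_robot
git submodule update --init --recursive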
- Download the fresco 3D models from Nextcloud to src/repair_ros_robot/repair_urdf/sdf. As of now, some frescos might need their own URDF, which is just a copy-paste with the names adjusted.
- Download the fresco recognition models and the placing sequence from Nextcloud to src/repair_ros_robot/repair_interface/sand_detection_models.
We provide Docker-based installation instructions compatible with Visual Studio Code's Dev Containers. However, these steps mostly also apply to standard Docker usage.
Prerequisites: Complete the general installation above before proceeding.
In a terminal, run: xhost +local:docker
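This grants local Docker containers access to your X server; you can revoke the permission later with:
xhost -local:docker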
To enable visual outputs via GPU, install the following (if not already present):
sudo apt-get install -y nvidia-container-toolkit
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
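To verify that containers can actually see the GPU, a quick smoke test (taken from NVIDIA's container toolkit documentation; the ubuntu image is just an example) is:
sudo docker run --rm --runtime=nvidia --gpus all ubuntu nvidia-smi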
Our work is based on this ROS1 Docker Container. Either create a new .devcontainer folder or copy the one from the aforementioned ROS1 Docker setup into ~/repair_robot_ws. Then, replace or add the following example files (provided in this repo) inside .devcontainer:
example_dockerfile.txt → Dockerfile
example_postcreate.sh → postCreate.sh
example_devcontainer.json → devcontainer.json
requirements.txt
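As a sketch, assuming the example files sit at the top level of the cloned repair_ros_robot repository (adjust the source paths if they live elsewhere), the copying and renaming could look like:
cd ~/repair_robot_ws/.devcontainer
cp ~/repair_robot_ws/src/repair_ros_robot/example_dockerfile.txt Dockerfile
cp ~/repair_robot_ws/src/repair_ros_robot/example_postcreate.sh postCreate.sh
cp ~/repair_robot_ws/src/repair_ros_robot/example_devcontainer.json devcontainer.json
cp ~/repair_robot_ws/src/repair_ros_robot/requirements.txt .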
- Install the Dev Containers extension in VS Code.
- Press Ctrl + Shift + P and select Dev Containers: Open Folder in Container.
- Choose the ~/repair_robot_ws folder.
- VS Code will now build the container. To rebuild it later, repeat the same command and select Dev Containers: Rebuild Container.
Once inside the container, build your workspace:
cd /home/ws
catkin build
Note: The XBot installation is handled automatically by the postCreate.sh script. However, add the following commands to your .bashrc so that the ROS workspace and XBot2 are sourced in every terminal, allowing you to run ROS and XBot2 commands later:
echo "source /home/ws/devel/setup.bash" >> ~/.bashrc
echo ". /opt/xbot/setup.sh" >> ~/.bashrc
source ~/.bashrc
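As a quick sanity check, both environments should now be visible in a fresh terminal (assuming XBot2 was installed to /opt/xbot by the postCreate.sh script):
rosversion -d        # should print "noetic"
which xbot2-core     # should resolve to the XBot2 installation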
To set the XBot2 configuration use:
set_xbot2_config /home/ws/src/repair_ros_robot/repair_cntrl/config/repair_basic.yaml
- Try closing the forwarded ports in VS Code and/or rebuilding your container without cache.
If your Docker terminal lacks color, fix it by adding the following to /root/.bashrc:
sudo nano /root/.bashrc
# Add the following line
export PS1="\[\e[1;32m\]\u@\h:\[\e[1;34m\]\w\[\e[0m\]\$ "
If you prefer to run the project natively without Docker, follow these steps after completing the General Installation.
cd ~/repair_robot_ws/src/repair_ros_robot
pip3 install -r requirements.txt
Ensure you have sourced your ROS environment in every terminal:
source /opt/ros/noetic/setup.bash
cd ~/repair_robot_ws
catkin build
If you encounter errors like:
CMake Error at /opt/ros/noetic/share/catkin/cmake/catkinConfig.cmake:83 (find_package):
Could not find a package configuration file provided by "package_name" ...
Check the solutions below:
sudo apt-get install ros-noetic-ddynamic-reconfigure
sudo apt-get install ros-noetic-moveit
sudo apt-get install ros-noetic-rviz-visual-tools
sudo apt-get install ros-noetic-moveit-visual-tools
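Alternatively, rosdep can resolve most missing build dependencies in one pass (assuming rosdep is installed and initialized on your machine):
cd ~/repair_robot_ws
rosdep install --from-paths src --ignore-src -r -y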
After a successful build:
cd ~/repair_robot_ws
source devel/setup.bash
Follow the official XBot2 installation guide, or run:
# ROS setup
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
sudo apt install curl
curl -s https://raw.githubusercontent.com/ros/rosdistro/master/ros.asc | sudo apt-key add -
sudo apt update && sudo apt install -y ros-noetic-ros-base libgazebo11-dev
echo ". /opt/ros/noetic/setup.bash" >> ~/.bashrc
source ~/.bashrc
# Additional ROS and GUI tools
sudo apt install -y \
ros-$ROS_DISTRO-urdf ros-$ROS_DISTRO-kdl-parser \
ros-$ROS_DISTRO-eigen-conversions ros-$ROS_DISTRO-robot-state-publisher ros-$ROS_DISTRO-moveit-core \
ros-$ROS_DISTRO-rviz ros-$ROS_DISTRO-interactive-markers ros-$ROS_DISTRO-tf-conversions ros-$ROS_DISTRO-tf2-eigen \
qttools5-dev libqt5charts5-dev qtdeclarative5-dev
# XBot2 repository setup
sudo sh -c 'echo "deb http://xbot.cloud/xbot2/ubuntu/$(lsb_release -sc) /" > /etc/apt/sources.list.d/xbot-latest.list'
wget -q -O - http://xbot.cloud/xbot2/ubuntu/KEY.gpg | sudo apt-key add -
sudo apt update
sudo apt install xbot2_desktop_full
echo ". /opt/xbot/setup.sh" >> ~/.bashrc
To set the XBot2 configuration:
set_xbot2_config ~/repair_robot_ws/src/repair_ros_robot/repair_cntrl/config/repair_basic.yaml
More Information
For additional details on the interface, refer to the repair_interface documentation.
roslaunch repair_gazebo repair_gazebo.launch
- You can safely ignore the following error messages; they pertain to position controllers and missing p gains, which are only relevant for effort controllers:
[ERROR] No p gain specified for pid. Namespace: /gazebo_ros_control/pid_gains/x_joint
- URDF warnings like the following do not impact functionality. You may not see hand animations in RViz, but simulations work fine:
[ WARN] Link 'right_hand_v1_2_research_thumb_proximal_link' is not known to URDF.
To run the overall fresco manipulation pipeline as developed on the real robot, follow the steps below. Run each command in its own terminal split:
roscore
xbot2-core --hw dummy
roslaunch repair_motion_controller bringup_motion_controller.launch
xbot2-gui
In the xbot2-gui, start homing and ros_control; the simulation will not run properly otherwise.
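If you prefer the command line over the GUI, the plugins can usually also be toggled through their switch services; the exact service names below are an assumption based on the default xbot2 ROS API, so list them first and adjust accordingly:
rosservice list | grep switch
# assuming the default layout /xbotcore/<plugin>/switch (std_srvs/SetBool):
rosservice call /xbotcore/homing/switch 1
rosservice call /xbotcore/ros_control/switch 1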
Note: Set the motion_controller_launch to use "dummy" instead of "real".
To run the simulation we offer two options, where option 1 is the preferred one:
- Single combined launch:
roslaunch repair_gazebo repair_gazebo_gazebo.launch
- Manual launch of each component:
roslaunch repair_gazebo repair_gazebo.launch
roslaunch repair_gazebo control_utils.launch
/bin/python /home/ws/src/repair_ros_robot/repair_gazebo/src/xbot_to_gazebo.py
/bin/python /home/ws/src/repair_ros_robot/repair_gazebo/src/republisher_xbot_to_hand.py
To spawn a Fresco piece inside the Gazebo simulation, run:
/home/ws/src/repair_ros_robot/repair_interface/scripts/launch_fresco.py
Every time this command is repeated, the fresco will be respawned at the same position.
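If you want to remove a previously spawned piece instead of respawning on top of it, the standard gazebo_ros delete_model service can be used; the model name below is only a placeholder, so check the actual name in the Gazebo model list first:
rosservice call /gazebo/delete_model "model_name: 'fresco_piece'"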
To run the experiment, run the following commands in two separate terminals:
Terminal 1:
/bin/python /home/ws/src/repair_ros_robot/repair_interface/scripts/sand_recognition.py --use_gazebo
Terminal 2:
/bin/python /home/ws/src/repair_ros_robot/repair_interface/scripts/moveit_multi_fresco.py --use_gazebo
Offset values and other parameters for the pipeline are specified in
repair_interface/scripts/configs/gazebo_pipeline_config.yaml
Note: You can turn on/off the gazing of the inactive hand towards the active hand in repair_motion_control_server.py.
You can disable the link attachment logic by removing the functions attach_links and detach_links in moveit_multi_fresco_cleaned_gazebo.py.
You can hardcode the result of grasping to True to bypass the grasp simulation.
Use the following code for arm motion:
publish_tf_np(arm_target_pose_np, child_frame='arm_grasp_pose')
self.move_arm(self.arm, arm_target_pose_np)
Reset robot to home pose:
self.go_home_pose()
Controlling the real robot requires XBot2. A dummy mode is also available to emulate the real robot interface, which is ideal for testing MoveIt and RViz without ros_control.
Set your environment to connect to the robot's ROS master. Add this to your .bashrc:
export ROS_MASTER_URI=http://{robot_IP}:11311
export ROS_IP={local_IP}
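For example, if the robot's ROS master runs at 192.168.0.100 and your machine's address on the same network is 192.168.0.42 (both are placeholder values; substitute your own):
export ROS_MASTER_URI=http://192.168.0.100:11311
export ROS_IP=192.168.0.42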
Then, source your .bashrc:
source ~/.bashrc
ssh -X {username}@{robot_IP}
rostopic list
# If not running:
systemctl --user restart roscore.service
ecat_master
xbot2-core --hw ec_pos
# Or for idle mode:
# xbot2-core --hw idle
xbot2-gui
roslaunch repair_motion_controller bringup_motion_controller.launch
roslaunch repair_moveit_xbot bringup_moveit.launch
rosrun repair_interface moveit_client.py
roslaunch realsense2_camera demo_pointcloud_new.launch serial_no:=f1061874
Download Fresco recognition models and run:
rosrun repair_interface sand_recognition.py
Available Models:
model_name:="best_3pieces_15epochs_larger_batch.pt" # Group 89 (robust, 3 classes)
model_name:="best_mix.pt" # Group 15 and 29
Note: best_3pieces_15epochs_larger_batch.pt is robust but supports only 3 IDs. best_g89_15epochs_larger_batch.pt detects more IDs but may be less reliable. Consider reducing the conf_debug threshold (line 299) to increase fragment detection.
Run the multi-fragment pick & place pipeline:
rosrun repair_interface moveit_multi_fresco.py
Offset values and other parameters for the pipeline are specified in
repair_interface/scripts/configs/real_pipeline_config.yaml
Note: Use sh_version options: v1_2_research, v1_wide, mixed_hands.
Ask Luca Palmieri for the following resources:
- Frescos RPf_00123 to RPf_001266 should be placed in:
repair_ros_robot/repair_urdf/sdf
/home/.gazebo/models
- The fragment database directory fragments_db should be added to:
/home/.gazebo/
- List all topics:
rostopic list
- Send joint commands (excluding SoftHand):
/xbotcore/command
- Read joint states (excluding SoftHand):
/xbotcore/joint_states
- SoftHand commands:
/left_hand_v1s/synergy_command
/right_hand_v1s/synergy_command
- Finger states:
/left_hand_v1s/{fingername}_state
/right_hand_v1s/{fingername}_state
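For example, to watch the joint stream and to check the message type of a command topic before publishing to it (topic names taken from the list above):
rostopic echo /xbotcore/joint_states
rostopic info /right_hand_v1s/synergy_command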
longest_valid_segment_fraction: 0.00005
In repair_moveit_config_v2/config/joint_limits.yaml:
default_velocity_scaling_factor: 0.1
default_acceleration_scaling_factor: 0.1
Alternatively, use the Motion Planning tab in RViz.
T.B.A.