vS-Graphs is inspired by LiDAR S-Graphs and extends ORB-SLAM 3.0 by integrating optimizable 3D scene graphs, improving mapping and localization accuracy through scene understanding. It enriches the scene representation with building components (i.e., wall and ground surfaces) and inferred structural elements (i.e., rooms and corridors), making SLAM more robust and efficient.
The diagram below shows the detailed architecture of the vS-Graphs framework, highlighting the key threads and their interactions. Modules with a light gray background are inherited directly from the baseline (ORB-SLAM 3.0), while the remaining components are newly added or modified.
For system requirements, dependencies, and setup instructions, refer to the Installation Guide.
You can read about the SLAM-related configuration parameters (independent of the ROS2 wrapper) in the config folder. These configurations can be modified in the `system_params.yaml` file. For more information on ROS-related configurations and usage, see the ROS parameter documentation page.
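As a purely illustrative sketch (the key names below are hypothetical placeholders, not the project's actual schema — consult the shipped `system_params.yaml` for the real parameters), an override file might look like:

```yaml
# HYPOTHETICAL example of config/system_params.yaml — key names are
# illustrative only; check the file shipped with vS-Graphs for the real schema.
semantic_segmentation:
  enable: true          # toggle the building-component (wall/ground) detector
structural_elements:
  detect_rooms: true    # requires the voxblox_skeleton cluster pipeline
marker_based_labels:
  use_aruco: false      # set true when an ArUco marker database is available
```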
Once you have installed the required dependencies and configured the parameters, you are ready to run vS-Graphs! Follow the steps below to get started:
- Source vS-Graphs and run it with `ros2 launch vs_graphs vsgraphs_rgbd.launch.py`. It will automatically run the vS-Graphs core and the semantic segmentation module for building component (walls and ground surfaces) recognition.
- (Optional) If you also intend to detect structural elements (rooms and corridors), run the cluster-based solution using `ros2 launch voxblox_skeleton skeletonize_map_vsgraphs.launch 2>/dev/null`. In this case, you need to source `voxblox` with the `--extend` option, and then launch the framework:

  ```shell
  source /opt/ros/jazzy/setup.bash && source ~/[VSGRAPHS_PATH]/install/setup.bash && source ~/[VOXBLOX_PATH]/install/setup.bash --extend && ros2 launch vs_graphs vsgraphs_rgbd.launch.py
  ```

- (Optional) If you have a database of ArUco markers representing room/corridor labels, do not forget to run `aruco_ros` using `ros2 launch aruco_ros marker_publisher.launch`.
- Now, play a recorded `bag` file by running `ros2 bag play [sample].bag --clock`.
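The source-and-launch sequence above can be wrapped in a small helper script. This is a sketch, not part of vS-Graphs: the `VSGRAPHS_PATH`/`VOXBLOX_PATH` variables and the `--run` flag are assumptions of this example, and by default it only prints the commands (dry run) so you can inspect them before executing.

```shell
#!/usr/bin/env bash
# HYPOTHETICAL launch helper for the steps above (not shipped with vS-Graphs).
# By default it prints the commands (dry run); pass --run to execute them.
set -euo pipefail

vsgraphs_cmds() {
  # VSGRAPHS_PATH / VOXBLOX_PATH are placeholders for your local workspaces.
  printf '%s\n' \
    "source /opt/ros/jazzy/setup.bash" \
    "source ${VSGRAPHS_PATH:-~/vs_graphs}/install/setup.bash" \
    "source ${VOXBLOX_PATH:-~/voxblox}/install/setup.bash --extend" \
    "ros2 launch vs_graphs vsgraphs_rgbd.launch.py"
}

if [ "${1:-}" = "--run" ]; then
  # Join the commands with ';' and eval in the current shell so sourcing persists.
  eval "$(vsgraphs_cmds | tr '\n' ';')"
else
  vsgraphs_cmds  # dry run: just show what would be executed
fi
```

Keeping the default as a dry run avoids accidentally launching nodes on a machine where the workspaces are not built yet.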
✨ For a complete list of configurable launch arguments, check the Launch Parameters.
✨ For a detailed description of how to use a RealSense D400 series camera for live feed and data collection, check this page.
🛎️ Note: The current version of vS-Graphs supports ROS2 Jazzy and is primarily tested on Ubuntu 24.04.2 LTS.
For a fully reproducible and environment-independent setup, check the Docker section.
To evaluate vS-Graphs against other visual SLAM frameworks, read the evaluation and benchmarking documentation.
```bibtex
@article{tourani2025vsgraphs,
  title={vS-Graphs: Integrating Visual SLAM and Situational Graphs through Multi-level Scene Understanding},
  author={Tourani, Ali and Ejaz, Saad and Bavle, Hriday and Morilla-Cabello, David and Sanchez-Lopez, Jose Luis and Voos, Holger},
  journal={arXiv preprint arXiv:2503.01783},
  year={2025},
  doi={10.48550/arXiv.2503.01783}
}
```
- 🔧 LiDAR S-Graphs
- 🎞️ Scene Segmentor (ROS2 Jazzy)
This project is licensed under the GPL-3.0 license; see the LICENSE file for more details.