
Visual S-Graphs (vS-Graphs)


vS-Graphs is inspired by LiDAR S-Graphs and extends ORB-SLAM 3.0 by integrating optimizable 3D scene graphs, enhancing mapping and localization accuracy through scene understanding. It enriches the scene representation with building components (i.e., wall and ground surfaces) and inferred structural elements (i.e., rooms and corridors), making SLAM more robust and efficient.

🧠 vS-Graphs Architecture

The diagram below shows the detailed architecture of the vS-Graphs framework, highlighting the key threads and their interactions. Modules with a light gray background are inherited directly from the baseline (ORB-SLAM 3.0), while the remaining components are newly added or modified.

vS-Graphs Flowchart

⚙️ Prerequisites and Installation

For system requirements, dependencies, and setup instructions, refer to the Installation Guide.

🔨 Configurations

The SLAM-related configuration parameters (independent of the ROS2 wrapper) are documented in the config folder and can be modified in the system_params.yaml file. For more information on ROS2-related configurations and usage, see the ROS parameter documentation page.
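As a minimal sketch of adjusting these parameters before a run (assuming the repository is cloned into a ROS2 workspace at ~/ros2_ws/src/visual_sgraphs; adjust the paths to your own layout):

    # Edit the SLAM parameters (independent of the ROS2 wrapper)
    nano ~/ros2_ws/src/visual_sgraphs/config/system_params.yaml
    # If the configuration is installed rather than read from the source tree,
    # rebuild the package so the changes take effect (assumption; see the config folder):
    cd ~/ros2_ws && colcon build --packages-select vs_graphs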

🚀 Getting Started

Once you have installed the required dependencies and configured the parameters, you are ready to run vS-Graphs! Follow the steps below to get started (a consolidated example session is sketched after the list):

  1. Source vS-Graphs and launch it with ros2 launch vs_graphs vsgraphs_rgbd.launch.py. This automatically starts the vS-Graphs core and the semantic segmentation module that recognizes building components (walls and ground surfaces).

  2. (Optional) If you also intend to detect structural elements (rooms and corridors), run the cluster-based solution with ros2 launch voxblox_skeleton skeletonize_map_vsgraphs.launch 2>/dev/null.

    • In this case, you need to source voxblox with the --extend flag and then launch the framework:
    source /opt/ros/jazzy/setup.bash &&
    source ~/[VSGRAPHS_PATH]/install/setup.bash &&
    source ~/[VOXBLOX_PATH]/install/setup.bash --extend &&
    ros2 launch vs_graphs vsgraphs_rgbd.launch.py
  3. (Optional) If you have a database of ArUco markers representing room/corridor labels, do not forget to run aruco_ros using ros2 launch aruco_ros marker_publisher.launch.

  4. Now, play a recorded bag file by running ros2 bag play [sample].bag --clock.
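✨ Putting the steps together, a typical session looks like the sketch below (one block per terminal). The bracketed paths are placeholders for your workspace locations, and Terminals 2 and 3 are only needed for structural-element detection and marker-based labels, respectively. Source the appropriate setup.bash files in every terminal, as shown for Terminal 1:

    # Terminal 1: vS-Graphs core + semantic segmentation (step 1; voxblox sourced with --extend as in step 2)
    source /opt/ros/jazzy/setup.bash
    source ~/[VSGRAPHS_PATH]/install/setup.bash
    source ~/[VOXBLOX_PATH]/install/setup.bash --extend
    ros2 launch vs_graphs vsgraphs_rgbd.launch.py

    # Terminal 2 (optional): cluster-based room/corridor detection (step 2)
    ros2 launch voxblox_skeleton skeletonize_map_vsgraphs.launch 2>/dev/null

    # Terminal 3 (optional): ArUco marker detection for room/corridor labels (step 3)
    ros2 launch aruco_ros marker_publisher.launch

    # Terminal 4: replay a recorded dataset (step 4)
    ros2 bag play [sample].bag --clock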

✨ For a complete list of configurable launch arguments, check the Launch Parameters.

✨ For a detailed description of how to use a RealSense D400 series camera for live feed and data collection, check this page.
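✨ As a rough sketch of a live-feed setup (assuming the standard realsense2_camera ROS2 driver; the exact parameters and topic remappings vS-Graphs expects are described on the page above):

    # Terminal 1: start the RealSense D400 driver with depth aligned to the color frame
    ros2 launch realsense2_camera rs_launch.py align_depth.enable:=true
    # Terminal 2: start vS-Graphs as usual
    ros2 launch vs_graphs vsgraphs_rgbd.launch.py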

🛎️ Note: The current version of vS-Graphs supports ROS2 Jazzy and is primarily tested on Ubuntu 24.04.2 LTS.

🐋 Docker

For a fully reproducible and environment-independent setup, check the Docker section.
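As a minimal sketch, assuming a Dockerfile at the repository root (the image tag and GUI-forwarding flags below are illustrative; follow the Docker section for the exact supported workflow):

    # Build an image from the repository root (hypothetical tag)
    docker build -t vsgraphs .
    # Allow the container to open RViz windows on the host's X server, then run it
    xhost +local:docker
    docker run -it --rm --net=host -e DISPLAY=$DISPLAY \
        -v /tmp/.X11-unix:/tmp/.X11-unix vsgraphs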

📏 Benchmarking

To evaluate vS-Graphs against other visual SLAM frameworks, read the evaluation and benchmarking documentation.

📚 Citation

@article{tourani2025vsgraphs,
  title={vS-Graphs: Integrating Visual SLAM and Situational Graphs through Multi-level Scene Understanding},
  author={Tourani, Ali and Ejaz, Saad and Bavle, Hriday and Morilla-Cabello, David and Sanchez-Lopez, Jose Luis and Voos, Holger},
  journal={arXiv preprint arXiv:2503.01783},
  year={2025},
  doi={https://doi.org/10.48550/arXiv.2503.01783}
}

📎 Related Repositories

🔑 License

This project is licensed under the GPL-3.0 license - see the LICENSE file for more details.
