This repository implements state estimation for a legged robot's main body using a body-mounted IMU and the joint encoders [1, 2]. We use a ROS 2 rosbag collected from a Unitree Go2 robot. A ROS 2 environment is not required, since we decode the rosbag into numpy arrays with Python.
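For reference, decoding boils down to converting each `/lowstate` message into plain numbers and stacking them into numpy arrays for the filter. Below is a minimal sketch of that stacking step; the tuple layout, field order, and the 12-joint assumption for a quadruped are our own illustrative choices, not the repository's actual data format:

```python
import numpy as np

def stack_samples(samples):
    """Stack decoded (t, gyro, accel, joint_pos) tuples into numpy arrays.

    samples -- list of tuples:
      t         : timestamp in seconds (float)
      gyro      : angular rate, length-3 sequence (rad/s)
      accel     : specific force, length-3 sequence (m/s^2)
      joint_pos : joint encoder angles, length-12 sequence (rad, quadruped)
    Returns a dict of time-sorted numpy arrays.
    """
    t = np.array([s[0] for s in samples], dtype=np.float64)
    order = np.argsort(t)  # rosbag messages may arrive out of order
    return {
        "t": t[order],
        "gyro": np.array([s[1] for s in samples], dtype=np.float64)[order],
        "accel": np.array([s[2] for s in samples], dtype=np.float64)[order],
        "joint_pos": np.array([s[3] for s in samples], dtype=np.float64)[order],
    }
```

Once the data is in this form, the estimator can be run completely outside ROS 2.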
Jun. 2025 🎉🎉 Our DogLegs paper, which extends this work with multiple leg-mounted IMUs, has been accepted to IROS 2025. Please consider citing our paper if you find this project helpful for your research.
```bibtex
@inproceedings{wu2025iros,
  author={Wu, Yibin and Kuang, Jian and Khorshidi, Shahram and Niu, Xiaoji and Klingbeil, Lasse and Bennewitz, Maren and Kuhlmann, Heiner},
  booktitle={IEEE/RSJ Int. Conf. Intell. Robots Syst. (IROS)},
  title={{DogLegs}: Robust Proprioceptive State Estimation for Legged Robots Using Multiple Leg-Mounted IMUs},
  year={2025}
}
```

Download the Unitree Go2 rosbag with this link.
Make sure you have the right Python environment before running the code.
```shell
python -u main.py --config "your/path/to/go2.yaml" --bagpath "your/path/to/rosbag2_2025_02_19-22_46_51" --topic /lowstate
```

Here, `rosbag2_2025_02_19-22_46_51` is the provided test file.
The estimated trajectory for the test file should be:
[1] M. Bloesch, M. Hutter, M. A. Hoepflinger, S. Leutenegger, C. Gehring, C. D. Remy, and R. Siegwart, “State estimation for legged robots: Consistent fusion of leg kinematics and IMU,” in Proc. Robot.: Sci. Syst., pp. 17–24, 2013.
[2] M. Bloesch, C. Gehring, P. Fankhauser, M. Hutter, M. A. Hoepflinger, and R. Siegwart, “State estimation for legged robots on unstable and slippery terrain,” in Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst., pp. 6058–6064, 2013.