This paper has been accepted to IEEE Robotics and Automation Letters, and it is available at https://arxiv.org/abs/2503.12101
Branch note (unitree_sdk): this branch is developed for full compatibility with the Unitree SDK and ROS2 communication. The code has been tested on a Unitree Go2 robot in both simulation and real-world experiments.
The muse package provides a ROS node and utilities for estimating the state of a quadruped robot using sensor data. It includes algorithms for state estimation, sensor fusion, and filtering.
This version of the code provides a proprioceptive state estimator for quadruped robots. The necessary inputs are:
- IMU measurements
- joint states
- forces exerted on the feet
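As a toy illustration of how one of these proprioceptive inputs is typically used in a leg-odometry pipeline, foot contact can be inferred by thresholding the measured foot forces. This is a sketch only, not MUSE's actual filter; the threshold value and leg names are made up:

```python
# Toy sketch: infer per-leg contact from measured foot forces.
# NOT MUSE's actual algorithm; threshold and leg names are illustrative.

CONTACT_FORCE_THRESHOLD_N = 20.0  # hypothetical contact threshold [N]

def detect_contacts(foot_forces_z):
    """Map per-leg vertical foot forces [N] to boolean contact flags."""
    return {leg: fz > CONTACT_FORCE_THRESHOLD_N
            for leg, fz in foot_forces_z.items()}

forces = {"LF": 85.0, "RF": 3.1, "LH": 92.4, "RH": 0.5}
print(detect_contacts(forces))  # legs bearing load are flagged as in contact
```

Legs flagged as in contact can then anchor the kinematic (leg-odometry) velocity estimate that gets fused with the IMU.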
To simplify setup, we provide a ready-to-use Conda environment so you do not have to manually resolve dependencies.
If you want a running simulation, follow the instructions in basic-locomotion-dls-isaaclab.
To run this code with Unitree robots, port the URDF of your robot into this folder and connect to the robot through unitree_ros2_dls, which we also recommend for controlling the robot in real-world experiments.
To install and run muse with Conda + ROS2:

- Clone the repository and create the environment:

```shell
git clone https://github.com/iit-DLSLab/muse.git -b unitree_sdk
cd muse
conda env create -f environment.yml
conda activate muse-ros2
```

- Fetch external dependencies (e.g. Point-LIO):

```shell
cd muse_ws
vcs import src < muse.repos
```

- Build with colcon:

```shell
cd muse_ws
colcon build --symlink-install
source install/setup.bash
```

- Launch the state estimator package:

```shell
ros2 launch state_estimator state_estimator.launch.py
```

Or launch the full stack with Point-LIO (see Point-LIO integration):

```shell
ros2 launch muse_point_lio muse_with_point_lio.launch.py
```
To change the name of the topics, check the config folder.
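For example, a topic-name entry in one of those config files might look like the following. The file layout and parameter names here are illustrative only; check the actual files in the config folder for the real schema:

```yaml
# Hypothetical sketch of a topic-name configuration;
# the actual keys in the muse config folder may differ.
state_estimator:
  ros__parameters:
    imu_topic: /utlidar/imu
    joint_states_topic: /joint_states
    contact_forces_topic: /contact_forces
```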
To visualize your data, you can use PlotJuggler:

```shell
ros2 run plotjuggler plotjuggler
```

MUSE includes a muse_point_lio wrapper package that integrates Point-LIO ROS2 as the exteroceptive odometry source for the MultiSensorFusion plugin.
```
[Point-LIO node] → /point_lio/odometry
        ↓
[odom_bridge node] → /lidar_odometry (normalised interface)
        ↓
[MultiSensorFusion plugin] → /muse/multi_sensor_fusion
        ↓
[TfStatePublisher plugin] → TF: world → base
```
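Conceptually, the odom_bridge step just republishes Point-LIO's odometry under the frame names the fusion plugin expects. A stripped-down, ROS-free sketch of that renaming, with the message represented as a plain dict and the field and frame names purely illustrative:

```python
# ROS-free sketch of an odom-bridge normalisation step: copy an
# odometry message and rewrite its frame IDs for the downstream consumer.
# The dict layout and frame names are illustrative, not the actual bridge code.

def normalise_odometry(msg, target_frame="odom", child_frame="base"):
    """Return a copy of the odometry dict with normalised frame IDs."""
    out = dict(msg)
    out["frame_id"] = target_frame
    out["child_frame_id"] = child_frame
    return out

point_lio_msg = {"frame_id": "camera_init", "child_frame_id": "imu",
                 "pose": (0.1, 0.0, 0.3)}
print(normalise_odometry(point_lio_msg))
```

In the real node the same idea applies to a nav_msgs Odometry message, so every exteroceptive source reaches MultiSensorFusion through the same /lidar_odometry interface.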
A ready-to-use Point-LIO config for the Go2 is provided at config/go2_muse.yaml in the muse-integration branch of the forked Point-LIO repo. It sets:
- Input topics: /utlidar/cloud and /utlidar/imu (Unitree SDK2 native driver)
- Extrinsics derived from the Go2 URDF (radar_joint → imu_joint)
- Frame IDs: odom_header_frame_id: odom, odom_child_frame_id: imu
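Based on the settings listed above, the relevant part of that config looks roughly like the following. Only the topic names and frame IDs are taken from the list above; the surrounding key structure is a sketch and may not match the actual go2_muse.yaml layout:

```yaml
# Sketch of the Go2 settings in config/go2_muse.yaml; only the values
# listed above are grounded, the exact layout may differ.
common:
  lid_topic: /utlidar/cloud   # Unitree SDK2 native LiDAR driver
  imu_topic: /utlidar/imu
odometry:
  odom_header_frame_id: odom
  odom_child_frame_id: imu
# Extrinsics (LiDAR → IMU) derived from the Go2 URDF (radar_joint → imu_joint)
```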
The Point-LIO fork is fetched automatically via muse.repos (see build instructions above).
Pass point_lio_launch_file to switch to any other Point-LIO launch file:
```shell
ros2 launch muse_point_lio muse_with_point_lio.launch.py \
  point_lio_launch_file:=mapping_velody16.launch.py
```

- Extend the code to include exteroception (Point-LIO integration via muse_point_lio)
- Conda-based environment
- Support for ROS2 (ongoing)
Contributions to this repository are welcome.
If you like this work and would like to cite it (thanks):
```bibtex
@ARTICLE{10933515,
  author={Nisticò, Ylenia and Soares, João Carlos Virgolino and Amatucci, Lorenzo and Fink, Geoff and Semini, Claudio},
  journal={IEEE Robotics and Automation Letters},
  title={MUSE: A Real-Time Multi-Sensor State Estimator for Quadruped Robots},
  year={2025},
  volume={10},
  number={5},
  pages={4620-4627},
  keywords={Robots;Sensors;Robot sensing systems;Legged locomotion;Odometry;Cameras;Laser radar;Robot vision systems;Robot kinematics;Quadrupedal robots;State estimation;localization;sensor fusion;quadruped robots},
  doi={10.1109/LRA.2025.3553047}}
```
This repo is maintained by Ylenia Nisticò.