Sharpen Your Robot’s Eyes – Mastering ROS2 Sensor Fusion with EKF & AMCL

When Good Maps Go Wobbly: The Case for Sensor Fusion

So far, your Raspberry Pi mapping robot has drawn its first map with LiDAR using SLAM. But maybe the corners don’t quite match. Maybe the walls jitter, or the robot misplaces itself slightly every few meters. That’s not a failing of your LiDAR; it’s a sign you need ROS2 sensor fusion to strengthen your robot localization strategy.

Welcome to sensor fusion—where data from encoders, IMUs, and other sensors are blended to provide stable, accurate state estimation. In this post, we explore how ROS2’s Extended Kalman Filter (EKF) and Adaptive Monte Carlo Localization (AMCL) work together to minimize drift and enable precise navigation.

1. Why Sensor Fusion Matters in ROS2

Every robot must answer a simple but essential question: “Where am I?”

  • Encoders provide distance via wheel ticks.
  • IMUs track orientation and acceleration.
  • LiDARs see walls, shapes, and space.

Each sensor brings strengths—but also weaknesses. Wheel slippage, gyro drift, and scan mismatches can throw off your robot’s position, especially over time. By fusing sensor data with ROS2’s sensor fusion stack, we create a more reliable and consistent sense of the robot’s location in its environment.

2. Meet the Players

Here’s a quick breakdown of the main sensors involved:

Sensor    | Relatable Analogy   | Measures            | Common Weakness
Encoders  | Pedometer           | Wheel distance      | Slips and false ticks
IMU       | Balance board       | Tilt, spin          | Drift over time
LiDAR     | Flashlight scanner  | Distances to walls  | Noisy, requires clean environment

Individually, these sensors falter. Together, they compensate for one another.

3. Install the ROS2 Sensor Fusion Package

To begin, ensure your system includes the robot_localization package, which provides the EKF node.

sudo apt install ros-humble-robot-localization

If you’re working from source or want full access to the configuration, clone it into a ROS2 workspace and build:

mkdir -p ~/ros2_ws/src && cd ~/ros2_ws/src
git clone https://github.com/cra-ros-pkg/robot_localization.git
cd ~/ros2_ws && colcon build --packages-select robot_localization
source install/setup.bash

For comprehensive details on configuring and utilizing the robot_localization package, refer to the official ROS documentation.
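
Before moving on, you can confirm the package is visible to your sourced environment:

ros2 pkg prefix robot_localization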

4. Configure the EKF Node for ROS2 Sensor Fusion

The ekf_node is the centerpiece of ROS2 sensor fusion. It blends data from IMU and wheel encoders to produce an accurate robot state estimate.

Basic ekf.yaml Sample:

ekf_filter_node:
  ros__parameters:
    frequency: 50.0          # update rate in Hz; must be a double in ROS2
    sensor_timeout: 0.1
    two_d_mode: true
    publish_tf: true
    map_frame: map
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom

    # Each *_config vector enables fusion per state variable, in this order:
    # [x, y, z, roll, pitch, yaw,
    #  vx, vy, vz, vroll, vpitch, vyaw,
    #  ax, ay, az]
    odom0: /wheel_odom
    odom0_config: [true,  true,  false,
                   false, false, false,
                   false, false, false,
                   false, false, false,
                   false, false, false]

    imu0: /imu/data
    imu0_config: [false, false, false,
                  true,  true,  true,
                  false, false, false,
                  false, false, false,
                  false, false, false]

Adjust the odom_frame, base_link_frame, and input topics to match your specific configuration.

Run the Node:

ros2 run robot_localization ekf_node --ros-args --params-file ekf.yaml
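
For repeatable startup you may prefer a launch file. Below is a minimal sketch using launch_ros; the file name and the relative ekf.yaml path are assumptions, so point them at your own package layout:

# ekf.launch.py (name and YAML path are placeholders; adapt to your package)
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='robot_localization',
            executable='ekf_node',
            name='ekf_filter_node',
            output='screen',
            parameters=['ekf.yaml'],  # path to the config shown above
        )
    ])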

You can now verify output on the /odometry/filtered topic:

ros2 topic echo /odometry/filtered

This fused odometry combines encoder and IMU inputs, reducing the impact of gyro drift and wheel slippage.
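
To confirm the filter is publishing at the configured rate:

ros2 topic hz /odometry/filtered

The reported rate should sit close to the frequency set in ekf.yaml (50 Hz here).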

5. Add AMCL for Accurate Localization in a Known Map

Where EKF handles sensor fusion, AMCL (Adaptive Monte Carlo Localization) aligns your robot with a pre-built map, using particle filters and real-time laser scans. It continuously adjusts your robot’s pose against a known occupancy grid.

To use AMCL, make sure your map is ready (e.g., from SLAM Toolbox), then configure the following launch command:

ros2 launch nav2_bringup localization_launch.py map:=/path/to/map.yaml

Note: AMCL does not generate odometry; it refines your pose estimate within a known map using LiDAR data. It needs /scan data and a valid odom → base_link transform (published here by the EKF). AMCL itself publishes the map → odom transform, completing the map → odom → base_link chain.
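
AMCL also needs an initial pose before its particle cloud can converge. Set one with the 2D Pose Estimate tool in RViz, or publish it once from the terminal (the zero pose below is a placeholder for a robot starting at the map origin):

ros2 topic pub --once /initialpose geometry_msgs/msg/PoseWithCovarianceStamped "{header: {frame_id: map}, pose: {pose: {position: {x: 0.0, y: 0.0}, orientation: {w: 1.0}}}}"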

6. Visualizing the Fusion in RViz

To observe everything in real-time:

  • Launch RViz:
rviz2
  • Add displays for:
    • LaserScan (to view LiDAR)
    • Odometry (to see /odometry/filtered)
    • TF (to track frame alignment)
    • Pose (to see AMCL pose)

Watch how the filtered odometry remains stable even as IMU and encoders drift independently. With AMCL running, your robot adjusts its global location based on scan matches to the known map.

7. Tuning EKF and AMCL Parameters

To get the most from ROS2 sensor fusion, tuning is essential. Here’s what to watch:

For EKF:

  • sensor_timeout too low? Data drops will occur.
  • frequency too high? High CPU load.
  • Enable only the axes you trust for each sensor (see the sketch below).
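
For example, if you trust your IMU’s yaw but its roll and pitch estimates are noisy, you could narrow its config to yaw and yaw velocity only (a sketch; the vector order is the 15-variable layout shown in ekf.yaml above):

imu0_config: [false, false, false,
              false, false, true,    # fuse yaw only
              false, false, false,
              false, false, true,    # fuse yaw velocity
              false, false, false]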

For AMCL:

  • Increase laser_max_beams for better accuracy (at the cost of performance).
  • Tweak min_particles and max_particles to balance accuracy against CPU load.
  • Adjust update_min_d and update_min_a to prevent over-updating the pose (see the starting values below).
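
A reasonable starting point for these values, based on Nav2’s defaults (tune from here for your robot):

amcl:
  ros__parameters:
    min_particles: 500
    max_particles: 2000
    laser_max_beams: 60
    update_min_d: 0.25      # meters of travel before a filter update
    update_min_a: 0.2       # radians of rotation before a filter update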

A complete list of parameters is available in the AMCL configuration guide in the ROS Navigation (Nav2) documentation.

8. Real-World Test: Watch the Drift Shrink

Try this validation sequence:

  1. Drive your robot manually using teleop_twist_keyboard (see the commands after this list).
  2. Observe /wheel_odom drifting slightly in open space.
  3. Enable IMU fusion; drift slows.
  4. Turn on AMCL; the pose snaps back into place when a scan match succeeds.
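
The commands for steps 1 and 2, each in its own terminal:

ros2 run teleop_twist_keyboard teleop_twist_keyboard
ros2 topic echo /wheel_odom          # raw wheel odometry
ros2 topic echo /odometry/filtered   # fused EKF output

Comparing the two echoes side by side makes the drift, and the correction, easy to see.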

This is the real power of ROS2 sensor fusion. Each source informs and corrects the others.

9. Common Pitfalls & How to Fix Them

  • IMU Drift Persists? Double-check that your IMU’s covariance settings are realistic. Excessive trust in a cheap sensor will backfire.
  • Robot Teleports in RViz? Your TF tree might be broken. Confirm transforms from map → odom → base_link are broadcasting correctly (see the commands below).
  • Odometry Freezes? Sensor topics might be throttled or publishing without timestamps; check message headers and publish rates.
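
Two commands that expose TF problems quickly:

ros2 run tf2_tools view_frames        # saves a diagram of the current TF tree
ros2 run tf2_ros tf2_echo map base_link

If tf2_echo cannot find a transform between map and base_link, either AMCL is not publishing map → odom or the EKF is not publishing odom → base_link.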

These debugging steps will help ensure that your ROS2 sensor fusion system behaves as expected.

10. Wrapping It Up: Toward Reliable Robot Localization

By blending encoder and IMU data through EKF and refining global pose with AMCL, you’re achieving what every autonomous robot needs: consistent, repeatable localization.

This is the foundation for autonomous navigation, path planning, and real-world robotic tasks. Without fused localization, your robot is just guessing. With it, it knows exactly where it is—and where it’s going next.

Next Steps: Planning Your First Autonomous Path

With accurate localization in place, the next step is full path planning and obstacle avoidance. You’ll learn how to use Nav2 to compute safe routes across your map using your newly stable pose.

Stay tuned for the next post in our series. In the meantime, explore more tutorials and hands-on builds at Robotisim, where we break down ROS2 development for Raspberry Pi mapping robots and beyond.
