Push the Limits of Localization

The SLAM Benchmark for Challenging Environments

A continuous, server-side evaluation platform designed to test the resilience of modern SLAM systems against real-world, perceptually-degraded data. This benchmark is part of the ECMR 2025 Workshop on "SLAM in Challenging Environments".

View All Competitions

Why Participate?

A New Standard for Robustness

Go beyond clean, structured environments. We provide a benchmark with multi-modal datasets (degenerate LiDAR, forest radar, multi-spectral imagery) designed to break conventional algorithms and foster innovation.

True System-Level Evaluation

Move beyond trajectory submissions. Our platform evaluates your entire Dockerized SLAM system on our private test servers, providing a fair assessment of your algorithm's true performance.
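
To give a concrete sense of what a submission can look like, here is a purely illustrative Dockerfile sketch. The actual interface (base image, mounted dataset paths, expected trajectory output format) is defined in the benchmark's submission guidelines; install_deps.sh and run_slam.sh below are hypothetical placeholders for your own build and launch scripts.

```dockerfile
# Illustrative only: consult the submission guidelines for the real
# interface (base image, data mounts, expected output format).
# Assumed ROS base image; use whatever your SLAM stack needs.
FROM ros:noetic-ros-base
COPY . /slam_ws
# Hypothetical script that builds the SLAM system and its dependencies.
RUN cd /slam_ws && ./install_deps.sh
# The evaluation server runs the container against private sensor logs and
# collects the trajectory estimate the system writes out; run_slam.sh is a
# hypothetical launch script.
ENTRYPOINT ["/slam_ws/run_slam.sh"]
```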

A Hub for Degenerate SLAM

This is a rallying point for researchers focused on localization under ambiguity. Join a community dedicated to creating the next generation of truly challenge-aware navigation systems.

How It Works

A Living Benchmark

The August 28, 2025 deadline is for the competition associated with the workshop. However, the platform is a living benchmark and will remain open for submissions indefinitely, allowing you to track progress against new datasets and methods over time.

Open and Accessible to All

Participation is free and open to everyone. You do not need to register for the ECMR conference. All workshop competition participants will be offered an opportunity to present their system to the community online.

Evaluation Metrics

Submissions are ranked based on standard metrics, including Absolute Trajectory Error (ATE) and Relative Pose Error (RPE).
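
As a rough illustration of what these metrics measure (this is not the benchmark's evaluation code), the NumPy sketch below computes ATE as the RMSE of position error after a rigid Umeyama alignment of the estimate to the ground truth, and translational RPE as drift over a fixed frame offset. It assumes both trajectories are time-synchronized sequences of 4x4 SE(3) pose matrices.

```python
# Minimal illustrative sketch of ATE/RPE, assuming time-synchronized
# trajectories given as lists of 4x4 SE(3) pose matrices (NumPy arrays).
import numpy as np

def ate_rmse(gt, est):
    """ATE: RMSE of position error after rigidly aligning (Umeyama/Kabsch,
    no scale) the estimated trajectory to the ground truth."""
    p_gt = np.array([T[:3, 3] for T in gt])
    p_est = np.array([T[:3, 3] for T in est])
    mu_g, mu_e = p_gt.mean(axis=0), p_est.mean(axis=0)
    H = (p_est - mu_e).T @ (p_gt - mu_g)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_g - R @ mu_e
    err = p_gt - (p_est @ R.T + t)                # residuals after alignment
    return float(np.sqrt((err ** 2).sum(axis=1).mean()))

def rpe_trans_rmse(gt, est, delta=1):
    """RPE: RMSE of translational drift between pose pairs delta frames apart."""
    errs = []
    for i in range(len(gt) - delta):
        d_gt = np.linalg.inv(gt[i]) @ gt[i + delta]
        d_est = np.linalg.inv(est[i]) @ est[i + delta]
        e = np.linalg.inv(d_gt) @ d_est           # relative-pose discrepancy
        errs.append(np.linalg.norm(e[:3, 3]))
    return float(np.sqrt(np.mean(np.square(errs))))
```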

Available Datasets

The following competitions are currently open for submission. Each provides its own public training data, while the private evaluation data includes larger loops and areas not seen in the public training sets.

Datasets Sneak-Peek

Sneak-Peek: 3D LiDAR Degenerate - Fields

🤖 Check out a video teaser for one of our four available datasets: the shellby-0225-test-loop sequence from the 3D LiDAR Degenerate: Fields track.

Sneak-Peek: Radar Dataset - Viking Hill

Dataset Description

This track, recorded in a forested area at Örebro University, captures the challenging Enbuskabacken "Viking Hill" terrain during peak summer vegetation. The uneven ground, once a Viking Age burial site, provides a unique test for SLAM algorithms.

Key Sensors

  • Ouster OS0-32 (3D LiDAR)
  • Sensrad Hugin A3-Sample (4D Radar)
  • IDS Imaging uEye camera
  • Xsens MTi-30 (IMU)
  • Emlid Reach RS2+ (RTK-GNSS)

Viking Hill Area

Robot in tall grass: recorded in June, when the grass was often tall enough to obscure the robot's sensors.

Robot in dense vegetation: the robot was intentionally driven through bushes and over uneven terrain.

Video Teasers

Click on the links below to watch video previews of the Viking Hill dataset:

Contribute to the Challenge!

Have a dataset that pushes the boundaries of SLAM? We invite researchers to contribute their own challenging scenarios. Help grow the community's resources and see how other systems perform on your data.

View Contributor Guidelines

31 Users · 13 Submissions

Acknowledgments

The organization and infrastructure support of the SLAM competition would not be possible without the contributing CRL Lab members at the Czech Technical University in Prague. Huge thanks go to Vsevolod Hulchuk, Rudolf Jakub Szadkowski, Jindřiška Deckerová, Jan Bayer, Martin Škarytka, and Tomáš Lapeš. We also extend our sincere gratitude for the dataset contributions from Vladimír Kubelka (Örebro University, Sweden) for the Radar Dataset, and from Matteo Luperto (University of Milano, Italy) and Dyuman Aditya (Ecole Centrale de Nantes, France) for the Quadruped Dataset. This work was supported by the EU-funded project ROBOPROX - Robotics and advanced industrial production (reg. no. CZ.02.01.01/00/22_008/0004590). The work on the Radar Dataset was supported by the European Union's Horizon Europe Framework Programme under the RaCOON project (ID: 101106906).