Vision-based Simultaneous Localisation and Mapping (SLAM) is an AI technology that enables devices to build maps of unstructured, real-world environments while simultaneously estimating their own position within them, using low-cost cameras and efficient on-board processing.
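The core idea of joint localisation and mapping can be illustrated with a deliberately simplified sketch. The toy below estimates a robot's position and a single landmark's position on a 1D line, blending an odometry prediction with a range measurement; the function name, values, and blending weight are purely illustrative assumptions, and real visual SLAM instead tracks camera features and solves a nonlinear estimation problem (e.g. filtering or pose-graph optimisation).

```python
def slam_step(pose, landmark, odom, meas_range, alpha=0.5):
    """One toy predict/update cycle on a 1D line (illustrative only).

    pose, landmark: current estimates of robot and landmark positions
    odom: reported motion since the last step
    meas_range: measured distance from robot to landmark
    alpha: weight splitting the correction between pose and map
    """
    # Predict: dead-reckon the pose forward using odometry.
    pose_pred = pose + odom
    # Innovation: measured range minus the range predicted by the
    # current estimates. A non-zero value means pose and/or map are off.
    innovation = meas_range - (landmark - pose_pred)
    # Update: share the correction between the pose and the landmark,
    # so localisation and mapping are refined simultaneously.
    pose_new = pose_pred - alpha * innovation
    landmark_new = landmark + (1 - alpha) * innovation
    return pose_new, landmark_new


# Example: the landmark estimate starts 0.4 units too far away;
# one measurement pulls both estimates into mutual consistency.
pose, landmark = slam_step(pose=0.0, landmark=10.4,
                           odom=1.0, meas_range=9.0)
```

After the update, the estimated range `landmark - pose` matches the measurement exactly; in a full system this correction is distributed over many poses and landmarks according to their uncertainties.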

An Imperial team, led by Professor Andrew Davison and Dr Stefan Leutenegger in the Department of Computing, has made a sequence of highly influential advances in SLAM algorithms: (i) drift-free long-term 3D localisation; (ii) detailed scene reconstruction; and (iii) semantic mapping to localise objects.

These algorithms have been used as key features in Dyson’s first-ever robotic products, the Dyson 360 Eye and Heurist robot vacuum cleaners. Imperial startup SLAMcore is commercialising the technology for a broad range of further applications in commercial and consumer robotics. SLAM is also used for tracking and mapping in virtual and augmented reality, and Imperial’s algorithms have contributed to Microsoft’s Kinect and HoloLens products as well as systems at Meta/Oculus via the acquisition of startup Surreal Vision.