Imperial College London
Faculty of Engineering, Department of Computing

Senior Lecturer

+44 (0)20 7594 7123
s.leutenegger

360, ACE Extension, South Kensington Campus

BibTeX format

@article{Whelan2016IJRR,
author = {Whelan, T and Salas-Moreno, RF and Glocker, B and Davison, AJ and Leutenegger, S},
doi = {10.1177/0278364916669237},
journal = {International Journal of Robotics Research},
pages = {1697--1716},
title = {ElasticFusion: real-time dense SLAM and light source estimation},
volume = {35},
year = {2016}
}
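
As a rough sketch in plain Python (no third-party libraries assumed; the citation key and the regular expression are illustrative, and real-world BibTeX with nested braces or string macros needs a dedicated parser), the `key = {value}` fields of a flat entry like the one above can be extracted as follows:

```python
import re

# A single flat BibTeX entry, as published above.
# The citation key "Whelan2016IJRR" is an assumption for illustration.
ENTRY = """
@article{Whelan2016IJRR,
author = {Whelan, T and Salas-Moreno, RF and Glocker, B and Davison, AJ and Leutenegger, S},
doi = {10.1177/0278364916669237},
journal = {International Journal of Robotics Research},
pages = {1697--1716},
title = {ElasticFusion: real-time dense SLAM and light source estimation},
volume = {35},
year = {2016}
}
"""

# Matches one "name = {value}" field per line; does not handle nested braces.
FIELD = re.compile(r"^\s*(\w+)\s*=\s*\{(.*)\},?\s*$", re.MULTILINE)

def parse_fields(entry: str) -> dict:
    """Return a dict of lowercased field name -> value for a flat entry."""
    return {key.lower(): value for key, value in FIELD.findall(entry)}

fields = parse_fields(ENTRY)
print(fields["journal"])  # International Journal of Robotics Research
print(fields["pages"])    # 1697--1716
```

This is only a sketch for simple records; anything beyond flat single-line fields should go through a proper BibTeX library.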

RIS format (EndNote, RefMan)

AB - We present a novel approach to real-time dense visual SLAM. Our system is capable of capturing comprehensive dense globally consistent surfel-based maps of room scale environments and beyond explored using an RGB-D camera in an incremental online fashion, without pose graph optimisation or any post-processing steps. This is accomplished by using dense frame-to-model camera tracking and windowed surfel-based fusion coupled with frequent model refinement through non-rigid surface deformations. Our approach applies local model-to-model surface loop closure optimisations as often as possible to stay close to the mode of the map distribution, while utilising global loop closure to recover from arbitrary drift and maintain global consistency. In the spirit of improving map quality as well as tracking accuracy and robustness, we furthermore explore a novel approach to real-time discrete light source detection. This technique is capable of detecting numerous light sources in indoor environments in real-time as a user handheld camera explores the scene. Absolutely no prior information about the scene or number of light sources is required. By making a small set of simple assumptions about the appearance properties of the scene our method can incrementally estimate both the quantity and location of multiple light sources in the environment in an online fashion. Our results demonstrate that our technique functions well in many different environments and lighting configurations. We show that this enables (a) more realistic augmented reality (AR) rendering; (b) a richer understanding of the scene beyond pure geometry; and (c) more accurate and robust photometric tracking.
AU - Whelan,T
AU - Salas-Moreno,RF
AU - Glocker,B
AU - Davison,AJ
AU - Leutenegger,S
DO - 10.1177/0278364916669237
EP - 1716
PY - 2016///
SN - 1741-3176
SP - 1697
TI - ElasticFusion: real-time dense SLAM and light source estimation
T2 - International Journal of Robotics Research
VL - 35
ER -
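
The RIS record above can be read with a similarly small sketch in plain Python (again only an illustration: repeated tags such as `AU` are collected into lists, and the loop stops at the `ER` end-of-record tag; a real tool should use a dedicated RIS parser):

```python
# The RIS record as published above, embedded for a self-contained example.
RIS = """\
AU - Whelan,T
AU - Salas-Moreno,RF
AU - Glocker,B
AU - Davison,AJ
AU - Leutenegger,S
DO - 10.1177/0278364916669237
EP - 1716
PY - 2016///
SN - 1741-3176
SP - 1697
TI - ElasticFusion: real-time dense SLAM and light source estimation
T2 - International Journal of Robotics Research
VL - 35
ER -
"""

def parse_ris(text: str) -> dict:
    """Collect tag -> list of values for one RIS record (stops at ER)."""
    record: dict = {}
    for line in text.splitlines():
        if len(line) < 2:
            continue
        tag = line[:2]          # RIS tags are two characters
        if tag == "ER":         # end of record
            break
        # Split on the first hyphen only, so values like DOIs or ISSNs
        # that contain hyphens are preserved intact.
        _, _, value = line.partition("-")
        value = value.strip()
        if value:
            record.setdefault(tag, []).append(value)
    return record

rec = parse_ris(RIS)
print(len(rec["AU"]))   # 5
print(rec["T2"][0])     # International Journal of Robotics Research
```

Collecting every tag into a list keeps the five `AU` author lines without special-casing which tags may repeat.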