DeepFactors
A real-time dense visual SLAM system capable of building dense keyframe maps of room-scale environments with a single RGB camera.

The ability to estimate rich geometry and camera motion from monocular imagery is fundamental to future interactive robotics and augmented reality applications. Different approaches have been proposed that vary in scene geometry representation (sparse landmarks, dense maps), the consistency metric used for optimising the multi-view problem, and the use of learned priors. We present a SLAM system that unifies these methods in a probabilistic framework while still maintaining real-time performance. This is achieved through the use of a learned compact depth map representation and a reformulation of three different types of error (photometric, reprojection and geometric) so that they can be used within standard factor graph software. We evaluate our system on trajectory estimation and depth reconstruction on real-world sequences and present various examples of estimated dense geometry.
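For intuition, the joint estimation described above can be sketched as a single cost over keyframe poses and compact depth codes. The decoder notation and the exact arguments of each term below are illustrative assumptions, not the paper's precise formulation:

\[
E(\{T_i\}, \{c_i\}) \;=\; \sum_{(i,j)} \Big( E_{\text{photo}}(T_i, T_j, c_i) + E_{\text{rep}}(T_i, T_j, c_i) + E_{\text{geo}}(T_i, T_j, c_i, c_j) \Big),
\qquad D_i = f_\theta(c_i; I_i),
\]

where \(T_i\) is the pose of keyframe \(i\), \(c_i\) is its compact depth code, and \(D_i\) is the dense depth map decoded from the code and the keyframe image \(I_i\). Each summand corresponds to one factor type (photometric, reprojection, geometric) connecting variables in the factor graph, and the whole cost is minimised jointly over poses and codes.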


Jan Czarnowski, Tristan Laidlow, Ronald Clark, Andrew J. Davison. DeepFactors: Real-Time Probabilistic Dense Monocular SLAM. IEEE Robotics and Automation Letters (RA-L), 2020


The DeepFactors software is available through the link on the right and is free to use for non-commercial purposes. The full terms and conditions that govern its use are detailed here.