Imperial College London


Faculty of Engineering, Department of Computing

BibTeX format

@inproceedings{li2018interiornet,
  author = {Li, W and Saeedi, Gharahbolagh S and McCormac, J and Clark, R and Tzoumanikas, D and Ye, Q and Tang, R and Leutenegger, S},
  publisher = {BMVC},
  title = {InteriorNet: Mega-scale Multi-sensor Photo-realistic Indoor Scenes Dataset},
  url = {},
  year = {2018}
}

RIS format (EndNote, RefMan)

TY - CONF
AB - Datasets have gained an enormous amount of popularity in the computer vision community, from training and evaluation of Deep Learning-based methods to benchmarking Simultaneous Localization and Mapping (SLAM). Without a doubt, synthetic imagery bears a vast potential due to scalability in terms of amounts of data obtainable without tedious manual ground truth annotations or measurements. Here, we present a dataset with the aim of providing a higher degree of photo-realism, larger scale, more variability as well as serving a wider range of purposes compared to existing datasets. Our dataset leverages the availability of millions of professional interior designs and millions of production-level furniture and object assets – all coming with fine geometric details and high-resolution texture. We render high-resolution and high frame-rate video sequences following realistic trajectories while supporting various camera types as well as providing inertial measurements. Together with the release of the dataset, we will make an executable program of our interactive simulator software as well as our renderer available. To showcase the usability and uniqueness of our dataset, we show benchmarking results of both sparse and dense SLAM algorithms.
AU - Li, W
AU - Saeedi, Gharahbolagh S
AU - McCormac, J
AU - Clark, R
AU - Tzoumanikas, D
AU - Ye, Q
AU - Tang, R
AU - Leutenegger, S
PY - 2018///
TI - InteriorNet: Mega-scale Multi-sensor Photo-realistic Indoor Scenes Dataset
UR -
ER -