Imperial College London

Professor Andrew Davison

Faculty of Engineering, Department of Computing

Professor of Robot Vision
 
 
 

Contact

 

+44 (0)20 7594 8316 / a.davison

 
 

Assistant

 

Ms Lucy Atthis +44 (0)20 7594 8259

 

Location

 

Room 303, William Penney Laboratory, South Kensington Campus



Publications

Citation

BibTeX format

@inproceedings{Haughton:2023,
author = {Haughton, I and Sucar, E and Mouton, A and Johns, E and Davison, AJ},
pages = {118--127},
title = {Real-time mapping of physical scene properties with an autonomous robot experimenter},
url = {http://hdl.handle.net/10044/1/106523},
year = {2023}
}

RIS format (EndNote, RefMan)

TY  - CPAPER
AB - Neural fields can be trained from scratch to represent the shape and appearance of 3D scenes efficiently. It has also been shown that they can densely map correlated properties such as semantics, via sparse interactions from a human labeller. In this work, we show that a robot can densely annotate a scene with arbitrary discrete or continuous physical properties via its own fully-autonomous experimental interactions, as it simultaneously scans and maps it with an RGB-D camera. A variety of scene interactions are possible, including poking with force sensing to determine rigidity, measuring local material type with single-pixel spectroscopy or predicting force distributions by pushing. Sparse experimental interactions are guided by entropy to enable high efficiency, with tabletop scene properties densely mapped from scratch in a few minutes from a few tens of interactions.
AU - Haughton,I
AU - Sucar,E
AU - Mouton,A
AU - Johns,E
AU - Davison,AJ
EP - 127
PY - 2023///
SP - 118
TI - Real-time mapping of physical scene properties with an autonomous robot experimenter
UR - http://hdl.handle.net/10044/1/106523
ER -
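The abstract above notes that the robot's sparse experimental interactions are guided by entropy so that each probe is maximally informative. As a rough illustration of that idea (not the paper's implementation), the sketch below picks the candidate scene location whose predicted discrete-property distribution has the highest Shannon entropy; the function and variable names are hypothetical:

```python
import math

def entropy(probs):
    """Shannon entropy of a discrete probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_next_interaction(predictions):
    """Return the index of the candidate location whose predicted
    class distribution is most uncertain (highest entropy)."""
    return max(range(len(predictions)), key=lambda i: entropy(predictions[i]))

# Example: three candidate points with per-class probabilities
# (e.g. rigid vs. soft). The most uncertain point is probed next.
candidates = [
    [0.95, 0.05],   # confident prediction
    [0.50, 0.50],   # maximally uncertain
    [0.80, 0.20],
]
print(select_next_interaction(candidates))  # 1
```

In the paper's setting the predictions would come from the neural field's dense property map, so a single probe at the selected point lets many correlated nearby points be annotated at once.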