
ABSTRACT: In this talk I will present an asymmetric framework for
real-time dense localisation and mapping. In the first part it will be
shown how large-scale dense photometric models are acquired using
RGB-D sensors, including both multi-camera and Kinect devices. Several multi-camera systems for automatically acquiring 3D spherical photometric models with depth will be presented.

In the second part it will be shown how different sensors (monocular,
stereo or Kinect) may then be used with these prior models to perform
robust localisation in dynamic environments. The proposed approach to
handling dynamic changes in the scene combines the prior dense
photometric model with online visual odometry. In particular, it will
be shown how the technique accounts for large illumination variations
and thereby improves direct techniques, which are intrinsically
sensitive to illumination change. This is achieved by exploiting the
relative advantages of both model-based and visual odometry techniques
for tracking. Direct 6-DOF tracking is performed by an accurate method
that iteratively minimises dense image measurements using non-linear optimisation.
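The direct minimisation mentioned above can be illustrated with a toy one-dimensional analogue: Gauss-Newton iterations on a dense photometric residual, where the 6-DOF warp of the talk is replaced by a single translation parameter. This is a minimal sketch of the principle, not the speaker's system; the signal, shift and helper names are made up for illustration.

```python
import numpy as np

def sample(signal, x):
    """Linearly interpolate a 1-D intensity signal at fractional positions x."""
    x = np.clip(x, 0.0, len(signal) - 1.001)
    i = np.floor(x).astype(int)
    a = x - i
    return (1.0 - a) * signal[i] + a * signal[i + 1]

def direct_align(ref, cur, shift=0.0, iters=30):
    """Estimate the shift t minimising the dense photometric error
    sum_x (cur(x + t) - ref(x))^2 with Gauss-Newton iterations."""
    xs = np.arange(1, len(ref) - 1, dtype=float)
    for _ in range(iters):
        r = sample(cur, xs + shift) - ref[1:-1]        # photometric residual
        J = (sample(cur, xs + shift + 0.5)             # Jacobian dr/dt by
             - sample(cur, xs + shift - 0.5))          # central differences
        shift -= (J @ r) / (J @ J + 1e-12)             # normal-equation step
    return shift

# Synthetic check: `cur` is `ref` shifted by 0.4 samples, so the recovered
# parameter should be close to 0.4.
f = lambda u: np.sin(0.1 * u) + 0.5 * np.sin(0.23 * u)
i = np.arange(200, dtype=float)
ref, cur = f(i), f(i - 0.4)
t = direct_align(ref, cur)
```

In the full 6-DOF case the scalar shift becomes a pose increment and the Jacobian stacks image gradients chained with the warp derivatives, but the normal-equation update has the same shape.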

If time allows, I will also present a generalised imaging model for
visual servoing which makes it possible to combine monocular cameras,
multi-camera systems and non-central projection cameras within a single
framework. In this case cameras are modelled as sets of 3D viewing
rays. This model leads to a new generalised visual servoing control
formalism that can be applied to any type of imaging system, whether
multi-camera, catadioptric, non-central, etc. An example will be
presented with a pair of non-overlapping cameras.
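The ray-set idea can be sketched in a few lines of numpy: each pixel maps to a world-frame ray (origin, direction), and a generalised camera is simply the union of the ray sets of its component cameras. The intrinsics, poses and 20 cm offset below are illustrative values, not taken from the talk.

```python
import numpy as np

def pinhole_rays(K, R, t, pixels):
    """World-frame viewing rays (origin, direction) for a pinhole camera
    with intrinsics K and extrinsics (R, t), projection x ~ K (R X + t).
    All rays of a single pinhole share the camera centre -R^T t."""
    centre = -R.T @ t
    d_cam = np.linalg.solve(K, np.column_stack([pixels, np.ones(len(pixels))]).T)
    dirs = (R.T @ d_cam).T
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    return np.tile(centre, (len(pixels), 1)), dirs

# Two rigidly mounted, non-overlapping pinholes: one facing +z, one facing
# -z with its centre 20 cm behind the first.
K = np.array([[500., 0., 320.], [0., 500., 240.], [0., 0., 1.]])
R_front, t_front = np.eye(3), np.zeros(3)
R_back = np.diag([-1., 1., -1.])             # 180 degree rotation about y
t_back = -R_back @ np.array([0., 0., -0.2])  # centre at (0, 0, -0.2)

pixels = np.array([[320., 240.], [100., 50.]])
o_f, d_f = pinhole_rays(K, R_front, t_front, pixels)
o_b, d_b = pinhole_rays(K, R_back, t_back, pixels)

# The generalised camera: one ray set, no single centre of projection.
origins = np.vstack([o_f, o_b])
directions = np.vstack([d_f, d_b])
```

Because every measurement is reduced to a ray in a common frame, the same servoing control law can be written once over `(origins, directions)` regardless of how many cameras contributed, which is what makes the non-overlapping pair tractable.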

Biography: Dr. Andrew Comport (http://www.i3s.unice.fr/~comport) is
“Chargé de recherche première classe (CR1)” (tenured researcher) with
the “Centre National de la Recherche Scientifique” (http://www.cnrs.fr/).
He is also associate director of the “Signal, Images, Systems”
department of the I3S Laboratory (http://www.i3s.unice.fr/) at the
University of Nice Sophia-Antipolis. From 2007 to 2009 he worked with
the ROSACE team at LASMEA at the University of Blaise Pascal. Up until
the end of 2007 he held a post-doctoral position in the AROBAS
project at INRIA Sophia-Antipolis. This post-doc project was focused
on autonomous visual navigation of a mobile vehicle and was funded by
the French national project MOBIVIP. In 2005 he completed a PhD on
robust real-time 3D tracking of rigid and articulated objects in the
Lagadic project at IRISA/INRIA in Rennes. In 2001 he worked as a
Research Assistant for Ray Jarvis in the Intelligent Robotics Research
Centre (IRRC). In 2000 he graduated with a Bachelor of Engineering (BE), majoring in Electrical and Computer Systems Engineering with Honours, from Monash University, Australia. In 1997 he graduated with a Bachelor of Science (BSc), majoring in Computer Science, also from Monash University, Australia.