General Research Themes

The Dyson Robotics Laboratory at Imperial College is focussed on pushing forward the frontier of real-time 3D computer vision and perception technology, to enable the next generation of smart real-world robots for the home and beyond.

See Professor Andrew Davison's overview of computer vision for robotics.

Our research is organised under two main themes:

Scene and Object Understanding from a Visual Robot

How can we use computer vision to enable a mobile robot, equipped with a camera, to actively explore a scene and understand its surroundings in usable 3D detail?

A mobile robot exploring a novel environment should make use of visual sensors to model, recognise and interpret its surroundings, but should do so by taking advantage of its ability to move actively through the scene and control its viewpoint.

Current state-of-the-art solutions suitable for commodity robot platforms are limited to building sparse point feature maps. Meanwhile, current academic research is pointing towards real-time dense surface reconstruction and semantically labelled "object level" representations.

In line with this direction, this theme looks to expand the frontiers of practical visual SLAM as it evolves towards general real-time scene understanding. We address challenging real-world situations such as large indoor or outdoor environments, varying lighting conditions, low-cost cameras, and fast movement.
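For concreteness, the sketch below contrasts the three kinds of map representation mentioned above as minimal Python data structures; the class names, fields, array shapes and the use of NumPy are illustrative assumptions, not descriptions of the lab's actual systems.

# A minimal sketch, assuming NumPy, of the three map representations discussed
# above. All class names, fields and shapes are illustrative assumptions.
from dataclasses import dataclass, field
import numpy as np


@dataclass
class SparseFeatureMap:
    """Sparse point-feature map: a set of 3D landmarks with descriptors."""
    points: np.ndarray = field(default_factory=lambda: np.empty((0, 3)))        # (N, 3) landmark positions
    descriptors: np.ndarray = field(default_factory=lambda: np.empty((0, 32)))  # (N, D) feature descriptors


@dataclass
class DenseSurfaceMap:
    """Dense surface reconstruction: a voxel grid of truncated signed distances."""
    tsdf: np.ndarray = field(default_factory=lambda: np.zeros((256, 256, 256)))     # signed distance per voxel
    weights: np.ndarray = field(default_factory=lambda: np.zeros((256, 256, 256)))  # fusion confidence per voxel


@dataclass
class ObjectLevelMap:
    """Semantically labelled "object level" map: per-object labels and poses."""
    labels: list = field(default_factory=list)  # semantic class per object, e.g. "chair"
    poses: list = field(default_factory=list)   # 4x4 object-to-world transforms (np.ndarray)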

Visual Manipulation

How can we use computer vision to enable a robotic arm to explore, interact with, and manipulate objects in a scene?

When robots have the capability not only to move through a scene but also to manipulate and modify it, a step change becomes possible in the applications of home and service robotics.

The state of the art in real-world mobile robot manipulation is severely limited by the lack of research integrating advanced 3D sensing with real robot platforms that can move while pushing, pulling and picking up objects in a real scene.

Vision enables manipulation, but manipulation also benefits scene understanding: as objects are poked or peeled away to clarify or reveal their contexts, the visual inference task often becomes easier.

This theme addresses the software and hardware of integrated robot platforms, with arms and end effectors that can interact precisely with surfaces and objects in a scene. We are prototyping the capabilities needed for real-world service robots, which can use hands and tools to interact with a scene in human-like ways to perform a wide variety of tasks.

Contact us

Dyson Robotics Lab at Imperial
William Penney Building
Imperial College London
South Kensington Campus
London
SW7 2AZ

Telephone: +44 (0)20 7594-7756
Email: iosifina.pournara@imperial.ac.uk