Advanced Robotics

Module aims

In this module you will have the opportunity to:

  • address topics of advanced robotics, with a focus on real-time state estimation and mapping
  • see how robotics research can be applied to drones and Augmented and Virtual Reality
  • learn about extended challenges including 6D motion estimation and control, focusing on the camera as a core sensor
  • discuss fusion with complementary sensors such as Inertial Measurement Units, which have recently become very popular
  • be provided with the understanding, mathematical tools, and practical experience that allow you to implement your own multi-sensor Simultaneous Localisation And Mapping (SLAM) algorithms for deployment on a broad range of mobile robots, drones in particular

This module builds on previous Robotics modules; that knowledge is assumed.

Learning outcomes

Upon successful completion of this module you will be able to:

  • describe the software components of a typical mobile robot, and their interactions with hardware (sensors, motors)
  • describe and evaluate multi-sensor Simultaneous Localisation and Mapping (SLAM) systems
  • model the kinematics and dynamics of wheeled and flying robots
  • model multi-sensor estimators with sparse and dense map representations
  • model different feedback-control approaches for robots and evaluate the differences
  • implement and critically evaluate estimators and feedback controllers running in real-time

Module syllabus

  • Introduction, Problem Formulation and Examples
  • Representations and Sensors
  • Kinematics and Temporal Models
  • The Extended Kalman Filter in Practice (see the equations sketched after this list)
  • Feedback Control
  • Nonlinear Least Squares
  • Vision-Based Simultaneous Localisation and Mapping
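
As a flavour of the mathematical tools covered, the Extended Kalman Filter topic rests on the standard prediction and update equations, given here in common textbook notation (the module's own notation may differ):

    \hat{x}_{k|k-1} = f(\hat{x}_{k-1|k-1}, u_k), \qquad
    P_{k|k-1} = F_k P_{k-1|k-1} F_k^\top + Q_k

    K_k = P_{k|k-1} H_k^\top \left( H_k P_{k|k-1} H_k^\top + R_k \right)^{-1}

    \hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k \left( z_k - h(\hat{x}_{k|k-1}) \right), \qquad
    P_{k|k} = (I - K_k H_k)\, P_{k|k-1}

where F_k and H_k are the Jacobians of the process model f and the measurement model h, and Q_k and R_k are the process- and measurement-noise covariances.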

Teaching methods

The taught lectures, comprising 50% of contact time, introduce the mathematical and algorithmic foundations of estimation, modelling and control of (mobile) robots. Example problems with interactive solutions are interleaved with the taught content. In the practical sessions, which make up the other 50% of contact time, students learn to apply the material to an autonomous multicopter drone. Student groups implement software in modern C++, integrated with the Robot Operating System (ROS) middleware, so that the drone navigates autonomously with on-board vision-based state estimation and control and can complete a simple delivery task reliably, accurately, and quickly.
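
To give a flavour of the practical work, the minimal sketch below shows a ROS (ROS 1, roscpp) node in C++ that subscribes to a pose estimate and publishes a velocity command. The topic names (/estimator/pose, /drone/cmd_vel) and node name are illustrative assumptions, not the module's actual interfaces.

    // Minimal sketch of a roscpp node: subscribe to a pose estimate and
    // publish a velocity command. Topic and node names are hypothetical.
    #include <ros/ros.h>
    #include <geometry_msgs/PoseStamped.h>
    #include <geometry_msgs/Twist.h>

    ros::Publisher cmd_pub;

    // Called whenever a new pose estimate arrives; a real controller
    // would compute the command from the estimated state in `pose`.
    void poseCallback(const geometry_msgs::PoseStamped& pose) {
      geometry_msgs::Twist cmd;  // zero command as a placeholder
      cmd_pub.publish(cmd);
    }

    int main(int argc, char** argv) {
      ros::init(argc, argv, "controller_node");
      ros::NodeHandle nh;
      cmd_pub = nh.advertise<geometry_msgs::Twist>("/drone/cmd_vel", 1);
      ros::Subscriber pose_sub = nh.subscribe("/estimator/pose", 1, poseCallback);
      ros::spin();  // process callbacks until shutdown
      return 0;
    }

In the actual practicals, the estimation and control logic filling in the callback is what the student groups develop over the milestones.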

The Piazza Q&A web service will be used as an open online discussion forum for the module.

Assessments

The coursework, i.e. the drone practicals, counts for 20% of the module mark. It is broken down into several milestone submissions of code, which are tested automatically, as well as assessed demonstrations marked by the TAs. A final written exam, testing both practical and theoretical aspects, counts for the remaining 80% of the marks.
                                   
Concerning the practicals, students obtain feedback through our automatic code-testing infrastructure: on a regular basis via public tests, and, after submission of assessed milestones, via tests that are hidden from them but still provide useful feedback. Common pitfalls will be pointed out by e-mail and in the lectures.

Module leaders

Dr Stefan Leutenegger