Foundational Lectures

Fundamentals of Kinematics, Dynamics & Actuation

Prof Ferdinando Rodriguez y Baena & Prof Thrishantha Nanayakkara

Fundamentals of Interaction, Sensing & Control

Dr Ad Spiers, Dr Enrico Franco & Prof Etienne Burdet

Fundamentals of Robot Learning

Dr Edward Johns & Dr Antoine Cully

Schedule

Group Projects

Adaptive & Learning Lab

Learning to Walk: Developing Intelligent Locomotion for Quadruped Robots

This project uses learning algorithms to generate walking gaits for quadruped robots. Participants will apply techniques like reinforcement learning or evolutionary strategies to train a simulated robot to walk, then transfer these gaits to a physical robot, addressing the simulation-to-reality gap. Success will be measured by the robot’s walking stability, speed, and adaptability to various terrains in both simulation and real-world tests.
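The description above mentions reinforcement learning or evolutionary strategies. As an illustration of the latter, here is a minimal evolutionary-strategy loop in the style of OpenAI-ES; the `rollout_fitness` function is a toy quadratic standing in for a MuJoCo/MJX rollout, and all hyperparameters are illustrative placeholders, not project specifics:

```python
import numpy as np

rng = np.random.default_rng(0)

def rollout_fitness(params):
    # Placeholder for a MuJoCo/MJX rollout; a toy quadratic rewarding
    # parameters near a hypothetical "good gait" vector.
    target = np.array([1.0, 0.5, -0.3, 0.8])
    return -np.sum((params - target) ** 2)

def evolution_strategy(dim=4, pop=50, sigma=0.1, lr=0.01, iters=300):
    # OpenAI-ES-style update: perturb parameters, weight the noise by
    # normalised episode returns, and take a gradient step.
    theta = np.zeros(dim)
    for _ in range(iters):
        noise = rng.standard_normal((pop, dim))
        rewards = np.array([rollout_fitness(theta + sigma * n) for n in noise])
        adv = (rewards - rewards.mean()) / (rewards.std() + 1e-8)
        theta += lr / (pop * sigma) * noise.T @ adv
    return theta

best = evolution_strategy()
```

In the real project the fitness would come from simulated rollouts (stability, speed, terrain adaptability), and the same search structure parallelises naturally in JAX/MJX.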

Essential Skills

  • Programming skills in Python; Machine Learning experience.

Desirable Skills

  • Programming skills in JAX; experience with physics simulators (MuJoCo, preferably MJX); experience running experiments on physical systems.

Learning Outcomes

  • Hands-on experience deploying machine learning algorithms on robots
  • Running large-scale simulations and machine learning methods
  • Evaluating outputs and identifying bottlenecks in robotics applications

Biomechatronics Lab

Robotic Conversational AI Engagement for Cognitive Disorders

As the global population ages, there is an increasing need for accessible technologies that promote cognitive health and detect early signs of cognitive decline. This project will involve the design and implementation of verbal human-robot interaction and machine learning analysis of speech to explore associations with cognitive function.

Essential Skills

Collectively as a team: Python, machine learning and NLP toolkits (e.g., sklearn, nltk).

Desirable Skills

Knowledge of socially assistive robotics, familiarity with LLMs, prompt engineering and API integration. Interest in robotics, AI, human-robot interaction, cognitive health, neurodegenerative diseases.

Learning Outcomes

  • Design LLM-powered interaction for verbal human-robot engagement in a structured cognitive task (based on clinically validated cognitive tools)
  • Program and integrate robot idle behaviours using a 3D-printed social robot for multimodal human-robot engagement
  • Implement a machine learning pipeline to perform predictive analysis of cognitive state from speech and language features
  • Discuss the potential of conversational AI for accessible cognitive support and early screening in real-world home settings
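The predictive-analysis outcome above could take the shape of a standard scikit-learn pipeline (named in the essential skills). The sketch below uses synthetic stand-ins for speech/language features and a hypothetical binary screening label; feature names, ranges, and the classifier choice are illustrative, not the project's actual design:

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-participant speech/language features,
# e.g. speech rate, pause ratio, type-token ratio, utterance length.
n = 200
X = rng.normal(size=(n, 4))
# Hypothetical label: 1 = flagged for follow-up screening.
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

pipe = Pipeline([
    ("scale", StandardScaler()),     # normalise feature ranges
    ("clf", LogisticRegression()),   # simple, interpretable baseline
])
scores = cross_val_score(pipe, X, y, cv=5)  # 5-fold cross-validated accuracy
```

Cross-validation matters here because per-participant datasets in cognitive-health studies are typically small, and an interpretable baseline makes the learned feature weights discussable with clinicians.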

Human Robotics Group

Understanding Artificial Haptic Sensing

Essential Skills

  • Basic proficiency in Python programming
  • Understanding of linear algebra
  • Interest in haptic perception and tactile sensing

Desirable Skills and Experience

  • Fundamentals of computer vision
  • Introductory knowledge of machine learning
  • Signal processing techniques

Learning Outcomes

  • Hands-on experience with artificial haptic sensors and multimodal tactile data
  • Understanding of the importance of experimental design in haptic sensing
  • Practical skills in signal processing and interpreting tactile responses
  • Foundation to design and develop artificial haptic perception setups

Insect Sensorimotor Control Lab

Adaptive feedforward control of a drone

Like insects, drones are expected to carry loads of different sizes and weights, cope with damage, and tolerate a range of other changes to their dynamics. A hypothesised tactic employed by insects is to cope with such changes to the system dynamics using an adaptive feedforward control scheme. This project will involve the development, simulation, and implementation of an adaptive control method on a Parrot Mambo minidrone to improve reference tracking.
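One common form of adaptive feedforward control is a gradient-based (MIT-rule-style) update of a feedforward gain. The project itself targets Simulink/MATLAB on the drone; as a language-agnostic illustration, here is a toy discrete-time sketch in Python where the plant, gains, and reference are all illustrative placeholders:

```python
import numpy as np

# Toy scalar plant with an unknown input gain b (e.g. altered by a
# payload change): x[t+1] = a*x[t] + b*u[t].
a, b = 0.9, 2.0        # true plant; b is unknown to the controller
k = 0.0                # adaptive feedforward gain
gamma = 0.02           # adaptation rate

x = 0.0
errs = []
for t in range(3000):
    r = np.sin(0.01 * t)        # slowly varying reference
    u = k * r                   # pure feedforward control
    x = a * x + b * u           # plant update
    e = x - r                   # tracking error
    # MIT-rule-style gradient step: descend the squared tracking error
    # with respect to k (the sign of b is assumed known)
    k -= gamma * e * r
    errs.append(abs(e))
```

The gain k converges toward the value that inverts the unknown plant gain, so tracking improves without the controller ever measuring b directly; the same idea scales to vector gains on a multirotor.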

Essential Skills

  • Fundamental understanding of control systems; knowledge and experience of Simulink and MATLAB.

Learning Outcomes

  • Practical deployment of a custom control system onto a drone.
  • Improved skills in Simulink for control system design.
  • Testing a novel adaptive control method.
  • Using a hardware-to-Simulink-to-MATLAB data recording system to analyse controller performance.

Insect Sensorimotor Control Lab

Biomimetic collision avoidance on a two-wheeled robot using vision

Essential Skills

  • Prototyping (soldering, mechanical assembling/testing)
  • Programming in Python, or preferably C/C++, under Linux
  • Image processing
  • Mathematical modelling
  • Simple robotic control

Learning Outcomes

  • Assembling a two-wheeled robot including a vision sensor (camera), processors and motors.
  • Improved Python programming skills and experience testing robot dynamics.
  • Implementation of a custom-built control system on a 2-wheeled robot.
  • Development of a closed-loop control algorithm for vision-based collision avoidance.
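A classic biomimetic approach to vision-based collision avoidance, inspired by looming-sensitive neurons in insects (e.g. the locust LGMD), is to estimate time-to-contact from the expansion rate of an obstacle's image. Whether this project uses exactly that scheme is an assumption; the sketch below shows the core computation on synthetic apparent-width data:

```python
import numpy as np

def time_to_contact(w_prev, w_curr, dt):
    # tau ≈ w / (dw/dt): time-to-contact estimated purely from the
    # apparent size of the obstacle in two successive frames.
    dw = (w_curr - w_prev) / dt
    return np.inf if dw <= 0 else w_curr / dw

# Synthetic approach at constant speed: apparent width grows as 1/(T - t)
dt, T = 0.05, 2.0
ts = np.arange(0.0, 1.5, dt)
widths = 20.0 / (T - ts)    # pixels (synthetic camera measurements)

taus = [time_to_contact(widths[i - 1], widths[i], dt)
        for i in range(1, len(widths))]
# Command an avoidance turn once estimated time-to-contact drops below 0.75 s
turn_step = next(i for i, tau in enumerate(taus) if tau < 0.75)
```

On the robot, the width sequence would come from image processing (e.g. tracking a segmented obstacle's bounding box), and the turn command would feed the wheel controllers, closing the loop.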

Manipulation and Touch Lab

Human Inspired Robot Reaching Motions

For robots to be integrated into human environments, their motion should be predictable and human-like, even if the robot itself is non-anthropomorphic. 
In this project, you will use the ROAG dataset of 2.5K human reaching motions to develop a reaching controller for a desktop robot manipulator (a Robotis Manipulator X).

Essential Skills

Some experience in machine learning and MATLAB (though the ROAG dataset can be ported to another language if needed)

Learning Outcomes

  • Understanding human reaching motion
  • Creating generalisable motion controllers
  • Programming robotic manipulator hardware

Mechatronics in Medicine

Soft inchworm robot for endoscopy

The project aims to design and manufacture a soft inchworm robot intended to navigate a surgical phantom of the colon.

Essential Skills

Collectively as a team:

  • MATLAB; programming in C; microcontrollers such as Arduino; mechanical design in SolidWorks

Desirable Skills

  • Experience with soft robotics; Hands on experience with mechatronic systems

Learning Outcomes

  • Design and manufacture of a soft robot
  • Integrate sensing and actuation
  • Test in a surgical phantom

Morph Lab

Knee exosuit with angle dependent damping

Recent evidence shows that angle-dependent impedance at the knee joint helps to reduce collision forces during walking, as well as their variability. For instance, a bicycle rider comes off the seat when the terrain becomes bumpy, and riding with bent knees helps stabilise the head.

In this project, you will design a wearable knee exosuit that uses elastic material to provide an angle-dependent impedance. You will test it on a pivot joint connecting two rigid links, comparing the accelerations in one link due to perturbations given at the other, with and without the angle-dependent impedance.

Essential Skills

  • Hardware design, fabrication, and testing; Sensor interfacing and filtering.

Desirable Skills

  • Optimal state estimation and closed loop control.

Learning Outcomes

  • You will learn how to articulate a testable hypothesis in robotics.
  • You will learn how to design an experimental method and implement it.
  • You will learn how to analyse experimental data and do design iterations to improve the design.
  • Optionally, you will learn how to extend a prototype to a wearable solution considering user centric form and function criteria.

Personal Robotics Lab

Robot Social Navigation: moving safely among people

As robots become more common, they must navigate safely around people, for example as guides in airports or smart wheelchairs in crowded areas. Simply stopping whenever someone enters sensor range leads to unnatural, inefficient behaviour that doesn't build human trust.

In this project, you'll implement a social navigation algorithm for a wheeled or legged robot. Using onboard sensors (cameras/lidars), the robot will detect and track people, predict their paths, and plan its own trajectory accordingly. You'll explore various planning methods, such as social potential fields, and test different algorithms on the robot.
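The social potential fields mentioned above combine an attractive pull toward the goal with repulsive pushes away from predicted person positions. This is a minimal 2D sketch of that idea; the gains, influence radius, and scenario are illustrative, not tuned values for any particular robot:

```python
import numpy as np

def social_force(robot, goal, people, k_att=1.0, k_rep=2.0, influence=2.0):
    # Attractive pull toward the goal plus a repulsive push away from
    # each (predicted) person position within the influence radius.
    force = k_att * (goal - robot)
    for p in people:
        diff = robot - p
        d = np.linalg.norm(diff)
        if 0 < d < influence:
            # Standard potential-field repulsion: grows sharply as the
            # person gets closer, vanishes at the influence boundary.
            force += k_rep * (1.0 / d - 1.0 / influence) * diff / d**3
    return force

robot = np.array([0.0, 0.0])
goal = np.array([5.0, 0.0])
people = [np.array([2.5, 0.1])]   # predicted person position near the path

closest = np.inf
for _ in range(400):
    step = social_force(robot, goal, people)
    robot = robot + 0.02 * step / max(np.linalg.norm(step), 1.0)
    closest = min(closest, np.linalg.norm(robot - people[0]))
```

The robot detours around the person rather than stopping, which is exactly the behaviour the project contrasts with naive "freeze on detection" navigation; on hardware, the person positions would come from the camera/lidar tracker and predictor.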

Essential Skills

Strong software engineering skills (e.g. Python/C/C++/Docker under Linux); participants are advised to install Linux and practice developing software.

Desirable Skills

Experience with ROS and computer vision/robotics tools such as OpenCV, YOLO, and MoveIt; open tutorials are available for advanced study.

Learning Outcomes

  • Using onboard sensors to detect people
  • Using people's location and trajectory information to predict their future positions
  • Planning a safe, human-acceptable trajectory
  • Performing robot experiments that involve people

Robot Learning Lab

Reinforcement Learning for Object Grasping

Robot Learning algorithms, such as Reinforcement Learning, enable robots to learn tasks by collecting data and training control policies on that data. In this project, you will study how Reinforcement Learning can be used to train a robot to grasp objects.

First, you will use a robot simulator to train a basic Reinforcement Learning algorithm across a range of objects, with the policy taking as input an image observed from a camera, and predicting as output the target grasp pose. Then, you will transfer this policy from simulation to reality using domain randomisation techniques, and evaluate the performance on one of the robots in our lab.
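Domain randomisation, mentioned above for the sim-to-real transfer step, amounts to sampling fresh simulator parameters every training episode so the policy cannot overfit to any single simulated world. A minimal sketch, where every parameter name and range is an illustrative placeholder rather than a real simulator API:

```python
import random

def randomise_domain(rng):
    # Sample one episode's simulator parameters. All names and ranges
    # are hypothetical placeholders, not tuned or real API values.
    return {
        "object_mass": rng.uniform(0.05, 0.5),       # kg
        "friction": rng.uniform(0.3, 1.2),
        "light_intensity": rng.uniform(0.4, 1.6),
        "camera_jitter": [rng.gauss(0.0, 0.01) for _ in range(3)],  # m
        "texture_id": rng.randrange(100),
    }

rng = random.Random(0)
episodes = [randomise_domain(rng) for _ in range(1000)]
```

A policy trained over this distribution of visuals and dynamics treats the real robot as just one more sample from the distribution, which is the core argument behind domain randomisation.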

Essential Skills

Python; Torch

Desirable Skills

Programming robot arms with basic end-effector position control. Knowledge of basic reinforcement learning concepts, such as states, actions, and rewards. Knowledge of basic computer vision algorithms, such as object detection and object segmentation.

Learning Outcomes

  • Training neural networks in Torch
  • Programming robot arms in Python
  • Developing reinforcement learning, computer vision, and sim-to-real algorithms
  • Evaluating robot control policies on real-world tasks

Soft Robotic Transducers Lab

Beating pulse simulator using soft Electrohydraulic actuation

Students taking on this project should be comfortable with flexible thin-film fabrication and patterning, and with programming an XYZ motion system (e.g. a 3D printer) to produce complex channel geometries. Experience with embedded systems (e.g. Simulink or Arduino firmware) for real-time pulse shaping and data acquisition, together with Python or MATLAB for signal processing, is highly advantageous. A background across mechanical, electrical, and biomedical principles is also helpful for integrating and validating the complete system.

Learning Outcomes

  • Understand electrohydraulic actuation systems, linking electrostatic actuation, structural compliance and fluid movement.
  • Gain hands-on experience fabricating soft electrohydraulic actuators with an XYZ motion system, applying G-code programming and conductive patterning.
  • Develop and characterise pulse waves, quantifying frequency, amplitude, and waveform against physiological targets.
  • Demonstrate effective teamwork, collaborating respectfully and responsibly to solve problems and complete technical tasks.

Soft Robotics and Applied Control (SRAC) Lab

Small-scale mobile soft robot

This project aims to develop a small-scale mobile soft robot using inflatable balloons and miniature pumps.

The robot is intended for exploration of rough terrain.

Essential Skills

Collectively as a team:

  • MATLAB; programming in C; microcontrollers such as Arduino; familiarity with CAD software such as SolidWorks

Desirable Skills

  • Experience with soft robotics

Learning Outcomes

  • Ability to design and manufacture a fully-functional soft robotic system.
  • Hands-on experience in integrating sensing and actuation.
  • Ability to design and implement a simple control algorithm.

Transport Systems & Logistics Lab

Enhanced Connected and Autonomous Vehicle Coordination for Indoor Navigation

This project develops a coordination framework to enhance safety and efficiency in Connected and Autonomous Vehicles (CAVs) using real-time communication and cooperative control. DuckieBots—sensor-equipped autonomous robots—will serve as the test platform, building on TSL’s control framework.

Students will create software to process sensor data, detect obstacles, plan safe paths, and coordinate movement. At intersections, DuckieBots will share arrival times and assign priorities to avoid collisions and improve flow. Performance will be evaluated through motion data to guide future improvements in cooperative driving.
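The intersection step described above (sharing arrival times and assigning priorities) can be sketched as a first-come-first-served slot scheduler. Names, the headway value, and the example ETAs below are illustrative, not part of TSL's actual framework:

```python
from dataclasses import dataclass

@dataclass
class Arrival:
    bot_id: str
    eta: float        # estimated arrival time at the intersection (s)

def assign_priorities(arrivals, headway=1.5):
    # First-come-first-served crossing slots: order bots by ETA and
    # delay any bot whose slot would violate the minimum headway.
    schedule = {}
    slot = None
    for a in sorted(arrivals, key=lambda a: a.eta):
        slot = a.eta if slot is None else max(a.eta, slot + headway)
        schedule[a.bot_id] = slot
    return schedule

times = assign_priorities([
    Arrival("duckie1", 3.0),
    Arrival("duckie2", 3.4),
    Arrival("duckie3", 7.0),
])
# duckie2 arrives 0.4 s behind duckie1, so it is held until the
# 1.5 s headway has elapsed; duckie3 is far enough back to be unaffected.
```

In the DuckieBot implementation, the ETAs would be broadcast over the communication layer and the resulting slots fed to each bot's motion controller, with the motion logs used for the performance evaluation mentioned above.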

Essential Skills

Python, ROS

Desirable Skills

Reinforcement learning, Model Predictive Control, computer vision, sensor fusion

Learning Outcomes

  • Hands-on experience with miniature autonomous vehicles
  • Development of planning and control algorithms
  • Processing and fusing sensor data