Imperial College London

Professor Yiannis Demiris

Faculty of Engineering, Department of Electrical and Electronic Engineering

Professor of Human-Centred Robotics, Head of ISN



+44 (0)20 7594 6300 · y.demiris · Website




1014, Electrical Engineering, South Kensington Campus






BibTeX format

@inproceedings{sarabia2015towards,
  author = {Sarabia, M and Lee, K and Demiris, Y},
  doi = {10.1109/ROMAN.2015.7333649},
  pages = {715--721},
  publisher = {IEEE},
  title = {Towards a Synchronised Grammars Framework for Adaptive Musical Human-Robot Collaboration},
  url = {},
  year = {2015}
}

RIS format (EndNote, RefMan)

TY - CONF
AB - We present an adaptive musical collaboration framework for interaction between a human and a robot. The aim of our work is to develop a system that receives feedback from the user in real time and learns the music progression style of the user over time. To tackle this problem, we represent a song as a hierarchically structured sequence of music primitives. By exploiting the sequential constraints of these primitives inferred from the structural information combined with user feedback, we show that a robot can play music in accordance with the user's anticipated actions. We use Stochastic Context-Free Grammars augmented with the knowledge of the learnt user's preferences. We provide synthetic experiments as well as a pilot study with a Baxter robot and a tangible music table. The synthetic results show the synchronisation and adaptivity features of our framework, and the pilot study suggests these are applicable to creating an effective musical collaboration experience.
AU - Sarabia,M
AU - Lee,K
AU - Demiris,Y
DO - 10.1109/ROMAN.2015.7333649
EP - 721
PY - 2015///
SP - 715
TI - Towards a Synchronised Grammars Framework for Adaptive Musical Human-Robot Collaboration
UR -
ER -