Yiannis Demiris is a Professor in Human-Centred Robotics at Imperial, where he holds a Royal Academy of Engineering Chair in Emerging Technologies (Personal Assistive Robotics). He established the Personal Robotics Laboratory at Imperial in 2001. He holds a PhD in Intelligent Robotics and a BSc(Hons) in Artificial Intelligence and Computer Science, both from the University of Edinburgh. He has been a European Science Foundation (ESF) Junior Scientist Fellow, and a COE Fellow at the Agency of Industrial Science and Technology (AIST - ETL) of Japan. He is currently a Fellow of the Institution of Engineering and Technology (FIET), and a Fellow of the British Computer Society (FBCS).
Prof. Demiris' research interests include Artificial Intelligence, Machine Learning, and Intelligent Robotics, particularly intelligent perception, multi-scale user modelling, and adaptive cognitive control architectures, with the aim of determining how intelligent robots can generate personalised assistance that improves humans' physical, cognitive and social well-being. He participates in multiple national and international research projects at the interface between the theory and application of interactive learning systems. A major current line of research is centred on the Royal Academy of Engineering Chair in Emerging Technologies (Personal Assistive Robotics, 2019-2029), where he researches AI and machine learning methods for long-term modelling of combined human physical, physiological and cognitive states, enabling smart robots to assist humans in achieving their full potential. With multiple international partners (Berkeley, CSHL, Duke, Harvard, NYU and USC in the USA, and UCL & Essex in the UK) within the MURI USA/UK research project "Multimodal Brain Machine Interfaces for Enhancing Decision Accuracy", he is researching the use of human-centred information processing from cameras and wearable sensors to detect human states during driving, in order to intelligently control the visualisation of contextual information that assists the human driver. With multiple European partners (Aalto (Finland), CTU (Czech Republic), UPC (Catalunya), U Bordeaux (France)), he is also researching the use of multimodal information (visual, tactile, and text) to build rich representations of complex (e.g. articulated) objects to assist their intelligent manipulation by robots. With multiple UK collaborators (Heriot-Watt University and University of Manchester), he works through the "UKRI Trustworthy Autonomous Systems Node on Trust" project on machine learning algorithms for acquiring, maintaining and repairing human trust in robotic assistive systems.
Finally, within the InnovateUK "D-RISK" project, he collaborates with colleagues from Imperial's Transport Systems Laboratory, DRISK.AI, Claytex Services, Transport for London, and DG Cities, to research how computer vision and machine learning can be used to enhance the perception and handling of "edge-cases" in autonomous driving. In addition to the above, he collaborates widely with academic, clinical and commercial partners to bring AI and Intelligent Robotics to impactful applications.
His teaching includes (a) Human-Centred Robotics, a 4th-year research-led module on how to design and build an interactive robotic system and evaluate it with real users, and (b) Mobile Healthcare and Machine Learning, a 4th-year research-led module on how to design and implement mobile healthcare systems that personalise their behaviour using data about their users. He received the 2012 Rector's Award for Teaching Excellence, and the 2012 FoE award for excellence in Engineering Education. He has been nominated for the Student Academic Choice Awards (Innovation in Teaching) multiple times (2016, 2017 and 2019).
For up-to-date information, readers are advised to consult the Personal Robotics Laboratory webpage at http://www.imperial.ac.uk/PersonalRobotics, which is updated more frequently by his students and postdocs.
Chacon Quesada R, Demiris Y, 2022, Proactive Robot Assistance: Affordance-Aware Augmented Reality User Interfaces, IEEE Robotics & Automation Magazine, ISSN: 1070-9932, Pages: 2-14
Amadori P, Fischer T, Demiris Y, 2021, HammerDrive: A Task-Aware Driving Visual Attention Model, IEEE Transactions on Intelligent Transportation Systems, ISSN: 1524-9050, Pages: 1-13
et al., 2021, Haptic and Visual Feedback Assistance for Dual-Arm Robot Teleoperation in Surface Conditioning Tasks, IEEE Transactions on Haptics, Vol: 14, ISSN: 1939-1412, Pages: 44-56