We use perceptual methods, AI, and frugal robotics innovation to deliver transformative diagnostic and treatment solutions.

Head of Group

Dr George Mylonas

B415B Bessemer Building
South Kensington Campus

+44 (0)20 3312 5145

YouTube ⇒ HARMS Lab

What we do

The HARMS lab leverages perceptually enabled methodologies, artificial intelligence, and frugal innovation in robotics (such as soft surgical robots) to deliver transformative solutions for diagnosis and treatment. Our research is driven by both problem-solving and curiosity, aiming to build a comprehensive understanding of the actions, interactions, and reactions occurring in the operating room. We focus on using robotic technologies to facilitate procedures that are not yet widely adopted, particularly in endoluminal surgery, such as advanced treatments for gastrointestinal cancer.

Meet the team

Publications

  • Conference paper
    Fathi J, Vrielink TJCO, Runciman MS, Mylonas GP et al., 2019, A Deployable Soft Robotic Arm with Stiffness Modulation for Assistive Living Applications, International Conference on Robotics and Automation (ICRA), Publisher: IEEE, Pages: 1479-1485, ISSN: 1050-4729
  • Conference paper
    Patel N, Kogkas A, Ben Glover AD, Mylonas G et al., 2019, Eye gaze-controlled robotic flexible endoscopy: a feasibility study, Annual Meeting of the British Society of Gastroenterology (BSG), Publisher: BMJ Publishing Group, Pages: A38-A39, ISSN: 0017-5749
  • Conference paper
    Wang M-Y, Kogkas AA, Darzi A, Mylonas GP et al., 2019, Free-view, 3D gaze-guided, assistive robotic system for activities of daily living, 25th IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Publisher: IEEE, Pages: 2355-2361, ISSN: 2153-0858

    Patients suffering from quadriplegia have limited body motion, which prevents them from performing daily activities. We have developed an assistive robotic system with an intuitive free-view gaze interface. The user's point of regard is estimated in 3D space while allowing free head movement, and is combined with object recognition and trajectory planning. This framework allows the user to interact with objects using fixations. Two operational modes have been implemented to cater for different eventualities. The automatic mode performs a pre-defined task associated with a gaze-selected object, while the manual mode allows gaze control of the robot's end-effector position in the user's frame of reference. User studies reported effortless operation in automatic mode. A manual pick-and-place task achieved a success rate of 100% on the users' first attempt.
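
    Purely to make the two operational modes above concrete, here is a minimal, self-contained Python sketch of the dispatch between them. Every class, gain, and coordinate is a hypothetical placeholder, not the authors' implementation.

```python
# Illustrative sketch of the automatic vs. manual gaze-control modes described above.
# All names (SceneObject, AssistiveArm, ...) and numbers are made-up placeholders.
from dataclasses import dataclass
import numpy as np

@dataclass
class SceneObject:
    name: str
    position: np.ndarray      # 3D position from object recognition
    predefined_task: str      # e.g. "bring cup to mouth"

class AssistiveArm:
    def __init__(self):
        self.end_effector = np.zeros(3)

    def execute_task(self, task: str):
        print(f"executing predefined task: {task}")

    def step_towards(self, target: np.ndarray, gain: float = 0.5):
        # Simple proportional step of the end-effector towards the 3D fixation point.
        self.end_effector += gain * (target - self.end_effector)

def automatic_mode(gaze_point: np.ndarray, objects: list[SceneObject], arm: AssistiveArm):
    # Trigger the predefined task of the recognised object closest to the point of regard.
    target = min(objects, key=lambda o: np.linalg.norm(o.position - gaze_point))
    arm.execute_task(target.predefined_task)

def manual_mode(gaze_point: np.ndarray, arm: AssistiveArm):
    # Gaze directly commands the end-effector position in the user's frame of reference.
    arm.step_towards(gaze_point)

if __name__ == "__main__":
    cup = SceneObject("cup", np.array([0.4, 0.1, 0.2]), "bring cup to mouth")
    arm = AssistiveArm()
    automatic_mode(np.array([0.41, 0.12, 0.19]), [cup], arm)
    manual_mode(np.array([0.3, 0.0, 0.3]), arm)
```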

  • Conference paper
    Vrielink TJCO, Puyal JG-B, Kogkas A, Darzi A, Mylonas G et al., 2019, Intuitive gaze-control of a robotized flexible endoscope, 25th IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Publisher: IEEE, Pages: 1776-1782, ISSN: 2153-0858

    Flexible endoscopy is a routinely performed procedure that has predominantly remained unchanged for decades despite its many challenges. This paper introduces a novel, more intuitive and ergonomic platform that can be used with any flexible endoscope, allowing easier navigation and manipulation. A standard endoscope is robotized and a gaze control system based on eye-tracking is developed and implemented, allowing hands-free manipulation. The system characteristics and step response have been evaluated using visual servoing. Further, the robotized system has been compared with a manually controlled endoscope during a user study. The users (n=11) showed a preference for the gaze-controlled endoscope and a lower task load when the task was performed with gaze control. In addition, gaze control was associated with a higher success rate and a shorter time to perform the task. The results presented validate the system's technical performance and demonstrate the intuitiveness of hands-free gaze control in flexible endoscopy.
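
    As an illustration of how a gaze point on the endoscopic image might drive a robotized bending section, the short sketch below implements a simple proportional servoing rule. The image size, gain, and interface are assumptions, not the system described in the paper.

```python
# Illustrative proportional gaze-servoing loop for a robotized endoscope tip.
# The gains, axis conventions, and interfaces are assumptions, not the paper's code.
import numpy as np

IMAGE_CENTRE = np.array([320.0, 240.0])   # assumed 640x480 endoscopic image
GAIN = 0.002                               # pixel error -> bending-rate gain (made up)

def bending_rates(gaze_px: np.ndarray) -> np.ndarray:
    """Map the gaze point on the image to left/right and up/down bending rates
    so the endoscope steers until the gazed-at point reaches the image centre."""
    error_px = gaze_px - IMAGE_CENTRE
    return GAIN * error_px                 # [d(left/right)/dt, d(up/down)/dt]

if __name__ == "__main__":
    # User fixates on a point in the upper-right part of the image.
    print(bending_rates(np.array([500.0, 100.0])))
```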

  • Conference paper
    Zhao M, Oude Vrielink J, Kogkas A, Mylonas G, Elson D et al., 2019, Prototype Designs of a Cable-driven Parallel Robot for Transoral Laser Surgery, Hamlyn Symposium on Medical Robotics
  • Book chapter
    Kogkas A, Ezzat A, Thakkar R, Darzi A, Mylonas G et al., 2019, Free-View, 3D Gaze-Guided Robotic Scrub Nurse, Editors: Shen, Liu, Peters, Staib, Essert, Zhou, Yap, Khan, Publisher: Springer International Publishing AG, Pages: 164-172, ISBN: 978-3-030-32253-3
  • Conference paper
    Achanccaray D, Mylonas G, Andreu-Perez J, 2019, An Implicit Brain Computer Interface Supported by Gaze Monitoring for Virtual Therapy, IEEE International Conference on Systems, Man and Cybernetics (SMC), Publisher: IEEE, Pages: 2829-2832, ISSN: 1062-922X
  • Conference paper
    Vrielink TJCO, Chao M, Darzi A, Mylonas GP et al., 2018, ESD CYCLOPS: A new robotic surgical system for GI surgery, IEEE International Conference on Robotics and Automation (ICRA), Publisher: IEEE Computer Society, Pages: 150-157, ISSN: 1050-4729

    Gastrointestinal (GI) cancers account for 1.5 million deaths worldwide. Endoscopic Submucosal Dissection (ESD) is an advanced therapeutic endoscopy technique with superior clinical outcome due to the minimally invasive and en bloc removal of tumours. In the western world, ESD is seldom carried out due to its complex and challenging nature. Various surgical systems are being developed to make this therapy accessible; however, these solutions have shown limited operational workspace, limited dexterity, or low force exertion capabilities. This paper presents the ESD CYCLOPS system, a bimanual surgical robotic attachment that can be mounted at the end of any flexible endoscope. The system is able to achieve forces of up to 46N, and showed a mean error of 0.217mm during an elliptical tracing task. The workspace and instrument dexterity are demonstrated in pre-clinical ex vivo trials, in which ESD is successfully performed by a GI surgeon. The system is currently undergoing pre-clinical in vivo validation.
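
    The 0.217 mm figure above is a mean error from an elliptical tracing task. The sketch below shows one plausible way such a mean error could be computed from tracked tip positions against a reference ellipse; the dimensions, sampling density, and nearest-point approximation are assumptions rather than the paper's protocol.

```python
# Illustration only: mean tracing error against a reference ellipse, in the spirit
# of the figure quoted above. All parameters below are made-up assumptions.
import numpy as np

def mean_tracing_error(tip_xy_mm: np.ndarray, a_mm: float, b_mm: float, n_ref: int = 3600) -> float:
    """Mean distance (mm) from measured tip positions to a densely sampled ellipse
    with semi-axes a_mm and b_mm centred at the origin."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_ref, endpoint=False)
    ref = np.stack([a_mm * np.cos(theta), b_mm * np.sin(theta)], axis=1)   # (n_ref, 2)
    # For each measured point, take the distance to the nearest reference sample.
    dists = np.linalg.norm(tip_xy_mm[:, None, :] - ref[None, :, :], axis=2).min(axis=1)
    return float(dists.mean())

if __name__ == "__main__":
    # Noisy samples around a 20 mm x 10 mm ellipse stand in for tracked tip data.
    t = np.linspace(0.0, 2.0 * np.pi, 200)
    noisy = np.stack([20.0 * np.cos(t), 10.0 * np.sin(t)], axis=1) + np.random.normal(0, 0.2, (200, 2))
    print(f"mean tracing error: {mean_tracing_error(noisy, 20.0, 10.0):.3f} mm")
```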

  • Conference paper
    Pittiglio G, Kogkas A, Vrielink JO, Mylonas G et al., 2018, Dynamic Control of Cable Driven Parallel Robots with Unknown Cable Stiffness: a Joint Space Approach, IEEE International Conference on Robotics and Automation (ICRA), Publisher: IEEE Computer Society, Pages: 948-955, ISSN: 1050-4729
  • Conference paper
    Runciman M, Darzi A, Mylonas G, 2018, Deployable disposable self-propelling and variable stiffness devices for minimally invasive surgery, Conference on New Technologies for Computer/Robot Assisted Surgery

This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.

Contact Us

General enquiries

Facility enquiries


The Hamlyn Centre
Bessemer Building
South Kensington Campus
Imperial College
London, SW7 2AZ