Imperial College London

Dr George Mylonas

Faculty of Medicine, Department of Surgery & Cancer

Lecturer in Robotics and Technology in Cancer
 
 
 

Contact

 

+44 (0)20 3312 5145 · george.mylonas

 
 

Location

 

Room 5, Paterson Wing, St Mary's Campus



 

Publications


73 results found

Ezzat A, Thakkar R, Kogkas A, Mylonas G et al., 2019, Perceptions of surgeons and scrub nurses towards a novel eye-tracking based robotic scrub nurse platform, International Surgical Congress of the Association-of-Surgeons-of-Great-Britain-and-Ireland (ASGBI), Publisher: WILEY, Pages: 81-82, ISSN: 0007-1323

Conference paper

Runciman M, Darzi A, Mylonas G, 2019, Soft robotics in minimally invasive surgery, Soft Robotics, Vol: 6, Pages: 423-443, ISSN: 2169-5172

Soft robotic devices have desirable traits for applications in minimally invasive surgery (MIS) but many interdisciplinary challenges remain unsolved. To understand current technologies, we carried out a keyword search using the Web of Science and Scopus databases, applied inclusion and exclusion criteria, and compared several characteristics of the soft robotic devices for MIS in the resulting articles. There was low diversity in the device designs and a wide-ranging level of detail regarding their capabilities. We propose a standardised comparison methodology to characterise soft robotics for various MIS applications, which will aid designers producing the next generation of devices.

Journal article

Fathi J, Vrielink TJCO, Runciman MS, Mylonas GP et al., 2019, A deployable soft robotic arm with stiffness modulation for assistive living applications, Pages: 1479-1485, ISSN: 1050-4729

© 2019 IEEE. This paper presents a three-tendon actuated continuum robot with an origami backbone to assist elderly and physically impaired individuals in performing activities of daily living. The proposed design is an inherently safe and cost-effective alternative to current assistive robots. The origami backbone, based on a variation of the Yoshimura pattern, provides controlled deployment of the robot and enables length variation (15-56 cm) in order to increase the reachable workspace. A pneumatic stiffness mechanism was implemented, increasing the weight-bearing capability of the continuum robot to 500 g. This new stiffness modulation approach was assessed with the use of several testing rigs. Additionally, the robot is joypad-controlled and easily transportable owing to its high packing efficiency of 73% and low weight of 1.3 kg for the main body (including the actuation system). To demonstrate usability, the robot was successfully tested at a simulated kitchen terminal and also performed pick-and-place tasks.

Conference paper

Avery J, Runciman M, Darzi A, Mylonas GP et al., 2019, Shape sensing of variable stiffness soft robots using electrical impedance tomography

© 2019 IEEE. Soft robotic systems offer benefits over traditional rigid systems through reduced contact trauma with soft tissues and by enabling access through tortuous paths in minimally invasive surgery. However, the inherent deformability of soft robots places both a greater onus on accurate modelling of their shape and greater challenges in realising intraoperative shape sensing. Herein we present a proprioceptive (self-sensing) soft actuator with an electrically conductive working fluid. Electrical impedance measurements from up to six electrodes enabled tomographic reconstructions using Electrical Impedance Tomography (EIT). A new Frequency Division Multiplexed (FDM) EIT system was developed, capable of measurements with 66 dB SNR and 20 ms temporal resolution. The concept was examined in two two-degree-of-freedom designs: a hydraulic hinged actuator and a pneumatic finger actuator with hydraulic beams. Both cases demonstrated that impedance measurements could be used to infer shape changes, and EIT images reconstructed during actuation showed distinct patterns with respect to each degree of freedom (DOF). Whilst some mechanical hysteresis was observed, the repeatability of the measurements and resultant images was high. The results show the potential of FDM-EIT as a low-cost, low-profile shape sensor in soft robots.

Working paper
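
The abstract above turns boundary impedance measurements into an image of internal conductivity changes, from which shape is inferred. The following minimal Python sketch shows a linearised, time-difference EIT reconstruction with Tikhonov regularisation; the sensitivity matrix, pixel grid, noise level and regularisation weight are synthetic placeholders and do not model the paper's six-electrode, frequency-division-multiplexed system or actuator geometry.

```python
# Minimal sketch (not the paper's method or geometry): linearised time-difference
# EIT reconstruction with Tikhonov regularisation. All sizes and values are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_meas = 30   # number of boundary voltage measurements (assumed)
n_pix = 64    # number of conductivity pixels in the reconstruction grid (assumed)

# Sensitivity (Jacobian) matrix mapping conductivity change -> voltage change.
# In practice this is computed from a forward model of the actuator.
J = rng.normal(size=(n_meas, n_pix))

# Simulated "true" conductivity change caused by a deformation.
d_sigma_true = np.zeros(n_pix)
d_sigma_true[10:16] = 1.0

# Difference measurement: voltages during actuation minus a reference frame.
d_v = J @ d_sigma_true + 0.01 * rng.normal(size=n_meas)

# One-step Tikhonov-regularised solution: argmin ||J x - dv||^2 + lam ||x||^2.
lam = 1e-2
d_sigma_rec = np.linalg.solve(J.T @ J + lam * np.eye(n_pix), J.T @ d_v)

print("correlation with true change:",
      np.corrcoef(d_sigma_true, d_sigma_rec)[0, 1])
```

In a real device the sensitivity matrix would come from a forward model of the electrode and fluid-channel geometry, and the reconstructed conductivity change would then be interpreted as shape or per-DOF deflection.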

Vrielink TJCO, Puyal JG-B, Kogkas A, Darzi A, Mylonas G et al., 2019, Intuitive Gaze-Control of a Robotized Flexible Endoscope, 25th IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Publisher: IEEE, Pages: 1776-1782, ISSN: 2153-0858

Conference paper

Wang M-Y, Kogkas AA, Darzi A, Mylonas GP et al., 2019, Free-View, 3D Gaze-Guided, Assistive Robotic System for Activities of Daily Living, 25th IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Publisher: IEEE, Pages: 2355-2361, ISSN: 2153-0858

Conference paper

Kogkas A, Ezzat A, Thakkar R, Darzi A, Mylonas G et al., 2019, Free-View, 3D Gaze-Guided Robotic Scrub Nurse, Lecture Notes in Computer Science, Publisher: Springer International Publishing, Pages: 164-172, ISBN: 9783030322533

Book chapter

Pittiglio G, Kogkas A, Vrielink JO, Mylonas G et al., 2018, Dynamic Control of Cable Driven Parallel Robots with Unknown Cable Stiffness: a Joint Space Approach, IEEE International Conference on Robotics and Automation (ICRA), Publisher: IEEE COMPUTER SOC, Pages: 948-955, ISSN: 1050-4729

Conference paper

Vrielink TJCO, Chao M, Darzi A, Mylonas GP et al., 2018, ESD CYCLOPS: A new robotic surgical system for GI surgery, IEEE International Conference on Robotics and Automation (ICRA), Publisher: IEEE COMPUTER SOC, Pages: 150-157, ISSN: 1050-4729

Conference paper

Miyashita K, Oude Vrielink T, Mylonas G, 2018, A cable-driven parallel manipulator with force sensing capabilities for high-accuracy tissue endomicroscopy, International Journal of Computer Assisted Radiology and Surgery, Vol: 13, Pages: 659-669, ISSN: 1861-6429

PURPOSE: Endomicroscopy (EM) provides high-resolution, non-invasive histological tissue information and can be used for scanning large areas of tissue to assess cancerous and pre-cancerous lesions and their margins. However, current robotic solutions do not provide the accuracy and force sensitivity required to perform safe and accurate tissue scanning. METHODS: A new surgical instrument has been developed that uses a cable-driven parallel mechanism (CDPM) to manipulate an EM probe. End-effector forces are determined by measuring the tensions in each cable. As a result, the instrument allows a contact force to be applied to the tissue accurately, while at the same time offering high-resolution and highly repeatable probe movement. RESULTS: Force sensitivities of 0.2 and 0.6 N were found for the 1- and 2-DoF image acquisition methods, respectively. A back-stepping technique can be used when a higher force sensitivity is required for the acquisition of high-quality tissue images. This method was successful in acquiring images on ex vivo liver tissue. CONCLUSION: The proposed approach offers high force sensitivity and precise control, which is essential for robotic EM. The technical benefits of the current system can also be used for other surgical robotic applications, including safe autonomous control, haptic feedback and palpation.

Journal article
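
The force-sensing principle in the entry above is that, in static equilibrium, the force acting at the end-effector can be inferred from the measured cable tensions and the known cable geometry. The Python sketch below illustrates this for a simplified point-mass end-effector; the anchor positions, probe position and tension values are assumed for illustration and do not correspond to the instrument described in the paper.

```python
# Minimal sketch (illustrative values, point-mass simplification): estimating
# the net force on a cable-driven end-effector from measured cable tensions.
import numpy as np

# Cable anchor points on the instrument scaffold, in metres (assumed).
anchors = np.array([
    [ 0.05,  0.00,  0.10],
    [-0.03,  0.04,  0.10],
    [-0.03, -0.04,  0.10],
    [ 0.00,  0.00, -0.10],
])

# Current end-effector (probe) position, in metres (assumed).
p = np.array([0.0, 0.0, 0.02])

# Cable tensions measured by the tension sensors, in newtons (assumed).
tensions = np.array([1.2, 0.9, 0.9, 1.5])

# Unit vectors pointing from the end-effector towards each anchor.
u = anchors - p
u /= np.linalg.norm(u, axis=1, keepdims=True)

# Net cable force on the end-effector; in static equilibrium the tissue
# contact force balances this (gravity and friction neglected here).
f_cables = (tensions[:, None] * u).sum(axis=0)
print("net cable force on end-effector [N]:", f_cables)
```

The paper additionally describes a back-stepping technique for higher force sensitivity, which is not modelled in this sketch.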

Avila Rencoret FB, Mylonas G, Elson D, 2018, Robotic wide-field optical biopsy endoscopy, OSA Biophotonics Congress 2018, Publisher: OSA Publishing

This paper describes a novel robotic framework for wide-field optical biopsy endoscopy, characterizes in vitro its spatial and spectral resolution and real-time hyperspectral tissue classification, and demonstrates its feasibility on fresh porcine cadaveric colon.

Conference paper

Avila Rencoret FB, Mylonas GP, Elson D, Robotic Wide-Field Optical Biopsy Imaging For Flexible Endoscopy, 26th International Congress of the European Association for Endoscopic Surgery (EAES)

Conference paper

Elson D, Avila Rencoret F, Mylonas G, Robotic Wide-Field Optical Biopsy Imaging for Flexible Endoscopy (Gerhard Buess Technology Award), 26th Annual International EAES Congress

Conference paper

Zhao M, Oude Vrielink T, Elson D, Mylonas G et al., Endoscopic TORS-CYCLOPS: A Novel Cable-driven Parallel Robot for Transoral Laser Surgery, 26th Annual International EAES Congress

Conference paper

Ashraf H, Sodergren M, Merali N, Mylonas G, Singh H, Darzi A et al., 2017, Eye-tracking technology in medical education: A systematic review, Medical Teacher, Vol: 40, Pages: 62-69, ISSN: 0142-159X

Background: Eye-tracking technology is an established research tool within allied industries such as advertising, psychology and aerospace. This review aims to consolidate literature describing the evidence for use of eye-tracking as an adjunct to traditional teaching methods in medical education. Methods: A systematic literature review was conducted in line with STORIES guidelines. A search of EMBASE, OVID MEDLINE, PsycINFO, TRIP database, and Science Direct was conducted until January 2017. Studies describing the use of eye-tracking in the training, assessment, and feedback of clinicians were included in the review. Results: Thirty-three studies were included in the final qualitative synthesis. Three studies were based on the use of gaze training, three studies on the changes in gaze behavior during the learning curve, 17 studies on clinical assessment and six studies focused on the use of eye-tracking methodology as a feedback tool. The studies demonstrated feasibility and validity in the use of eye-tracking as a training and assessment method. Conclusions: Overall, eye-tracking methodology has contributed significantly to the training, assessment, and feedback practices used in the clinical setting. The technology provides reliable quantitative data, which can be interpreted to give an indication of clinical skill, provide training solutions and aid in feedback and reflection. This review provides a detailed summary of evidence relating to eye-tracking methodology and its uses as a training method, changes in visual gaze behavior during the learning curve, eye-tracking methodology for proficiency assessment and its uses as a feedback tool.

Journal article

Kogkas AA, Darzi A, Mylonas GP, 2017, Gaze-contingent perceptually enabled interactions in the operating theatre, International Journal of Computer Assisted Radiology and Surgery, Vol: 12, Pages: 1131-1140, ISSN: 1861-6410

PURPOSE: Improved surgical outcome and patient safety in the operating theatre are constant challenges. We hypothesise that a framework that collects and utilises information, especially perceptually enabled information, from multiple sources could help to meet these goals. This paper presents some core functionalities of a wider low-cost framework under development that allows perceptually enabled interaction within the surgical environment. METHODS: The synergy of wearable eye-tracking and advanced computer vision methodologies, such as SLAM, is exploited. As a demonstration of one of the framework's possible functionalities, an articulated collaborative robotic arm with a laser pointer is integrated, and the set-up is used to project the surgeon's fixation point in 3D space. RESULTS: The implementation is evaluated over 60 fixations on predefined targets, with distances between the subject and the targets of 92-212 cm and between the robot and the targets of 42-193 cm. The median overall system error is currently 3.98 cm. Its real-time potential is also highlighted. CONCLUSIONS: The work presented here represents an introduction and preliminary experimental validation of core functionalities of a larger framework under development. The proposed framework is geared towards a safer and more efficient surgical theatre.

Journal article
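
A core step of the framework in the entry above is turning a 2D fixation on the wearable eye-tracker's scene camera into a 3D point that a robot-mounted laser pointer can target. The Python sketch below shows the basic back-projection geometry under strong simplifying assumptions: the camera intrinsics, pose and flat-plane scene are invented for illustration, whereas the actual framework recovers the scene and camera pose with SLAM.

```python
# Minimal sketch (assumed intrinsics, pose and flat scene): back-projecting a
# 2D fixation from the eye-tracker's scene camera to a 3D point in the world.
import numpy as np

# Scene-camera intrinsics, in pixels (assumed).
fx, fy, cx, cy = 600.0, 600.0, 320.0, 240.0

# 2D fixation point reported by the eye tracker, in pixels (assumed).
u, v = 400.0, 260.0

# Back-project the fixation into a unit ray in the camera frame.
ray_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
ray_cam /= np.linalg.norm(ray_cam)

# Scene-camera pose in the world frame (would come from SLAM; assumed here:
# camera 1.5 m above the origin, looking straight down).
R_wc = np.diag([1.0, -1.0, -1.0])
t_wc = np.array([0.0, 0.0, 1.5])

ray_w = R_wc @ ray_cam
origin_w = t_wc

# Intersect the gaze ray with the plane z = 0 (e.g. an operating-table surface).
s = -origin_w[2] / ray_w[2]
fixation_w = origin_w + s * ray_w
print("3D fixation point in the world frame [m]:", fixation_w)
```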

Mylonas G, Patel N, Teare J, Darzi A et al., 2017, CYCLOPS: An endoscope attachment for Endoscopic Submucosal Dissection, SAGES 2017 Annual Meeting

Poster

Oude Vrielink TJC, Darzi A, Mylonas G, 2016, microCYCLOPS: A Robotic System for Microsurgical Applications, 6th Joint Workshop on New Technologies for Computer/Robot Assisted Surgery (CRAS 2016)

Conference paper

Avila-Rencoret F, Oude Vrielink T, Elson DS, Mylonas G et al., EndoSDR: Concurrent Endoscopic Screening, Diagnosis, and Removal of GI cancers (prize winner), Business Engineering and Surgical Technologies Innovation Symposium (BEST)

Conference paper

Avila Rencoret FB, Elson D, Mylonas G, A Robotic Hyperspectral Scanning Framework for Endoscopy, CRAS - Workshop on Computer/Robot Assisted Surgery

Gastrointestinal (GI) endoscopy is the gold-standard procedure for detection and treatment of dysplastic lesions and early stage GI cancers. Despite its proven effectiveness, its sensitivity remains suboptimal due to the subjective nature of the examination, which is substantially reliant on human-operator skills. For bowel cancer, colonoscopy can miss up to 22% of dysplastic lesions, with even higher miss rates for small (<5 mm diameter) and flat lesions. We propose a robotic hyperspectral (HS) scanning framework that aims to improve the sensitivity of GI endoscopy by automated scanning and real-time classification of wide tissue areas based on their HS features. A “hot-spot” map is generated to highlight dysplastic or cancerous lesions for further scrutiny or concurrent resection. The device works as an add-on accessory to any conventional endoscope, and to our knowledge, is the first of its kind. This paper focuses on characterising its optical resolution on rigid and deformable colon phantoms. We report for the first time 2D and 3D wide-area reconstruction of endoscopic HS data with sub-millimetre optical resolution. The current setup, compatible with the anatomical dimensions of the colon, could allow the identification of flat and small pre-cancerous lesions that are currently missed. The proposed framework will lay the foundations towards the next generation of augmented reality endoscopy while increasing its sensitivity and specificity.

Conference paper
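
The "hot-spot" map described in the abstract above comes from classifying each scanned tissue location by its hyperspectral signature. The Python sketch below shows the idea in its simplest form, nearest-reference-spectrum classification over a synthetic patch; the band count, reference spectra and noise are invented for illustration and are not the classifier or data used in the paper.

```python
# Minimal sketch (synthetic spectra, toy classifier): per-pixel classification
# of a hyperspectral tissue patch into a dysplasia "hot-spot" map.
import numpy as np

rng = np.random.default_rng(1)

n_bands = 16                                      # spectral bands (assumed)
healthy_ref = np.linspace(0.2, 0.8, n_bands)      # reference spectrum (synthetic)
dysplastic_ref = np.linspace(0.6, 0.3, n_bands)   # reference spectrum (synthetic)

# A small scanned tissue patch: 20 x 20 pixels, one spectrum per pixel.
patch = np.tile(healthy_ref, (20, 20, 1)) + 0.05 * rng.normal(size=(20, 20, n_bands))
patch[5:10, 5:10] = dysplastic_ref + 0.05 * rng.normal(size=(5, 5, n_bands))

# Nearest-reference-spectrum classification per pixel (spectral-angle mapping
# would be another common choice); True marks a suspected dysplastic pixel.
d_healthy = np.linalg.norm(patch - healthy_ref, axis=-1)
d_dysplastic = np.linalg.norm(patch - dysplastic_ref, axis=-1)
hot_spot_map = d_dysplastic < d_healthy
print("flagged pixels:", int(hot_spot_map.sum()), "of", hot_spot_map.size)
```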

Kogkas A, Darzi A, Mylonas GP, Gaze-Driven Human-Robot Interaction in the Operating Theatre, 6th Joint Workshop on New Technologies for Computer/Robot Assisted Surgery (CRAS 2016)

Conference paper

Khan DZ, Oude Vrielink TJC, Marcus H, Darzi A, Mylonas G et al., NeuroCYCLOPS: development and preclinical validation of a robotic platform for endoscopic neurosurgery, European Association of Neurosurgical Societies (EANS 2016), Publisher: European Association of Neurosurgical Societies

Conference paper

Oude Vrielink TJC, Khan DZ, Marcus H, Darzi A, Mylonas G et al., 2016, NeuroCYCLOPS: a novel system for endoscopic neurosurgery, London, The Hamlyn Symposium on Medical Robotics, Publisher: Imperial College London, Pages: 36-37

Conference paper

Kogkas AA, Sodergren M, Darzi A, Mylonas G et al., Macro- and micro-scale 3D gaze tracking in the operating theatre, The Hamlyn Symposium on Medical Robotics 2016, Publisher: Imperial College London, Pages: 100-101

Conference paper

Leff DR, James D, Orihuela-Espina F, Kwok KW, Sun L, Mylonas G, Athanasiou T, Darzi A, Yang GZ et al., The impact of expert visual guidance on trainee visual search strategy, visual attention and motor skills, Frontiers in Human Neuroscience, Vol: 9, ISSN: 1662-5161

Minimally invasive and robotic surgery changes the capacity for surgical mentors to guide their trainees with the control customary to open surgery. This neuroergonomic study aims to assess a "Collaborative Gaze Channel" (CGC), which detects trainer gaze-behaviour and displays the point of regard to the trainee. A randomised crossover study was conducted in which twenty subjects performed a simulated robotic surgical task necessitating collaboration either with verbal (control condition) or visual guidance with CGC (study condition). Trainee occipito-parietal (O-P) cortical function was assessed with optical topography (OT) and gaze-behaviour was evaluated using video-oculography. Performance during gaze-assistance was significantly superior [biopsy number (mean ± SD): control = 5.6 ± 1.8 vs. CGC = 6.6 ± 2.0; p < 0.05] and was associated with significantly lower O-P cortical activity [ΔHbO2 mMol × cm, median (IQR): control = 2.5 (12.0) vs. CGC = 0.63 (11.2), p < 0.001]. A random effect model confirmed the association between guidance mode and O-P excitation. Network cost and global efficiency were not significantly influenced by guidance mode. A gaze channel enhances performance, modulates visual search, and alleviates the burden in brain centres subserving visual attention, and does not induce changes in the trainee's O-P functional network observable with the current OT technique. The results imply that through visual guidance, attentional resources may be liberated, potentially improving the capability of trainees to attend to other safety-critical events during the procedure.

Journal article

Avila Rencoret FB, Elson DS, Mylonas G, 2015, Probe Deployment Device

Patent

Avila-Rencoret FB, Elson DS, Mylonas G, 2015, Towards a robotic-assisted cartography of the colon: a proof of concept, Publisher: IEEE COMPUTER SOC, Pages: 1757-1763

Conference paper

Paggetti G, Leff DR, Orihuela-Espina F, Mylonas G, Darzi A, Yang G-Z, Menegaz G et al., 2014, The role of the posterior parietal cortex in stereopsis and hand-eye coordination during motor task behaviours, Cognitive Processing, Vol: 16, Pages: 177-190, ISSN: 1612-4790

Journal article

Mylonas GP, Vitiello V, Cundy TP, Darzi A, Yang G-Z et al., 2014, CYCLOPS: A versatile robotic tool for bimanual single-access and natural-orifice endoscopic surgery, IEEE International Conference on Robotics and Automation (ICRA), Publisher: IEEE, Pages: 2436-2442

This paper introduces the CYCLOPS, a novel robotic tool for single-access and natural-orifice endoscopic surgery. Based on the concept of tendon-driven parallel robots, this highly original design gives the system several unique capabilities, including force exertion of up to 65 N, a large and adjustable workspace, and bimanual instrument triangulation. Due to the simplicity and nature of the design, the system can be adapted to an existing laparoscope or flexible endoscope. This promises a more immediate and accelerated route to clinical translation, not only through its low-cost and adaptable features, but also by directly addressing several major barriers of existing designs.

Conference paper

Zhang L, Lee S-L, Yang G-Z, Mylonas GP et al., 2014, Semi-autonomous navigation for robot assisted tele-echography using generalized shape models and co-registered RGB-D cameras, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2014), Publisher: IEEE, Pages: 3496-3502

This paper proposes a semi-autonomous navigated master-slave system for robot-assisted remote echography for early trauma assessment. Two RGB-D sensors are used to capture real-time 3D information of the scene at the slave side where the patient is located. A 3D statistical shape model is built and used to generate a customized patient model based on the point cloud generated by the RGB-D sensors. The customized patient model can be updated and adaptively fitted to the patient. The model is also used to generate a trajectory to navigate a KUKA robotic arm and safely conduct the ultrasound examination. Extensive validation of the proposed system shows promising results in terms of accuracy and robustness.

Conference paper
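
The patient-specific model in the entry above is obtained by fitting a statistical shape model (SSM) to the point cloud captured by the RGB-D sensors. The Python sketch below shows the core least-squares fit of SSM mode coefficients under the simplifying assumption that registration and point correspondences are already known; the toy model, mode count and coefficients are illustrative and unrelated to the paper's trained model.

```python
# Minimal sketch (toy model, correspondences assumed known): least-squares fit
# of statistical shape model coefficients to an observed point cloud.
import numpy as np

rng = np.random.default_rng(2)
n_points, n_modes = 200, 3

# Toy statistical shape model: a mean shape plus a few modes of variation.
# A real model would be learned (e.g. by PCA) from registered training scans.
mean_shape = rng.normal(size=(n_points, 3))
modes = rng.normal(size=(n_modes, n_points, 3))

# "Observed" patient surface from the RGB-D sensors, synthesised here from
# known coefficients plus noise and assumed already registered.
b_true = np.array([1.5, -0.8, 0.4])
observed = mean_shape + np.tensordot(b_true, modes, axes=1) \
           + 0.01 * rng.normal(size=(n_points, 3))

# Least-squares fit of the mode coefficients to the observed point cloud.
A = modes.reshape(n_modes, -1).T            # (3 * n_points, n_modes)
y = (observed - mean_shape).reshape(-1)     # (3 * n_points,)
b_fit, *_ = np.linalg.lstsq(A, y, rcond=None)
print("true coefficients:  ", b_true)
print("fitted coefficients:", np.round(b_fit, 3))
```

The fitted coefficients instantiate the customized patient model, which can then be used to plan the scanning trajectory for the robotic arm.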

This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.
