Publications
117 results found
Mylonas G, Sun LW, Kwok K-W, et al., 2011, Collaborative Gaze Channelling for Cooperation within a Shared Tele-Surgery Environment, Hamlyn Symposium
Fujii K, Mylonas GP, Yang G-Z, 2011, Stealth Calibration Eye Tracking Algorithm for Minimally Invasive Surgery, Hamlyn Symposium 2011
Noonan DP, Mylonas GP, Shang J, et al., 2010, Gaze contingent control for an articulated mechatronic laparoscope, Pages: 759-764
Paggetti G, Menegaz G, Leff D, et al., 2010, An Assessment of Parietal Function during Depth Perception and Coordination in Surgical Robotics, 16th Annual Meeting of the Organization for Human Brain Mapping (HBM)
James DRC, Orihuela-Espina F, Leff DR, et al., 2010, Cognitive Burden Estimation for Visuomotor Learning with fNIRS, 13th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI), Publisher: SPRINGER-VERLAG BERLIN, Pages: 319-326, ISSN: 0302-9743
- Citations: 17
Noonan D, Elson D, Mylonas G, et al., 2009, Laser Induced Fluorescence and Reflected White Light Imaging for Robot-Assisted MIS, IEEE Transactions on Biomedical Engineering, Vol: 56, Pages: 889-892
Visentini-Scarzanella M, Mylonas GP, Stoyanov D, et al., 2009, i-BRUSH: A Gaze-Contingent Virtual Paintbrush for Dense 3D Reconstruction in Robotic Assisted Surgery, 12th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2009), Publisher: SPRINGER-VERLAG BERLIN, Pages: 353+, ISSN: 0302-9743
- Citations: 16
Kwok KW, Sun LW, Vitiello V, et al., 2009, Perceptually docked control environment for multiple microbots: application to the gastric wall biopsy, IEEE/RSJ International Conference on Intelligent Robots and Systems, Pages: 2783-2788
Kwok K-W, Mylonas GP, Sun LW, et al., 2009, Dynamic Active Constraints for Hyper-Redundant Flexible Robots, 12th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2009), Publisher: SPRINGER-VERLAG BERLIN, Pages: 410+, ISSN: 0302-9743
- Citations: 22
Stoyanov D, Mylonas GP, Lerotic M, et al., 2008, Intra-Operative Visualizations: Perceptual Fidelity and Human Factors, JOURNAL OF DISPLAY TECHNOLOGY, Vol: 4, Pages: 491-501, ISSN: 1551-319X
- Citations: 19
Lo B, Chung AJ, Stoyanov D, et al., 2008, Real-time intra-operative 3D tissue deformation recovery, Pages: 1387-1390
Mylonas GP, Yang G-Z, 2008, Eye Tracking and Depth from Vergence, NEXT GENERATION ARTIFICIAL VISION SYSTEMS: REVERSE ENGINEERING THE HUMAN VISUAL SYSTEM, Editors: Bharath, Petrou, Publisher: ARTECH HOUSE, Pages: 191-215, ISBN: 978-1-59693-224-1
Yang G-Z, Mylonas GP, Kwok K-W, et al., 2008, Perceptual docking for robotic control, 4th International Workshop on Medical Imaging and Augmented Reality, Publisher: SPRINGER-VERLAG BERLIN, Pages: 21-30, ISSN: 0302-9743
- Citations: 17
Mylonas GP, Kwok K-W, Darzi A, et al., 2008, Gaze-Contingent Motor Channelling and Haptic Constraints for Minimally Invasive Robotic Surgery, 11th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI2008), Publisher: SPRINGER-VERLAG BERLIN, Pages: 676-683, ISSN: 0302-9743
- Citations: 17
Mylonas G, Yang GZ, 2008, Eye Tracking and Depth from Vergence, Next Generation Artificial Vision Systems: Reverse Engineering the Human Visual System, Editors: Bharath, Petrou, Publisher: ARTech House, Pages: 187-211
Stoyanov D, Mylonas GP, Yang G-Z, 2008, Gaze-Contingent 3D Control for Focused Energy Ablation in Robotic Assisted Surgery, 11th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI2008), Publisher: SPRINGER-VERLAG BERLIN, Pages: 347-355, ISSN: 0302-9743
- Citations: 6
Leong JJH, Atallah L, Mylonas GP, et al., 2008, Investigation of partial directed coherence for hand-eye coordination in laparoscopic training, 4th International Workshop on Medical Imaging and Augmented Reality, Publisher: SPRINGER-VERLAG BERLIN, Pages: 270+, ISSN: 0302-9743
- Citations: 4
Noonan D, Mylonas G, Darzi A, et al., 2008, Gaze Contingent Articulated Robot Control for Robot Assisted Minimally Invasive Surgery, International Conference on Intelligent Robots and Systems, Publisher: IEEE/RSJ, Pages: 1186-1191
This paper introduces a novel technique for controlling an articulated robotic device through the eyes of the surgeon during minimally invasive surgery. The system consists of a binocular eye-tracking unit and a robotic instrument featuring a long, rigid shaft with an articulated distal tip for minimally invasive interventions. Both have been integrated into a da Vinci surgical robot to provide seamless and non-invasive localization of the surgeon's eye fixations. Using a gaze-contingent framework, the surgeon's 3D fixations are converted into commands that direct the robotic probe to the desired location. Experimental results illustrate the ability of the system to perform real-time gaze-contingent robot control and open up a new avenue for improving current human-robot interfaces.
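The fixation-to-command step described in the abstract can be sketched as a simple proportional servo loop. This is an illustrative assumption, not the paper's implementation: the function name, gain, and deadband values are hypothetical.

```python
import numpy as np

def fixation_to_tip_command(fixation_3d, tip_pos, gain=0.2, deadband_mm=2.0):
    # Drive the articulated tip a fraction of the way toward the surgeon's
    # current 3D fixation each control cycle; ignore small fixational jitter.
    error = np.asarray(fixation_3d, float) - np.asarray(tip_pos, float)
    if np.linalg.norm(error) < deadband_mm:
        return np.zeros(3)
    return gain * error  # proportional step toward the gaze point (mm)

# One control cycle: tip at the origin, fixation 10 mm away along x.
step = fixation_to_tip_command([10.0, 0.0, 0.0], [0.0, 0.0, 0.0])
print(step)  # a 2 mm step toward the fixation point
```

A deadband of this kind is one common way to keep a gaze-driven actuator from chasing micro-saccades; the real system would additionally map the Cartesian step through the instrument's kinematics.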
Mylonas GP, Stoyanov D, Darzi A, et al., 2007, Assessment of perceptual quality for gaze-contingent motion stabilization in robotic assisted minimally invasive surgery, Pages: 660-667, ISSN: 0302-9743
With the increasing sophistication of surgical robots, the use of motion stabilisation for enhancing the performance of micro-surgical tasks is an actively pursued research topic. Mechanical stabilisation devices have certain advantages in terms of both simplicity and consistency; the technique, however, can complicate the existing surgical workflow and interfere with an already crowded MIS operated cavity. With the advent of reliable vision-based, real-time, in situ, in vivo techniques for 3D deformation recovery, current effort is being directed towards optical techniques for achieving adaptive motion stabilisation. The purpose of this paper is to assess the effect of virtual stabilisation on foveal/parafoveal vision during robotic-assisted MIS. Detailed psychovisual experiments were performed. Results show that stabilisation of the whole visual field is not necessary: it is sufficient to perform accurate motion tracking and deformation compensation within a relatively small area directly under foveal vision. The results also confirm that, under the current motion stabilisation regime, deformation of the periphery does not affect visual acuity, and there is no indication that the deformation velocity of the periphery affects foveal sensitivity. These findings are expected to have a direct implication on the future design of visual stabilisation methods for robotic-assisted MIS.
Leong JJ, Nicolaou M, Atallah L, et al., 2007, HMM assessment of quality of movement trajectory in laparoscopic surgery., Medical Image Computing and Computer Aided Intervention - MICCAI 2006, Pages: 335-346
Lerotic M, Chung A, Mylonas G, et al., 2007, pq-space Based Non-Photorealistic Rendering for Augmented Reality, Medical Image Computing and Computer-Assisted Intervention – MICCAI 2007, Publisher: Springer Berlin Heidelberg, Pages: 102-109
Mylonas GP, Darzi A, Yang GZ, 2006, Gaze-contingent control for minimally invasive robotic surgery., Comput Aided Surg, Vol: 11, Pages: 256-266, ISSN: 1092-9088
OBJECTIVE: Recovering tissue depth and deformation during robotically assisted minimally invasive procedures is an important step towards motion compensation, stabilization and co-registration with preoperative data. This work demonstrates that eye gaze derived from binocular eye tracking can be effectively used to recover 3D motion and deformation of the soft tissue. METHODS: A binocular eye-tracking device was integrated into the stereoscopic surgical console. After calibration, the 3D fixation point of the participating subjects could be accurately resolved in real time. A CT-scanned phantom heart model was used to demonstrate the accuracy of gaze-contingent depth extraction and motion stabilization of the soft tissue. The dynamic response of the oculomotor system was assessed with the proposed framework by using autoregressive modeling techniques. In vivo data were also used to perform gaze-contingent decoupling of cardiac and respiratory motion. RESULTS: Depth reconstruction, deformation tracking, and motion stabilization of the soft tissue were possible with binocular eye tracking. The dynamic response of the oculomotor system was able to cope with frequencies likely to occur under most routine minimally invasive surgical operations. CONCLUSION: The proposed framework presents a novel approach towards the tight integration of a human and a surgical robot where interaction in response to sensing is required to be under the control of the operating surgeon.
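The depth-from-vergence principle underlying this framework can be illustrated with a minimal geometric sketch. Symmetric horizontal vergence and a simple two-eye planar model are assumed here; the function name and numbers are hypothetical, not the paper's calibration.

```python
import math

def fixation_depth_mm(ipd_mm, vergence_left_rad, vergence_right_rad):
    # Intersect the two visual axes in the horizontal plane. With eyes at
    # (-ipd/2, 0) and (+ipd/2, 0), each rotated inward from parallel gaze
    # by its vergence angle, the fixation depth z satisfies
    # tan(theta_l) + tan(theta_r) = ipd / z.
    return ipd_mm / (math.tan(vergence_left_rad) + math.tan(vergence_right_rad))

# Symmetric convergence: 60 mm interpupillary distance, each eye rotated
# inward by atan(30/100) -> fixation 100 mm in front of the eyes.
depth = fixation_depth_mm(60.0, math.atan(0.3), math.atan(0.3))
print(depth)  # 100.0
```

The same relation shows why depth resolution degrades with distance: as the vergence angles shrink, a fixed angular tracking error maps to an increasingly large depth error.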
Leong J, Nicolaou M, Atallah L, et al., 2006, HMM Assessment of Quality of Movement Trajectory in Laparoscopic Surgery, Medical Image Computing and Computer Aided Intervention - MICCAI 2006
Leong JJ, Nicolaou M, Atallah L, et al., 2006, HMM assessment of quality of movement trajectory in laparoscopic surgery., Med Image Comput Comput Assist Interv Int Conf Med Image Comput Comput Assist Interv., Vol: 9, Pages: 752-759
Stoyanov D, Mylonas G, Deligianni F, et al., 2005, Soft-tissue motion tracking and structure estimation for robotic assisted MIS procedures, Medical Image Computing and Computer Assisted Intervention (MICCAI05), Pages: 139-146
In robotically assisted laparoscopic surgery, soft-tissue motion tracking and structure recovery are important for intraoperative surgical guidance, motion compensation and delivering active constraints. In this paper, we present a novel method for feature-based motion tracking of deformable soft-tissue surfaces in totally endoscopic coronary artery bypass graft (TECAB) surgery. We combine two feature detectors to recover distinct regions on the epicardial surface for which the sparse 3D surface geometry may be computed using a pre-calibrated stereo laparoscope. The movement of the 3D points is then tracked in the stereo images under stereo-temporal constraints using an iterative registration algorithm. The practical value of the technique is demonstrated on both a deformable phantom model with tomographically derived surface geometry and in vivo robotic-assisted minimally invasive surgery (MIS) image sequences.
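The sparse 3D geometry step in the abstract can be sketched for the simplest case of a rectified pinhole stereo pair, where depth follows directly from horizontal disparity. This is an assumption-laden sketch: the paper's pipeline uses a calibrated stereo laparoscope and iterative registration for tracking, and the intrinsics below are invented.

```python
import numpy as np

def triangulate(pts_left, pts_right, focal_px, baseline_mm):
    # For a rectified stereo pair (pixel coordinates relative to the
    # principal point), depth is z = f * b / d, where d is the horizontal
    # disparity; x and y back-project from the left image.
    pl = np.asarray(pts_left, float)    # (N, 2) left-image features
    pr = np.asarray(pts_right, float)   # (N, 2) matched right-image features
    d = pl[:, 0] - pr[:, 0]             # horizontal disparity (pixels)
    z = focal_px * baseline_mm / d
    x = pl[:, 0] * z / focal_px
    y = pl[:, 1] * z / focal_px
    return np.stack([x, y, z], axis=1)  # (N, 3) points in mm

# One matched feature: 25 px disparity with f = 500 px, b = 5 mm baseline.
pts = triangulate([[50.0, 0.0]], [[25.0, 0.0]], 500.0, 5.0)
print(pts)  # [[ 10.   0. 100.]]
```

Tracking then amounts to re-matching these features frame to frame and re-triangulating, with the stereo-temporal constraints rejecting matches whose implied 3D motion is inconsistent between the two views.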
Mylonas G, Stoyanov D, Deligianni F, et al., 2005, Gaze-contingent soft tissue deformation tracking for minimally invasive robotic surgery, Medical Image Computing and Computer Assisted Intervention (MICCAI05), Pages: 843-850
The introduction of surgical robots in Minimally Invasive Surgery (MIS) has allowed enhanced manual dexterity through the use of microprocessor-controlled mechanical wrists. Although fully autonomous robots are attractive, both ethical and legal barriers can prohibit their practical use in surgery. The purpose of this paper is to demonstrate that it is possible to use real-time binocular eye tracking to empower robots with human vision by using knowledge acquired in situ. By exploiting the close relationship between horizontal disparity and depth perception, which varies with viewing distance, ocular vergence can be used to recover 3D motion and deformation of the soft tissue during MIS procedures. Both phantom and in vivo experiments were carried out to assess the potential frequency limit of the system and its intrinsic depth-recovery accuracy. Potential applications of the technique include motion stabilisation and intra-operative planning in the presence of large tissue deformation.
Mylonas GP, Darzi A, Yang GZ, et al., 2004, Gaze contingent depth recovery and motion stabilisation for minimally invasive robotic surgery, 2nd International Workshop on Medical Imaging and Augmented Reality (MIAR 2004), Beijing, People's Republic of China, Publisher: Springer-Verlag, Berlin, Pages: 311-319
This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.