Browse all of our publications below, including peer-reviewed papers and briefing papers.


  • Conference paper
    Vrielink TJCO, Puyal JG-B, Kogkas A, Darzi A, Mylonas G et al., 2019, Intuitive gaze-control of a robotized flexible endoscope, 25th IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Publisher: IEEE, Pages: 1776-1782, ISSN: 2153-0858

    Flexible endoscopy is a routinely performed procedure that has remained largely unchanged for decades despite its many challenges. This paper introduces a novel, more intuitive and ergonomic platform that can be used with any flexible endoscope, allowing easier navigation and manipulation. A standard endoscope is robotized, and a gaze-control system based on eye tracking is developed and implemented, allowing hands-free manipulation. The system characteristics and step response have been evaluated using visual servoing. Further, the robotized system has been compared with a manually controlled endoscope in a user study. The users (n=11) preferred the gaze-controlled endoscope and reported a lower task load when the task was performed with gaze control. In addition, gaze control was associated with a higher success rate and a shorter task completion time. The results presented validate the system's technical performance and demonstrate the intuitiveness of hands-free gaze control in flexible endoscopy.

  • Conference paper
    Wang M-Y, Kogkas AA, Darzi A, Mylonas GP et al., 2019, Free-view, 3D gaze-guided, assistive robotic system for activities of daily living, 25th IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Publisher: IEEE, Pages: 2355-2361, ISSN: 2153-0858

    Patients suffering from quadriplegia have limited body motion, which prevents them from performing daily activities. We have developed an assistive robotic system with an intuitive free-view gaze interface. The user's point of regard is estimated in 3D space while allowing free head movement, and is combined with object recognition and trajectory planning. This framework allows the user to interact with objects using fixations. Two operational modes have been implemented to cater for different eventualities. The automatic mode performs a pre-defined task associated with a gaze-selected object, while the manual mode allows gaze control of the robot's end-effector position in the user's frame of reference. User studies reported effortless operation in automatic mode, and a manual pick-and-place task achieved a success rate of 100% on the users' first attempt.

  • Journal article
    Joshi M, Ashrafian H, Aufegger L, Khan S, Arora S, Cooke G, Darzi A et al., 2019, Wearable sensors to improve detection of patient deterioration, Expert Review of Medical Devices, Vol: 16, Pages: 145-154, ISSN: 1743-4440

    Introduction: Monitoring a patient's vital signs forms a basic component of care, enabling the identification of deteriorating patients and increasing the likelihood of improving patient outcomes. Several paper-based track-and-trigger warning scores have been developed to allow clinical evaluation of a patient and to guide escalation protocols and monitoring frequency. However, evidence suggests that patient deterioration on hospital wards is still missed, and that patients are still falling through the safety net. Wearable sensor technology is currently undergoing huge growth, and the development of new lightweight wireless wearable sensors has enabled continuous, real-time monitoring of multiple vital signs in ward patients. Areas covered: In this paper, we aim to closely examine the benefits of wearable monitoring applications that measure multiple vital signs, in the context of improving healthcare delivery. A review of the literature was performed. Expert commentary: Findings suggest that several sensor designs are available with the potential to improve patient safety for both hospital patients and those at home. Larger clinical trials are required to ensure both diagnostic accuracy and usability.

  • Book chapter
    Oude Vrielink TJC, Vitiello V, Mylonas GP, 2019, Robotic surgery in cancer, Bioengineering Innovative Solutions for Cancer, Pages: 245-269, ISBN: 9780128138878

    Cancer is responsible for the deaths of thousands of people around the world. When cancer is diagnosed and treated at an early stage, long-term survival rates are very high, and today advanced screening has considerably improved early cancer detection and diagnosis. The contemporary view is that surgical cancer removal should also aim at minimizing the external and internal trauma to the patient, for improved postoperative healing and cosmesis. Toward these two goals, minimally invasive surgery (MIS) approaches strive for organ-sparing, as opposed to radical, surgery, using miniaturized surgical instruments introduced through small incisions in the patient's skin, or through natural orifices. The operational challenges of such approaches are obvious. Robotic surgery, still in its infancy, is known to improve surgical practice. However, certain operational challenges remain, and clear long-term evidence for the superior curative outcomes of robotic surgery over conventional approaches remains a subject of debate. This chapter introduces MIS, discusses its limitations for both conventional and robotic approaches, and highlights future opportunities for improving surgical outcomes. Gastrointestinal cancer surgery is used as a representative case study.

  • Conference paper
    Leiloglou M, Chalau V, Kedrzycki M, Avila-Rencoret F, Li Q, Lin J, Hanna G, Darzi A, Leff D, Elson D et al., 2019, Snapshot Fluorescence Hyperspectral System for Breast Cancer Surgery Guidance, Hamlyn Symposium Advanced Biophotonics Workshop
  • Conference paper
    Leiloglou M, Chalau V, Kedrzycki M, Qi J, Martin-Gonzalez P, Hanna G, Darzi A, Leff D, Elson D et al., 2019, Fluorescence Intensity Image Guided Breast Conserving Surgery (BCS), European Molecular Imaging Meeting
  • Conference paper
    Zhao M, Oude Vrielink J, Kogkas A, Mylonas G, Elson D et al., 2019, Prototype Designs of a Cable-driven Parallel Robot for Transoral Laser Surgery, Hamlyn Symposium on Medical Robotics
  • Conference paper
    Achanccaray D, Mylonas G, Andreu-Perez J, 2019, An Implicit Brain Computer Interface Supported by Gaze Monitoring for Virtual Therapy, IEEE International Conference on Systems, Man and Cybernetics (SMC), Publisher: IEEE, Pages: 2829-2832, ISSN: 1062-922X
  • Book chapter
    Kogkas A, Ezzat A, Thakkar R, Darzi A, Mylonas G et al., 2019, Free-View, 3D Gaze-Guided Robotic Scrub Nurse, Editors: Shen, Liu, Peters, Staib, Essert, Zhou, Yap, Khan, Publisher: Springer International Publishing AG, Pages: 164-172, ISBN: 978-3-030-32253-3
  • Conference paper
    Kassanos P, Seichepine F, Wales D, Yang G-Z et al., 2019, Towards a Flexible/Stretchable Multiparametric Sensing Device for Surgical and Wearable Applications, IEEE Biomedical Circuits and Systems Conference (BioCAS), Publisher: IEEE, ISSN: 2163-4025

This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.
