Research in surgical robotics has an established track record at Imperial College, and a number of research and commercial surgical robot platforms have been developed over the years. The Hamlyn Centre is a champion for technological innovation and clinical adoption of robotic, minimally invasive surgery. We work in partnership with major industrial leaders in medical devices and surgical robotics, as well as developing our own platforms such as the i-Snake® and Micro-IGES. The da Vinci surgical robot is used extensively for endoscopic radical prostatectomy, hiatal hernia surgery, and low pelvic and rectal surgery, and in 2003, St Mary's Hospital carried out its first Totally Endoscopic Robotic Coronary Artery Bypass (TECAB).

The major focus of the Hamlyn Centre is to develop robotic technologies that will transform conventional minimally invasive surgery, explore new ways of empowering robots with human intelligence, and develop miniature 'microbots' with integrated sensing and imaging for targeted therapy and treatment. We work closely with both industrial and academic partners on open platforms such as the DVRK, RAVEN and KUKA. The Centre also has the important mission of driving down costs associated with robotic surgery in order to make the technology more accessible, portable, and affordable. This will allow it to be fully integrated with normal surgical workflows so as to benefit a much wider patient population.

The Hamlyn Centre currently chairs the UK Robotics and Autonomous Systems (UK-RAS) Network. The mission of the Network is to provide academic leadership in Robotics and Autonomous Systems (RAS), expand collaboration with industry, and integrate and coordinate activities across the UK Engineering and Physical Sciences Research Council (EPSRC) funded RAS capital facilities and Centres for Doctoral Training (CDTs).

Publications

  • Conference paper
    Grammatikopoulou M, Zhang L, Yang G-Z, 2019, Depth Estimation of Optically Transparent Microrobots Using Convolutional and Recurrent Neural Networks, 25th IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Publisher: IEEE, Pages: 4895-4900, ISSN: 2153-0858
  • Conference paper
    Zhou X-Y, Riga C, Lee S-L, Yang G-Z et al., 2019, Towards Automatic 3D Shape Instantiation for Deployed Stent Grafts: 2D Multiple-class and Class-imbalance Marker Segmentation with Equally-weighted Focal U-Net, 25th IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Publisher: IEEE, Pages: 1261-1267, ISSN: 2153-0858
  • Book chapter
    Bernstein A, Varghese RJ, Liu J, Zhang Z, Lo B et al., 2019, An Assistive Ankle Joint Exoskeleton for Gait Impairment, Biosystems and Biorobotics, Pages: 658-662

    © 2019, Springer Nature Switzerland AG. Motor rehabilitation and assistance post-stroke are becoming a major concern for healthcare services with an increasingly aging population. Wearable robots can be a technological solution to support gait rehabilitation and to provide assistance to enable users to carry out activities of daily living independently. To address the need for long-term assistance for stroke survivors suffering from drop foot, this paper proposes a low-cost, assistive ankle joint exoskeleton for gait assistance. The proposed exoskeleton is designed to provide ankle foot support thus enabling normal walking gait. Baseline gait reading was recorded from two force sensors attached to a custom-built shoe insole of the exoskeleton. From our experiments, the average maximum force during heel-strike (63.95 N) and toe-off (54.84 N) were found, in addition to the average period of a gait cycle (1.45 s). The timing and force data were used to control the actuation of tendons of the exoskeleton to prevent the foot from preemptively hitting the ground during swing phase.
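    The control scheme described above can be sketched in a few lines. This is a minimal illustration (not the authors' code): it classifies gait phase from the two insole force readings and actuates the dorsiflexion tendon only during swing. The threshold values and function names are hypothetical; only the reported peak forces (63.95 N heel-strike, 54.84 N toe-off) come from the abstract.

    ```python
    # Hypothetical contact thresholds for the two insole force sensors (in newtons);
    # the abstract reports average peaks of 63.95 N (heel-strike) and 54.84 N (toe-off).
    HEEL_THRESHOLD_N = 20.0
    TOE_THRESHOLD_N = 20.0

    def gait_phase(heel_force_n: float, toe_force_n: float) -> str:
        """Classify the current gait phase from heel and toe force readings."""
        heel_contact = heel_force_n > HEEL_THRESHOLD_N
        toe_contact = toe_force_n > TOE_THRESHOLD_N
        if heel_contact and toe_contact:
            return "mid-stance"
        if heel_contact:
            return "heel-strike"
        if toe_contact:
            return "toe-off"
        return "swing"  # neither sensor loaded: the foot is off the ground

    def tendon_command(phase: str) -> bool:
        """Tension the tendon only during swing, keeping the foot dorsiflexed
        so it does not preemptively hit the ground (drop foot)."""
        return phase == "swing"
    ```

    In a real controller the reported 1.45 s average gait cycle would additionally gate the actuation timing, rather than relying on thresholds alone.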

  • Book chapter
    Singh RK, Varghese RJ, Liu J, Zhang Z, Lo B et al., 2019, A multi-sensor fusion approach for intention detection, Biosystems and Biorobotics, Pages: 454-458

    © Springer Nature Switzerland AG 2019. For assistive devices to seamlessly and promptly assist users with activities of daily living (ADL), it is important to understand the user’s intention. Current assistive systems are mostly driven by unimodal sensory input which hinders their accuracy and responses. In this paper, we propose a context-aware sensor fusion framework to detect intention for assistive robotic devices which fuses information from a wearable video camera and wearable inertial measurement unit (IMU) sensors. A Naive Bayes classifier is used to predict the intent to move from IMU data and the object classification results from the video data. The proposed approach can achieve an accuracy of 85.2% in detecting movement intention.
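    The fusion step described above can be illustrated with a small Naive Bayes sketch (not the authors' implementation): an IMU-derived motion state and an object class from the wearable camera are treated as conditionally independent evidence for the user's intent. All priors, likelihoods, and category names below are invented for illustration; only the two-modality Naive Bayes structure comes from the abstract.

    ```python
    # Illustrative prior over the user's intent (values are hypothetical).
    PRIOR = {"move": 0.5, "stay": 0.5}

    # P(imu_state | intent) -- hypothetical likelihoods for the IMU modality.
    P_IMU = {
        "move": {"accelerating": 0.8, "still": 0.2},
        "stay": {"accelerating": 0.3, "still": 0.7},
    }

    # P(object | intent) -- hypothetical likelihoods for the camera modality.
    P_OBJ = {
        "move": {"cup": 0.7, "none": 0.3},
        "stay": {"cup": 0.4, "none": 0.6},
    }

    def fuse_intent(imu_state: str, obj: str) -> str:
        """Return the MAP intent, multiplying the per-modality likelihoods
        under the naive (conditional independence) assumption."""
        posterior = {
            intent: PRIOR[intent] * P_IMU[intent][imu_state] * P_OBJ[intent][obj]
            for intent in PRIOR
        }
        return max(posterior, key=posterior.get)

    # e.g. fuse_intent("accelerating", "cup") favours "move", since both
    # modalities assign that intent the higher likelihood.
    ```

    The appeal of this structure is that each modality contributes a simple per-class likelihood, so adding a sensor means adding one more factor to the product rather than retraining a joint model.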

  • Conference paper
    Kassanos P, Seichepine F, Wales D, Yang G-Z et al., 2019, Towards a Flexible/Stretchable Multiparametric Sensing Device for Surgical and Wearable Applications, IEEE Biomedical Circuits and Systems Conference (BioCAS), Publisher: IEEE, ISSN: 2163-4025

This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.
