Research in surgical robotics has an established track record at Imperial College, and a number of research and commercial surgical robot platforms have been developed over the years. The Hamlyn Centre is a champion for technological innovation and clinical adoption of robotic, minimally invasive surgery. We work in partnership with major industrial leaders in medical devices and surgical robots, while also developing our own platforms such as the i-Snake® and Micro-IGES. The da Vinci surgical robot is used extensively for endoscopic radical prostatectomy, hiatal hernia surgery, and low pelvic and rectal surgery, and in 2003 St Mary's Hospital carried out its first Totally Endoscopic Robotic Coronary Artery Bypass (TECAB).

The major focus of the Hamlyn Centre is to develop robotic technologies that will transform conventional minimally invasive surgery, explore new ways of empowering robots with human intelligence, and develop miniature 'microbots' with integrated sensing and imaging for targeted therapy and treatment. We work closely with both industrial and academic partners on open platforms such as the DVRK, RAVEN and KUKA. The Centre also has the important mission of driving down the costs associated with robotic surgery in order to make the technology more accessible, portable, and affordable. This will allow it to be fully integrated with normal surgical workflows so as to benefit a much wider patient population.

The Hamlyn Centre currently chairs the UK Robotics and Autonomous Systems (UK-RAS) Network. The mission of the Network is to provide academic leadership in Robotics and Autonomous Systems (RAS), expand collaboration with industry, and integrate and coordinate activities across the UK Engineering and Physical Sciences Research Council (EPSRC) funded RAS capital facilities and Centres for Doctoral Training (CDTs).


Citation

BibTeX format

@article{Guo:2019:10.1109/LRA.2019.2928775,
author = {Guo, Y and Deligianni, F and Gu, X and Yang, G-Z},
doi = {10.1109/LRA.2019.2928775},
journal = {IEEE Robotics and Automation Letters},
pages = {3617--3624},
title = {3-D Canonical pose estimation and abnormal gait recognition with a single RGB-D camera},
url = {http://dx.doi.org/10.1109/LRA.2019.2928775},
volume = {4},
year = {2019}
}

RIS format (EndNote, RefMan)

TY  - JOUR
AB  - Assistive robots play an important role in improving the quality of life of patients at home. Among all the monitoring tasks, gait disorders are prevalent in the elderly and in people with neurological conditions, increasing the risk of falls. Therefore, the development of mobile systems for gait monitoring at home in normal living conditions is important. Here, we present a mobile system that is able to track humans and analyze their gait in canonical coordinates based on a single RGB-D camera. First, view-invariant three-dimensional (3-D) lower limb pose estimation is achieved by fusing information from depth images along with 2-D joints derived in RGB images. Next, both the 6-D camera pose and the 3-D lower limb skeleton are tracked in real time in a canonical coordinate system based on simultaneous localization and mapping (SLAM). A mask-based strategy is exploited to improve the re-localization of the SLAM in dynamic environments. Abnormal gait is detected by using the support vector machine and the bidirectional long short-term memory network with respect to a set of extracted gait features. To evaluate the robustness of the system, we collected multi-camera, ground-truth data from 16 healthy volunteers performing 6 gait patterns that mimic common gait abnormalities. The experimental results demonstrate that our proposed system can achieve good lower limb pose estimation and superior recognition accuracy compared to previous abnormal gait detection methods.
AU - Guo,Y
AU - Deligianni,F
AU - Gu,X
AU - Yang,G-Z
DO - 10.1109/LRA.2019.2928775
EP - 3624
PY - 2019///
SN - 2377-3766
SP - 3617
TI - 3-D Canonical pose estimation and abnormal gait recognition with a single RGB-D camera
T2 - IEEE Robotics and Automation Letters
UR - http://dx.doi.org/10.1109/LRA.2019.2928775
UR - http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000477983400027&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=1ba7043ffcc86c417c072aa74d649202
UR - http://hdl.handle.net/10044/1/73046
VL - 4
ER -
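
For illustration only: the abstract above describes classifying abnormal gait with a support vector machine and a bidirectional long short-term memory network over a set of extracted gait features. The minimal PyTorch sketch below shows what a bidirectional LSTM classifier over per-frame gait feature vectors might look like. It is not the authors' implementation; the feature dimension, hidden size, sequence length, and class count are placeholder assumptions.

# Minimal sketch (not the paper's code): normal vs. abnormal gait
# classification with a bidirectional LSTM over gait-feature sequences.
import torch
import torch.nn as nn

class BiLSTMGaitClassifier(nn.Module):
    """Bidirectional LSTM over per-frame gait feature vectors."""
    def __init__(self, feat_dim=12, hidden=64, num_classes=2):
        # feat_dim, hidden, and num_classes are illustrative values.
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True,
                            bidirectional=True)
        self.head = nn.Linear(2 * hidden, num_classes)

    def forward(self, x):              # x: (batch, frames, feat_dim)
        out, _ = self.lstm(x)          # out: (batch, frames, 2*hidden)
        return self.head(out[:, -1])   # classify from the final time step

# Example: a batch of 8 gait sequences, 100 frames, 12 features each.
model = BiLSTMGaitClassifier()
logits = model(torch.randn(8, 100, 12))
print(logits.shape)  # torch.Size([8, 2])

In practice, the per-frame features would be derived from the tracked 3-D lower limb skeleton (for example, joint angles and step parameters), and an SVM over aggregated features could serve as a simpler baseline alongside the recurrent model.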