Imperial College London

Dr Philip Pratt

Faculty of Medicine, Department of Surgery & Cancer

 
 
 

Contact

 

+44 (0)20 3312 5525
p.pratt
Website

 
 

Location

 

005 Paterson Wing, St Mary's Campus



 

Publications


76 results found

Pratt P, Mayer E, Vale J, Cohen D, Edwards E, Darzi A, Yang G-Z et al., 2011, Image-guided robotic partial nephrectomy: benefits and challenges, Hamlyn Symposium on Medical Robotics, Pages: 85-86

Conference paper

Sridhar AN, Cohen DC, Chen D, Pratt P, Khoubehi B, Vale JA, Yang GZ, Darzi AW, Mayer EK, Edwards E et al., 2011, Development of an augmented reality system for robotic prostatectomy: Towards reducing the learning curve, European Urology Supplements, Vol: 10, Pages: 570-570, ISSN: 1569-9056

The advent and widespread acceptance of robotic-assisted laparoscopic prostatectomy (RALP) has necessitated the development of new skill sets for the trainee surgeon and those converting from open or conventional laparoscopic prostatectomy. The loss of haptic feedback during RALP can increase the challenges during training, with the surgeon having to rely solely on visual cues from the operative field. Augmented reality (AR) is a tool that can increase the amount of visual information available to the surgeon in real time, guiding surgery with the potential to reduce the learning curve. A recently conducted study at our institute identified certain steps of RALP that could benefit from AR guidance. This work presents the implementation of an image guidance system that was developed to assist the surgeon in these key stages of surgery.

Journal article

Stoyanov D, Scarzanella MV, Pratt P, Yang GZ et al., 2010, Real-time stereo reconstruction in robotically assisted minimally invasive surgery, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Vol: 6361 LNCS, Pages: 275-282, ISSN: 0302-9743

The recovery of 3D tissue structure and morphology during robotic assisted surgery is an important step towards accurate deployment of surgical guidance and control techniques in minimally invasive therapies. In this article, we present a novel stereo reconstruction algorithm that propagates disparity information around a set of candidate feature matches. This has the advantage of avoiding problems with specular highlights, occlusions from instruments and view dependent illumination bias. Furthermore, the algorithm can be used with any feature matching strategy allowing the propagation of depth in very disparate views. Validation is provided for a phantom model with known geometry and this data is available online in order to establish a structured validation scheme in the field. The practical value of the proposed method is further demonstrated by reconstructions on various in vivo images of robotic assisted procedures, which are also available to the community. © 2010 Springer-Verlag.

Journal article
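The disparity-propagation idea described in the abstract above can be illustrated with a minimal sketch: starting from sparse seed feature matches, disparity spreads to neighbouring pixels in best-first order whenever a local matching cost stays low, so reliable matches anchor the reconstruction and ambiguous regions are never forced. All function names, the window-based cost, and the thresholds below are illustrative assumptions, not the published algorithm.

```python
import heapq
import numpy as np

def match_cost(left, right, r, c, d, win=2):
    """Mean absolute intensity difference over a small window,
    comparing left pixel (r, c) against right pixel (r, c - d)."""
    h, w = left.shape
    r0, r1 = max(r - win, 0), min(r + win + 1, h)
    c0, c1 = max(c - win, 0), min(c + win + 1, w)
    if c0 - d < 0 or c1 - d > w:
        return np.inf  # window falls outside the right image
    return float(np.mean(np.abs(left[r0:r1, c0:c1] - right[r0:r1, c0 - d:c1 - d])))

def propagate_disparity(left, right, seeds, max_step=1, tol=0.3):
    """Best-first propagation of disparity from sparse seed matches.

    `left`/`right` are rectified greyscale images (2D float arrays);
    `seeds` is a list of (row, col, disparity) triples assumed reliable.
    Returns a dense disparity map with NaN where nothing propagated.
    """
    h, w = left.shape
    disp = np.full((h, w), np.nan)
    heap = []
    for r, c, d in seeds:
        heapq.heappush(heap, (match_cost(left, right, r, c, d), r, c, d))
    while heap:
        cost, r, c, d = heapq.heappop(heap)
        if not np.isnan(disp[r, c]):
            continue  # already assigned via a cheaper path
        disp[r, c] = d
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w and np.isnan(disp[rr, cc]):
                # try the parent disparity and small perturbations of it
                for nd in range(d - max_step, d + max_step + 1):
                    nc = match_cost(left, right, rr, cc, nd)
                    if nc < tol:
                        heapq.heappush(heap, (nc, rr, cc, nd))
    return disp
```

Because growth is ordered by matching cost, specular highlights and instrument occlusions (where every candidate cost is high) are simply left unassigned rather than filled with bad matches.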

Pratt P, Stoyanov D, Visentini-Scarzanella M, Yang GZ et al., 2010, Dynamic guidance for robotic surgery using image-constrained biomechanical models, Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention, Vol: 13, Pages: 77-85

The use of physically-based models combined with image constraints for intraoperative guidance is important for surgical procedures that involve large-scale tissue deformation. A biomechanical model of tissue deformation is described in which surface positional constraints and internally generated forces are derived from endoscopic images and preoperative 4D CT data, respectively. Considering cardiac motion, a novel technique is presented which minimises the average registration error over one or more complete cycles. Features tracked in the stereo video stream provide surface constraints, and an inverse finite element simulation is presented which allows internal forces to be recovered from known preoperative displacements. The accuracy of surface texture, segmented mesh and volumetrically rendered overlays is evaluated with detailed phantom experiments. Results indicate that by combining preoperative and intraoperative images in this manner, accurate intraoperative tissue deformation modelling can be achieved.

Journal article
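The cycle-averaged registration idea in the abstract above — minimising the average error over one or more complete cardiac cycles rather than at a single instant — can be reduced to a toy phase-alignment search between a preoperative motion model and intraoperatively tracked surface points. The exhaustive search and all names below are illustrative assumptions, not the paper's method.

```python
import numpy as np

def best_phase_offset(pre_cycle, intra_cycle):
    """Find the cyclic phase offset aligning a preoperative motion model
    with intraoperatively tracked surface points.

    Both inputs are arrays of shape (T, N, 3): T phases of one cardiac
    cycle, N corresponding 3D points per phase.  The score for each
    candidate offset is the mean point-to-point distance averaged over
    the *whole* cycle, so no single instant dominates the registration.
    """
    T = pre_cycle.shape[0]
    errors = []
    for shift in range(T):
        shifted = np.roll(pre_cycle, shift, axis=0)  # cyclic phase shift
        err = np.mean(np.linalg.norm(shifted - intra_cycle, axis=2))
        errors.append(err)
    best = int(np.argmin(errors))
    return best, errors[best]
```

Averaging over the full cycle is what makes the alignment robust: a shift that happens to match well at end-diastole but poorly elsewhere scores worse than one that tracks the whole motion.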

Lee S-L, Huntbatch A, Pratt P, Lerotic M, Yang G-Z et al., 2010, In vivo and in situ image guidance and modelling in robotic assisted surgery, Proceedings of the Institution of Mechanical Engineers Part C: Journal of Mechanical Engineering Science, Vol: 224, Pages: 1421-1434, ISSN: 0954-4062

Journal article


Dowsey A, Pratt P, Visentini-Scarzanella M, Totz J, Lerotic M, Stoyanov D, Yang G-Z et al., 2009, A real-time simulation, guidance and visualisation platform for intra-operative minimally invasive robotic surgery, 1st NVIDIA GPU Technology Conference

Conference paper

Pratt P, Bello F, Edwards E, Rueckert D, Westwood JD, Haluck RS, Hoffman HM, Mogel GT, Phillips R, Robb RA, Vosburgh KG et al., 2008, Interactive finite element simulation of the beating heart for image-guided robotic cardiac surgery, 16th Conference on Medicine Meets Virtual Reality, Publisher: IOS Press, Pages: 378-383, ISSN: 0926-9630

Conference paper

Pratt P, 2008, Image guidance and surgery simulation using inverse nonlinear finite element methods, 4th International Symposium on Biomedical Simulation, Publisher: Springer-Verlag Berlin, Pages: 185-190, ISSN: 0302-9743

Conference paper

Pratt P, 2008, Inverse nonlinear finite element methods for surgery simulation and image guidance, Computational Biomechanics for Medicine (III) Workshop, Pages: 48-55

Conference paper

Hu M, Penney GP, Rueckert D, Edwards PJ, Figl M, Pratt P, Hawkes DJ et al., 2008, A novel algorithm for heart motion analysis based on geometric constraints, Publisher: Springer, Pages: 720-728

Conference paper

Pratt P, Bello F, Edwards E, Rueckert D et al., 2007, Finite element simulation of the beating heart for image-guided robotic cardiac surgery, Computational Biomechanics for Medicine (II) Workshop, Pages: 74-83

Conference paper

Crowhurst J, Plaat F, 1999, A reply, Anaesthesia, Vol: 54, Pages: 1117-1118, ISSN: 0003-2409

Journal article

Pratt P, 1994, Evolving neural networks to control unstable dynamical systems, Third Annual Conference on Evolutionary Programming, Publisher: World Scientific, Pages: 191-204

Conference paper

Martin G, Koizia L, Kooner A, Cafferkey J, Ross C, Purkayastha S, Sivananthan A, Tanna A, Pratt P, Kinross J et al., Use of the HoloLens2 Mixed Reality Headset for Protecting Health Care Workers During the COVID-19 Pandemic: Prospective, Observational Evaluation (Preprint)

BACKGROUND: The coronavirus disease (COVID-19) pandemic has led to rapid acceleration in the deployment of new digital technologies to improve both accessibility to and quality of care, and to protect staff. Mixed-reality (MR) technology is the latest iteration of telemedicine innovation; it is a logical next step in the move toward the provision of digitally supported clinical care and medical education. This technology has the potential to revolutionize care both during and after the COVID-19 pandemic. OBJECTIVE: This pilot project sought to deploy the HoloLens2 MR device to support the delivery of remote care in COVID-19 hospital environments. METHODS: A prospective, observational, nested cohort evaluation of the HoloLens2 was undertaken across three distinct clinical clusters in a teaching hospital in the United Kingdom. Data pertaining to staff exposure to high-risk COVID-19 environments and personal protective equipment (PPE) use by clinical staff (N=28) were collected, and assessments of acceptability and feasibility were conducted. RESULTS: The deployment of the HoloLens2 led to a 51.5% reduction in time exposed to harm for staff looking after COVID-19 patients (3.32 vs 1.63 hours/day/staff member; P=.002), and an 83.1% reduction in the amount of PPE used (178 vs 30 items/round/day; P=.02). This represents 222.98 hours of reduced staff e

Journal article

This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.
