A primary motivation of our research is the monitoring of physical, physiological, and biochemical parameters - in any environment and without activity restriction or behaviour modification - using miniaturised, wireless Body Sensor Networks (BSNs). Key research issues currently being addressed include novel sensor designs, ultra-low-power microprocessor and wireless platforms, energy scavenging, biocompatibility, system integration and miniaturisation, processing-on-node technologies combined with novel ASIC design, autonomic sensor networks and lightweight communication protocols. Our research is aimed at addressing the future needs of life-long health, wellbeing and healthcare, particularly those related to demographic changes associated with an ageing population and patients with chronic illnesses. This research theme is therefore closely aligned with the IGHI’s vision of providing safe, effective and accessible technologies for both developed and developing countries.

Some of our latest work was exhibited at the 2015 Royal Society Summer Science Exhibition.


Citation

BibTeX format

@inproceedings{James:2007:10.1007/978-3-540-75759-7_14,
author = {James, A and Vieira, D and Lo, B and Darzi, A and Yang, GZ},
doi = {10.1007/978-3-540-75759-7_14},
pages = {110--117},
title = {Eye-gaze driven surgical workflow segmentation},
url = {http://dx.doi.org/10.1007/978-3-540-75759-7_14},
year = {2007}
}

RIS format (EndNote, RefMan)

TY  - CPAPER
AB - In today's climate of clinical governance there is growing pressure on surgeons to demonstrate their competence, improve standards and reduce surgical errors. This paper presents a study on developing a novel eye-gaze driven technique for surgical assessment and workflow recovery. The proposed technique investigates the use of a Parallel Layer Perceptor (PLP) to automate the recognition of a key surgical step in a porcine laparoscopic cholecystectomy model. The classifier is eye-gaze contingent but combined with image based visual feature detection for improved system performance. Experimental results show that by fusing image instrument likelihood measures, an overall classification accuracy of 75% is achieved. © Springer-Verlag Berlin Heidelberg 2007.
AU - James,A
AU - Vieira,D
AU - Lo,B
AU - Darzi,A
AU - Yang,GZ
DO - 10.1007/978-3-540-75759-7_14
EP - 117
PY - 2007///
SN - 0302-9743
SP - 110
TI - Eye-gaze driven surgical workflow segmentation
UR - http://dx.doi.org/10.1007/978-3-540-75759-7_14
ER -
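
The abstract above outlines an eye-gaze contingent classifier that fuses gaze information with image-based instrument likelihood measures to recognise a key surgical workflow step. The Python sketch below illustrates that feature-fusion idea only; the per-frame gaze features, instrument likelihood score, synthetic data and the scikit-learn MLP classifier are illustrative assumptions, not the paper's Parallel Layer Perceptor or its experimental setup.

# Illustrative sketch: fusing per-frame eye-gaze features with an image-based
# instrument likelihood to classify whether a frame belongs to a surgical step.
# All names, shapes and data are hypothetical placeholders for real tracker and
# detector output; the original work uses a Parallel Layer Perceptor instead.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_frames = 1000

# Gaze-derived features per video frame (e.g. fixation x/y, fixation duration,
# saccade rate) -- placeholders for real eye-tracker measurements.
gaze_features = rng.random((n_frames, 4))

# Image-based instrument likelihood per frame (e.g. from a visual detector), in [0, 1].
instrument_likelihood = rng.random((n_frames, 1))

# Simple feature-level fusion: concatenate the two modalities.
X = np.hstack([gaze_features, instrument_likelihood])

# Synthetic binary labels: 1 if the frame belongs to the surgical step of interest.
y = (0.6 * gaze_features[:, 2] + 0.4 * instrument_likelihood[:, 0]
     + 0.1 * rng.standard_normal(n_frames)) > 0.5

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# A small multilayer perceptron stands in for the eye-gaze contingent classifier.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")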