Sensing
A primary motivation of our research is the monitoring of physical, physiological and biochemical parameters - in any environment and without restricting activity or modifying behaviour - using miniaturised, wireless Body Sensor Networks (BSNs). Key research issues currently being addressed include novel sensor designs, ultra-low-power microprocessor and wireless platforms, energy scavenging, biocompatibility, system integration and miniaturisation, processing-on-node technologies combined with novel ASIC design, autonomic sensor networks and lightweight communication protocols. Our research is aimed at the future needs of lifelong health, wellbeing and healthcare, particularly those arising from the demographic changes of an ageing population and from patients with chronic illnesses. This research theme is therefore closely aligned with the IGHI's vision of providing safe, effective and accessible technologies for both developed and developing countries.
Some of our latest work was exhibited at the 2015 Royal Society Summer Science Exhibition.
Search results
- Conference paper: Kassanos P, Seichepine F, Wales D, et al., 2019, Towards a Flexible/Stretchable Multiparametric Sensing Device for Surgical and Wearable Applications, IEEE Biomedical Circuits and Systems Conference (BioCAS), Publisher: IEEE, ISSN: 2163-4025
- Conference paper: Rosa BG, Anastasova-Ivanova S, Yang GZ, 2019, A Low-powered and Wearable Device for Monitoring Sleep through Electrical, Chemical and Motion signals recorded over the head, IEEE Biomedical Circuits and Systems Conference (BioCAS), Publisher: IEEE, ISSN: 2163-4025 (Citations: 1)
- Conference paper: Singh RK, Varghese RJ, Liu J, et al., 2019, A multi-sensor fusion approach for intention detection, 4th International Conference on NeuroRehabilitation (ICNR), Publisher: Springer International Publishing AG, Pages: 454-458, ISSN: 2195-3562. Abstract: For assistive devices to seamlessly and promptly assist users with activities of daily living (ADL), it is important to understand the user's intention. Current assistive systems are mostly driven by unimodal sensory input, which limits their accuracy and responsiveness. In this paper, we propose a context-aware sensor fusion framework for intention detection in assistive robotic devices, which fuses information from a wearable video camera and wearable inertial measurement unit (IMU) sensors. A Naive Bayes classifier predicts the intent to move from the IMU data and the object classification results from the video data, achieving an accuracy of 85.2% in detecting movement intention. (An illustrative sketch of this kind of fusion is given below the results list.)
- Conference paper: Elson D, 2019, Optical theranostics: image-guided cancer thermal therapy using light (invited), Computer Assisted Radiology and Surgery (Europe's Got Talent)
- Conference paper: Adshead J, Oldfield F, Hadaschik B, et al., 2019, Usability and technical feasibility evaluation of a tethered laparoscopic gamma probe for radioguided surgery in prostate cancer: a pelvic phantom and porcine model study (19-1271), Annual Meeting of the American Urological Association Education and Research Inc.
This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.
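The Singh et al. entry above describes intention detection by fusing a wearable camera's object classifications with wearable IMU signals through a Naive Bayes classifier. The paper's code is not reproduced on this page, so the sketch below is only a minimal illustration of that kind of fusion: the single IMU energy feature, the three-object label set, the synthetic data and all parameters are assumptions made for demonstration, not the authors' implementation.

```python
# Hypothetical sketch: Naive Bayes fusion of an IMU motion feature and a
# video-derived object class to predict movement intention (move / rest).
# All data, feature choices and priors here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# --- Synthetic training data (stand-in for real IMU + camera streams) ---
# imu_energy: one continuous feature, e.g. short-window acceleration energy.
# obj_class:  index of the object seen by the wearable camera
#             (0 = none, 1 = cup, 2 = door handle) -- hypothetical label set.
n = 400
intent = rng.integers(0, 2, n)                      # 0 = rest, 1 = intends to move
imu_energy = np.where(intent == 1,
                      rng.normal(2.0, 0.6, n),      # higher energy before movement
                      rng.normal(0.5, 0.4, n))
obj_probs = np.where(intent[:, None] == 1, [0.2, 0.4, 0.4], [0.8, 0.1, 0.1])
obj_class = np.array([rng.choice(3, p=p) for p in obj_probs])

# --- Per-class parameters: Gaussian for the IMU feature, categorical for the object ---
classes = np.array([0, 1])
priors = np.array([(intent == c).mean() for c in classes])
mu = np.array([imu_energy[intent == c].mean() for c in classes])
sigma = np.array([imu_energy[intent == c].std() + 1e-6 for c in classes])
obj_lik = np.array([[((obj_class[intent == c] == k).sum() + 1) /      # Laplace smoothing
                     ((intent == c).sum() + 3) for k in range(3)]
                    for c in classes])

def predict_intent(energy, obj):
    """Combine both modalities under the naive (conditional independence) assumption."""
    gauss = np.exp(-0.5 * ((energy - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    posterior = priors * gauss * obj_lik[:, obj]
    return classes[np.argmax(posterior)], posterior / posterior.sum()

label, post = predict_intent(energy=1.8, obj=1)     # energetic wrist motion, cup in view
print("predicted intent:", "move" if label else "rest", "posterior:", post.round(3))
```

The one design point the sketch captures is the naive conditional-independence assumption: each modality contributes its own likelihood term, so the IMU and camera models can be fitted independently and are simply multiplied into a joint posterior at prediction time.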