Imperial College London

Dr Dandan Zhang

Faculty of Engineering, Department of Bioengineering

Lecturer in Artificial Intelligence & Machine Learning
 
 
 

Contact

 

d.zhang17

 
 

Location

 

402, Sir Michael Uren Hub, White City Campus



 

Publications

Citation

BibTeX format

@inproceedings{Zhang:2021:10.1109/ICRA48506.2021.9561803,
author = {Zhang, D and Wang, R and Lo, B},
booktitle = {2021 IEEE International Conference on Robotics and Automation (ICRA)},
doi = {10.1109/ICRA48506.2021.9561803},
pages = {1350--1356},
publisher = {IEEE},
title = {Surgical gesture recognition based on bidirectional multi-layer independently {RNN} with explainable spatial feature extraction},
url = {http://dx.doi.org/10.1109/ICRA48506.2021.9561803},
year = {2021}
}

RIS format (EndNote, RefMan)

TY  - CPAPER
AB - Minimally invasive surgery mainly consists of a series of sub-tasks, which can be decomposed into basic gestures or contexts. As a prerequisite of autonomous operation, surgical gesture recognition can assist motion planning and decision-making, and build up context-aware knowledge to improve the quality of surgical robot control. In this work, we aim to develop an effective surgical gesture recognition approach with an explainable feature extraction process. A Bidirectional Multi-Layer independently RNN (BML-indRNN) model is proposed in this paper, while spatial feature extraction is implemented via fine-tuning of a Deep Convolutional Neural Network (DCNN) model constructed based on the VGG architecture. To eliminate the black-box effects of the DCNN, Gradient-weighted Class Activation Mapping (Grad-CAM) is employed. It provides explainable results by showing the regions of the surgical images that have a strong relationship with the surgical gesture classification results. The proposed method was evaluated on the suturing task with data obtained from the publicly available JIGSAWS database. Comparative studies were conducted to verify the proposed framework. Results indicated that the testing accuracy for the suturing task based on our proposed method is 87.13%, which outperforms most of the state-of-the-art algorithms.
AU - Zhang,D
AU - Wang,R
AU - Lo,B
DO - 10.1109/ICRA48506.2021.9561803
EP - 1356
PB - IEEE
PY - 2021///
SP - 1350
TI - Surgical gesture recognition based on bidirectional multi-layer independently RNN with explainable spatial feature extraction
UR - http://dx.doi.org/10.1109/ICRA48506.2021.9561803
UR - http://hdl.handle.net/10044/1/88436
ER -
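The abstract describes a Bidirectional Multi-Layer independently RNN (BML-indRNN). The defining idea of an IndRNN layer is that the recurrent weight is a vector rather than a matrix, so each hidden unit only feeds back its own previous state. The sketch below is a minimal NumPy illustration of that recurrence and of the bidirectional pass; all function names, shapes, and initialisations are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def indrnn_layer(x, W, u, b):
    """One IndRNN layer: h_t = relu(W @ x_t + u * h_{t-1} + b).

    Unlike a vanilla RNN, the recurrent weight `u` is a vector, so the
    element-wise product means each hidden unit sees only its own past.
    x: (T, input_dim); W: (hidden, input_dim); u, b: (hidden,).
    Returns the hidden-state sequence of shape (T, hidden).
    """
    T = x.shape[0]
    hidden = W.shape[0]
    h = np.zeros(hidden)
    out = np.empty((T, hidden))
    for t in range(T):
        h = np.maximum(0.0, W @ x[t] + u * h + b)  # ReLU activation
        out[t] = h
    return out

def bidirectional_indrnn(x, params_fwd, params_bwd):
    """Run forward and backward IndRNN passes and concatenate them.

    params_* are (W, u, b) tuples; output shape is (T, 2 * hidden).
    Stacking several such layers would give the multi-layer part of a
    BML-indRNN-style model.
    """
    fwd = indrnn_layer(x, *params_fwd)
    bwd = indrnn_layer(x[::-1], *params_bwd)[::-1]  # reverse time, then restore order
    return np.concatenate([fwd, bwd], axis=1)
```

In the paper's pipeline, the per-frame inputs `x` would be spatial features taken from a fine-tuned VGG-style DCNN rather than raw values, and the concatenated bidirectional states would feed a gesture classifier.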