Imperial College London

Dr Dandan Zhang

Faculty of Engineering, Department of Bioengineering

Lecturer in Artificial Intelligence & Machine Learning
 
 
 

Contact

 

d.zhang17

 
 

Location

 

402, Sir Michael Uren Hub, White City Campus


Publications


40 results found

Fan W, Guo X, Feng E, Lin J, Wang Y, Liang J, Garrad M, Rossiter J, Zhang Z, Lepora N, Wei L, Zhang D et al., 2023, Digital Twin-Driven Mixed Reality Framework for Immersive Teleoperation With Haptic Rendering, IEEE Robotics and Automation Letters, Vol: 8, Pages: 8494-8501

Teleoperation contributes to a wide range of applications, so the design of intuitive and ergonomic control interfaces has become crucial. The rapid advancement of Mixed Reality (MR) has yielded tangible benefits for human-robot interaction. MR provides an immersive environment for interacting with robots, effectively reducing the mental and physical workload of operators during teleoperation. Incorporating haptic rendering, including kinaesthetic and tactile rendering, can further amplify the intuitiveness and efficiency of MR-based immersive teleoperation. In this study, we developed an immersive, bilateral teleoperation system integrating Digital Twin-driven Mixed Reality (DTMR) manipulation with haptic rendering. The system comprises a commercial remote controller with a kinaesthetic rendering feature and a wearable, cost-effective tactile rendering interface called the Soft Pneumatic Tactile Array (SPTA). We carried out two user studies to assess the system's effectiveness, including a performance evaluation of key components within DTMR and a quantitative assessment of the newly developed SPTA. The results demonstrate an enhancement in both the human-robot interaction experience and teleoperation performance.

Journal article

Lin Y, Church A, Yang M, Li H, Lloyd J, Zhang D, Lepora NF et al., 2023, Bi-Touch: Bimanual Tactile Manipulation With Sim-to-Real Deep Reinforcement Learning, IEEE Robotics and Automation Letters, Vol: 8, Pages: 5472-5479, ISSN: 2377-3766

Journal article

Yang M, Lin Y, Church A, Lloyd J, Zhang D, Barton DAW, Lepora NF et al., 2023, Sim-to-Real Model-Based and Model-Free Deep Reinforcement Learning for Tactile Pushing, IEEE Robotics and Automation Letters, Vol: 8, Pages: 5480-5487, ISSN: 2377-3766

Journal article

He Z, Zhang X, Jones S, Hauert S, Zhang D, Lepora NF et al., 2023, TacMMs: Tactile Mobile Manipulators for Warehouse Automation, IEEE Robotics and Automation Letters, Vol: 8, Pages: 4729-4736, ISSN: 2377-3766

Journal article

Zhang D, Gorochowski TE, Marucci L, Lee H-T, Gil B, Li B, Hauert S, Yeatman E et al., 2023, Advanced medical micro-robotics for early diagnosis and therapeutic interventions, Frontiers in Robotics and AI, Vol: 9, Pages: 1-19, ISSN: 2296-9144

Recent technological advances in micro-robotics have demonstrated their immense potential for biomedical applications. Emerging micro-robots have versatile sensing systems, flexible locomotion and dexterous manipulation capabilities that can significantly contribute to the healthcare system. Despite the appreciated and tangible benefits of medical micro-robotics, many challenges still remain. Here, we review the major challenges, current trends and significant achievements for developing versatile and intelligent micro-robotics with a focus on applications in early diagnosis and therapeutic interventions. We also consider some recent emerging micro-robotic technologies that employ synthetic biology to support a new generation of living micro-robots. We expect to inspire future development of micro-robots toward clinical translation by identifying the roadblocks that need to be overcome.

Journal article

Fan W, Yang M, Xing Y, Lepora NF, Zhang D et al., 2023, Tac-VGNN: A Voronoi Graph Neural Network for Pose-Based Tactile Servoing, Pages: 10373-10379, ISSN: 1050-4729

Tactile pose estimation and tactile servoing are fundamental capabilities of robot touch. Reliable and precise pose estimation can be provided by applying deep learning models to high-resolution optical tactile sensors. Given the recent successes of Graph Neural Networks (GNNs) and the effectiveness of Voronoi features, we developed a Tactile Voronoi Graph Neural Network (Tac-VGNN) to achieve reliable pose-based tactile servoing relying on a biomimetic optical tactile sensor (TacTip). The GNN is well suited to modelling the distribution relationship between shear motions of the tactile markers, while the Voronoi diagram supplements this with area-based tactile features related to contact depth. The experimental results showed that the Tac-VGNN model significantly enhances data interpretability during graph generation and improves model training efficiency compared with CNN-based methods. It also improved pose estimation accuracy along the vertical depth by 28.57% over a vanilla GNN without Voronoi features, and achieved better performance on real surface-following tasks with smoother robot control trajectories. For more project details, please view our website: https://sites.google.com/view/tac-vgnn/home

Conference paper

Zhang D, Fan W, Lloyd J, Yang C, Lepora NF et al., 2022, One-Shot Domain-Adaptive Imitation Learning via Progressive Learning Applied to Robotic Pouring, IEEE Transactions on Automation Science and Engineering, ISSN: 1545-5955

Journal article

Zhang D, Ren Y, Barbot A, Seichepine F, Lo B, Ma Z-C, Yang G-Z et al., 2022, Fabrication and optical manipulation of micro-robots for biomedical applications, Matter, Vol: 5, Pages: 3135-3160, ISSN: 2590-2393

Journal article

Zhang D, Li Q, Zheng Y, Wei L, Zhang D, Zhang Z et al., 2022, Explainable Hierarchical Imitation Learning for Robotic Drink Pouring, IEEE Transactions on Automation Science and Engineering, Vol: 19, Pages: 3871-3887, ISSN: 1545-5955

Journal article

Zhang D, Si W, Fan W, Guan Y, Yang C et al., 2022, From Teleoperation to Autonomous Robot-assisted Microsurgery: A Survey, Machine Intelligence Research, Vol: 19, Pages: 288-306, ISSN: 2731-538X

Journal article

Ren Y, Keshavarz M, Anastasova S, Ghazal H, Lo B, Zhang D et al., 2022, Machine Learning-Based Real-Time Localisation and Automatic Trapping of Multiple Microrobots in Optical Tweezer, International Conference on Manipulation, Automation and Robotics at Small Scales (MARSS 2022)

Conference paper

Zhang D, Wu Z, Chen J, Zhu R, Munawar A, Xiao B, Guan Y, Su H, Hong W, Guo Y, Fischer GS, Lo B, Yang G-Z et al., 2022, Human-robot shared control for surgical robot based on context-aware sim-to-real adaptation, 2022 IEEE International Conference on Robotics and Automation (ICRA), Publisher: IEEE, Pages: 7701-7707

Human-robot shared control, which integrates the advantages of both humans and robots, is an effective approach to facilitate efficient surgical operation. Learning from Demonstration (LfD) techniques can be used to automate some surgical sub-tasks for the construction of the shared control mechanism. However, a sufficient amount of data is required for the robot to learn the manoeuvres, and using a surgical simulator to collect data is a less resource-demanding approach. With sim-to-real adaptation, the manoeuvres learned from a simulator can be transferred to a physical robot. To this end, we propose a sim-to-real adaptation method to construct a human-robot shared control framework for robotic surgery. In this paper, a desired trajectory is generated from a simulator using an LfD method, while Dynamic Motion Primitives (DMPs) are used to transfer the desired trajectory from the simulator to the physical robotic platform. Moreover, a role adaptation mechanism is developed so that the robot can adjust its role according to the surgical operation contexts predicted by a neural network model. The effectiveness of the proposed framework is validated on the da Vinci Research Kit (dVRK). Results of the user studies indicated that, with the adaptive human-robot shared control framework, the path length of the remote controller, the total number of clutching events and the task completion time can be reduced significantly. The proposed method outperformed traditional manual control via teleoperation.

Conference paper

Su H, Qi W, Chen J, Zhang D et al., 2022, Fuzzy Approximation-Based Task-Space Control of Robot Manipulators With Remote Center of Motion Constraint, IEEE Transactions on Fuzzy Systems, Vol: 30, Pages: 1564-1573, ISSN: 1063-6706

Journal article

Zhang D, Barbot A, Seichepine F, Lo FP-W, Bai W, Yang G-Z, Lo B et al., 2022, Micro-object pose estimation with sim-to-real transfer learning using small dataset, Communications Physics, Vol: 5, ISSN: 2399-3650

Journal article

Fan W, Bo H, Lin Y, Xing Y, Liu W, Lepora N, Zhang D et al., 2022, Graph Neural Networks for Interpretable Tactile Sensing

Fine-grained tactile perception of objects is important for robots exploring unstructured environments. Recent years have seen the success of Convolutional Neural Network (CNN)-based methods for tactile perception using high-resolution optical tactile sensors. However, CNN-based approaches may not be efficient for processing tactile image data and have limited interpretability. To this end, we propose a Graph Neural Network (GNN)-based approach for tactile recognition using a soft biomimetic optical tactile sensor. The obtained tactile images can be transformed into graphs, and the GNN can then be used to analyse the implicit tactile information within these tactile graphs. The experimental results indicate that with the proposed GNN-based method, the maximum tactile recognition accuracy can reach 99.53%. In addition, Gradient-weighted Class Activation Mapping (Grad-CAM) and Unsigned Grad-CAM (UGrad-CAM) methods are used for visual explanations of the models. Compared to traditional CNNs, we demonstrated that the generated features of the GNN-based model are more intuitive and interpretable.

Conference paper

Li W, Zhang D, Yang G-Z, Lo B et al., 2022, Design and Modelling of A Spring-Like Continuum Joint with Variable Pitch for Endoluminal Surgery, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Publisher: IEEE, Pages: 41-47, ISSN: 2153-0858

Conference paper

Zhang D, Wang R, Lo B, 2021, Surgical gesture recognition based on bidirectional multi-layer independently RNN with explainable spatial feature extraction, IEEE International Conference on Robotics and Automation (ICRA) 2021, Publisher: IEEE, Pages: 1350-1356

Minimally invasive surgery mainly consists of a series of sub-tasks, which can be decomposed into basic gestures or contexts. As a prerequisite for autonomous operation, surgical gesture recognition can assist motion planning and decision-making, and build up context-aware knowledge to improve surgical robot control quality. In this work, we aim to develop an effective surgical gesture recognition approach with an explainable feature extraction process. A Bidirectional Multi-Layer independently RNN (BML-indRNN) model is proposed in this paper, while spatial feature extraction is implemented via fine-tuning of a Deep Convolutional Neural Network (DCNN) model constructed on the VGG architecture. To eliminate the black-box effects of the DCNN, Gradient-weighted Class Activation Mapping (Grad-CAM) is employed. It provides explainable results by showing the regions of the surgical images that have a strong relationship with the surgical gesture classification results. The proposed method was evaluated on the suturing task with data obtained from the publicly available JIGSAWS database. Comparative studies were conducted to verify the proposed framework. Results indicated that the testing accuracy for the suturing task based on our proposed method is 87.13%, which outperforms most state-of-the-art algorithms.

Conference paper

Gao A, Murphy RR, Chen W, Dagnino G, Fischer P, Gutierrez MG, Kundrat D, Nelson BJ, Shamsudhin N, Su H, Xia J, Zemmar A, Zhang D, Wang C, Yang G-Z et al., 2021, Progress in robotics for combating infectious diseases, Science Robotics, Vol: 6, ISSN: 2470-9476

Journal article

Payne CJ, Vyas K, Bautista-Salinas D, Zhang D, Marcus HJ, Yang G-Z et al., 2021, Shared-control robots, Neuromethods, Pages: 63-79

This chapter reviews shared-control robots, a class of robotic devices in which the surgeon and the robot jointly manipulate the surgical tool. The shared-control approach seeks to exploit the superior aspects of humans and machines, enabling more precise interventions while ensuring the human surgeon retains executive control. Much of the technology discussed in this chapter is emerging research, and many of the described systems have been developed for generic microsurgical interventions. Nonetheless, the broad concepts behind these surgical systems are highly applicable to neurosurgery and particularly to microsurgical procedures. We start by presenting an exemplar of a grounded, shared-control robot: the Steady-Hand system. We then review a series of handheld smart surgical devices, including Micron, a handheld tremor cancellation device. This chapter also presents handheld devices capable of augmenting haptic feedback to surgeons performing delicate neurosurgical tasks, image-guided handheld devices with embedded robotic actuation, and a new generation of handheld microscopic imaging devices for visualizing tumors.

Book chapter

Wang R, Zhang D, Li Q, Zhou X-Y, Lo B et al., 2021, Real-time Surgical Environment Enhancement for Robot-Assisted Minimally Invasive Surgery Based on Super-Resolution, IEEE International Conference on Robotics and Automation (ICRA), Publisher: IEEE, Pages: 3434-3440, ISSN: 1050-4729

Conference paper

Chen J, Zhang D, Munawar A, Zhu R, Lo B, Fischer GS, Yang G-Z et al., 2020, Supervised semi-autonomous control for surgical robot based on Bayesian optimization, 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Publisher: IEEE, Pages: 2943-2949

The recent development of Robot-Assisted Minimally Invasive Surgery (RAMIS) has brought many benefits, easing the performance of complex Minimally Invasive Surgery (MIS) tasks and leading to better clinical outcomes. Compared to direct master-slave manipulation, semi-autonomous control of the surgical robot can enhance the efficiency of the operation, particularly for repetitive tasks. However, operating in a highly dynamic in-vivo environment is complex, so supervisory control functions should be included to ensure flexibility and safety during the autonomous control phase. This paper presents a haptic rendering interface to enable supervised semi-autonomous control for a surgical robot. Bayesian optimization is used to tune user-specific parameters during the surgical training process. User studies were conducted on a customized simulator for validation. Detailed comparisons are made between operation with and without the supervised semi-autonomous control mode in terms of the number of clutching events, task completion time, master robot end-effector trajectory and average control speed of the slave robot. The effectiveness of the Bayesian optimization is also evaluated, demonstrating that the optimized parameters can significantly improve users' performance. Results indicate that the proposed control method can reduce the operator's workload and enhance operation efficiency.

Conference paper

Zhang D, Lo FP-W, Zheng J-Q, Bai W, Yang G-Z, Lo B et al., 2020, Data-driven microscopic pose and depth estimation for optical microrobot manipulation, ACS Photonics, Vol: 7, Pages: 3003-3014, ISSN: 2330-4022

Optical microrobots have a wide range of applications in biomedical research for both in vitro and in vivo studies. In most microrobotic systems, the video captured by a monocular camera is the only way for visualizing the movements of microrobots, and only planar motion, in general, can be captured by a monocular camera system. Accurate depth estimation is essential for 3D reconstruction or autofocusing of microplatforms, while the pose and depth estimation are necessary to enhance the 3D perception of the microrobotic systems to enable dexterous micromanipulation and other tasks. In this paper, we propose a data-driven method for pose and depth estimation in an optically manipulated microrobotic system. Focus measurement is used to obtain features for Gaussian Process Regression (GPR), which enables precise depth estimation. For mobile microrobots with varying poses, a novel method is developed based on a deep residual neural network with the incorporation of prior domain knowledge about the optical microrobots encoded via GPR. The method can simultaneously track microrobots with complex shapes and estimate the pose and depth values of the optical microrobots. Cross-validation has been conducted to demonstrate the submicron accuracy of the proposed method and precise pose and depth perception for microrobots. We further demonstrate the generalizability of the method by adapting it to microrobots of different shapes using transfer learning with few-shot calibration. Intuitive visualization is provided to facilitate effective human-robot interaction during micromanipulation based on pose and depth estimation results.

Journal article

Zhang D, Barbot A, Lo B, Yang G-Z et al., 2020, Distributed force control for microrobot manipulation via planar multi-spot optical tweezer, Advanced Optical Materials, Vol: 8, Pages: 1-15, ISSN: 2195-1071

Optical tweezers (OT) represent a versatile tool for micro-manipulation. To avoid damage to living cells caused by shining the laser directly on them, microrobots controlled by OT can be used to manipulate cells or living organisms at the microscopic scale. Translation and planar rotation of microrobots can be realized using a multi-spot planar OT; however, out-of-plane manipulation of microrobots is difficult to achieve with a planar OT. This paper presents a distributed manipulation scheme based on multiple laser spots, which can control the out-of-plane pose of a microrobot along multiple axes. Different microrobot designs have been investigated and fabricated for experimental validation. The main contributions of this paper include: i) development of a generic model for the structural design of microrobots which enables multi-dimensional (6D) control via a conventional multi-spot OT; ii) introduction of distributed force control for microrobot manipulation based on characteristic distance and power intensity distribution. Experiments are performed to demonstrate the effectiveness of the proposed method and its potential applications, which include indirect manipulation of micro-objects.

Journal article

Zhang D, Wu Z, Chen J, Gao A, Chen X, Li P, Wang Z, Yang G, Lo B, Yang G-Z et al., 2020, Automatic microsurgical skill assessment based on cross-domain transfer learning, IEEE Robotics and Automation Letters, Vol: 5, Pages: 4148-4155, ISSN: 2377-3766

The assessment of microsurgical skills for Robot-Assisted Microsurgery (RAMS) still relies primarily on subjective observations and expert opinions; a general and automated evaluation method is desirable. Deep neural networks can be used for skill assessment from raw kinematic data, which has the advantages of being objective and efficient. However, one of the major issues of deep learning for the analysis of surgical skills is that it requires a large database to train the desired model, and the training process can be time-consuming. This letter presents a transfer learning scheme for training a model with limited RAMS datasets for microsurgical skill assessment. An in-house Microsurgical Robot Research Platform Database (MRRPD) is built with data collected from a microsurgical robot research platform (MRRP). It is used to verify the proposed cross-domain transfer learning for RAMS skill level assessment. The model is fine-tuned after training with the data obtained from the MRRP. Moreover, microsurgical tool tracking is developed to provide visual feedback, while task-specific metrics and other general evaluation metrics are provided to the operator as a reference. The proposed method has been shown to offer the potential to guide the operator to achieve a higher level of skill in microsurgical operation.

Journal article

Zhang D, Liu J, Gao A, Yang G-Z et al., 2020, An Ergonomic Shared Workspace Analysis Framework for the Optimal Placement of a Compact Master Control Console, IEEE Robotics and Automation Letters, Vol: 5, Pages: 2995-3002, ISSN: 2377-3766

Journal article

Zhang D, Liu J, Zhang L, Yang G-Z et al., 2020, Hamlyn CRM: a compact master manipulator for surgical robot remote control, International Journal of Computer Assisted Radiology and Surgery, Vol: 15, Pages: 503-514, ISSN: 1861-6410

Journal article

Zhang D, Chen J, Li W, Salinas DB, Yang G-Z et al., 2020, A microsurgical robot research platform for robot-assisted microsurgery research and training, International Journal of Computer Assisted Radiology and Surgery, Vol: 15, Pages: 15-25, ISSN: 1861-6410

Journal article

Lu B, Chen W, Jin Y-M, Zhang D, Dou Q, Chu HK, Heng P-A, Liu Y-H et al., 2020, A Learning-Driven Framework with Spatial Optimization For Surgical Suture Thread Reconstruction and Autonomous Grasping Under Multiple Topologies and Environmental Noises, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Publisher: IEEE, Pages: 3075-3082, ISSN: 2153-0858

Conference paper

Shen Y, Lai W, Zhang D, Wan R, Hong K et al., 2019, The Ubiquitin-Like Protein Fat10 Attenuates Cardiac Hypertrophy by Promoting the Lysosomal Degradation of CaMK2D, Scientific Sessions of the American Heart Association, Publisher: Lippincott Williams & Wilkins, ISSN: 0009-7322

Conference paper

Zhang D, Cursi F, Yang G-Z, 2019, WSRender: A Workspace Analysis and Visualization Toolbox for Robotic Manipulator Design and Verification, IEEE Robotics and Automation Letters, Vol: 4, Pages: 3836-3843, ISSN: 2377-3766

Journal article

This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.
