Imperial College London

Dr Dandan Zhang

Faculty of Engineering, Department of Bioengineering

Lecturer in Artificial Intelligence & Machine Learning

Contact

 

d.zhang17


Location

 

402, Sir Michael Uren Hub, White City Campus



Publications

Citation

BibTeX format

@article{Zhang:2020:10.1021/acsphotonics.0c00997,
author = {Zhang, D and Lo, FP-W and Zheng, J-Q and Bai, W and Yang, G-Z and Lo, B},
doi = {10.1021/acsphotonics.0c00997},
journal = {ACS Photonics},
pages = {3003--3014},
title = {Data-driven microscopic pose and depth estimation for optical microrobot manipulation},
url = {http://dx.doi.org/10.1021/acsphotonics.0c00997},
volume = {7},
year = {2020}
}

RIS format (EndNote, RefMan)

TY  - JOUR
AB - Optical microrobots have a wide range of applications in biomedical research for both in vitro and in vivo studies. In most microrobotic systems, the video captured by a monocular camera is the only way for visualizing the movements of microrobots, and only planar motion, in general, can be captured by a monocular camera system. Accurate depth estimation is essential for 3D reconstruction or autofocusing of microplatforms, while the pose and depth estimation are necessary to enhance the 3D perception of the microrobotic systems to enable dexterous micromanipulation and other tasks. In this paper, we propose a data-driven method for pose and depth estimation in an optically manipulated microrobotic system. Focus measurement is used to obtain features for Gaussian Process Regression (GPR), which enables precise depth estimation. For mobile microrobots with varying poses, a novel method is developed based on a deep residual neural network with the incorporation of prior domain knowledge about the optical microrobots encoded via GPR. The method can simultaneously track microrobots with complex shapes and estimate the pose and depth values of the optical microrobots. Cross-validation has been conducted to demonstrate the submicron accuracy of the proposed method and precise pose and depth perception for microrobots. We further demonstrate the generalizability of the method by adapting it to microrobots of different shapes using transfer learning with few-shot calibration. Intuitive visualization is provided to facilitate effective human-robot interaction during micromanipulation based on pose and depth estimation results.
AU - Zhang,D
AU - Lo,FP-W
AU - Zheng,J-Q
AU - Bai,W
AU - Yang,G-Z
AU - Lo,B
DO - 10.1021/acsphotonics.0c00997
EP - 3014
PY - 2020///
SN - 2330-4022
SP - 3003
TI - Data-driven microscopic pose and depth estimation for optical microrobot manipulation
T2 - ACS Photonics
UR - http://dx.doi.org/10.1021/acsphotonics.0c00997
UR - http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000592916800009&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=1ba7043ffcc86c417c072aa74d649202
UR - https://pubs.acs.org/doi/10.1021/acsphotonics.0c00997
UR - http://hdl.handle.net/10044/1/88045
VL - 7
ER -
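
The abstract above describes focus-measure features being fed to Gaussian Process Regression (GPR) for depth estimation. The following is a minimal, self-contained sketch of that idea in Python, not the authors' code: it uses synthetic defocused images, a variance-of-Laplacian focus measure, and a scikit-learn GPR that maps the feature to depth magnitude. All data, parameter values, and helper names here are illustrative assumptions.

import numpy as np
from scipy.ndimage import gaussian_filter, laplace
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel, WhiteKernel


def focus_measure(image):
    """Variance of the Laplacian: larger when the image is sharply focused."""
    return float(laplace(image.astype(np.float64)).var())


rng = np.random.default_rng(0)
base = rng.random((64, 64))  # synthetic texture standing in for a microrobot image

# Simulate a focal sweep: blur grows with distance from the focal plane.
depths = np.linspace(-10.0, 10.0, 60)  # synthetic depth values (e.g. micrometres)
X = np.array([[focus_measure(gaussian_filter(base, sigma=abs(z) / 2.0 + 0.1))]
              for z in depths])
y = np.abs(depths)  # a single focus measure is symmetric about the focal plane

# GPR with an RBF kernel plus a noise term; hyperparameters are fitted automatically.
kernel = ConstantKernel(1.0) * RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-3)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gpr.fit(X, y)

# Estimate depth (with uncertainty) for a new, unseen defocus level.
x_new = [[focus_measure(gaussian_filter(base, sigma=2.0))]]
pred, std = gpr.predict(x_new, return_std=True)
print(f"estimated |depth|: {pred[0]:.2f} (+/- {std[0]:.2f})")

The sketch regresses depth magnitude only, since a single focus measure cannot distinguish above-focus from below-focus positions; the paper combines such features with a deep residual network and prior knowledge encoded via GPR to recover full pose and depth, which is beyond this toy example.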