Imperial College London

Professor Yiannis Demiris

Faculty of Engineering, Department of Electrical and Electronic Engineering

Professor of Human-Centred Robotics, Head of ISN
 
 
 

Contact

 

+44 (0)20 7594 6300
y.demiris

 
 

Location

 

1011, Electrical Engineering, South Kensington Campus



Publications

Citation

BibTeX format

@inproceedings{Zhang:2020:10.1109/icra40945.2020.9196994,
author = {Zhang, F and Demiris, Y},
doi = {10.1109/icra40945.2020.9196994},
pages = {9114--9120},
publisher = {IEEE},
title = {Learning grasping points for garment manipulation in robot-assisted dressing},
url = {http://dx.doi.org/10.1109/icra40945.2020.9196994},
year = {2020}
}

RIS format (EndNote, RefMan)

TY  - CPAPER
AB - Assistive robots have the potential to provide tremendous support for disabled and elderly people in their daily dressing activities. Recent studies on robot-assisted dressing usually simplify the setup of the initial robot configuration by manually attaching the garments on the robot end-effector and positioning them close to the user's arm. A fundamental challenge in automating such a process for robots is computing suitable grasping points on garments that facilitate robotic manipulation. In this paper, we address this problem by introducing a supervised deep neural network to locate a predefined grasping point on the garment, using depth images for their invariance to color and texture. To reduce the amount of real data required, which is costly to collect, we leverage the power of simulation to produce large amounts of labeled data. The network is jointly trained with synthetic datasets of depth images and a limited amount of real data. We introduce a robot-assisted dressing system that combines the grasping point prediction method, with a grasping and manipulation strategy which takes grasping orientation computation and robot-garment collision avoidance into account. The experimental results demonstrate that our method is capable of yielding accurate grasping point estimations. The proposed dressing system enables the Baxter robot to autonomously grasp a hospital gown hung on a rail, bring it close to the user and successfully dress the upper-body.
AU - Zhang,F
AU - Demiris,Y
DO - 10.1109/icra40945.2020.9196994
EP - 9120
PB - IEEE
PY - 2020///
SP - 9114
TI - Learning grasping points for garment manipulation in robot-assisted dressing
UR - http://dx.doi.org/10.1109/icra40945.2020.9196994
UR - https://ieeexplore.ieee.org/document/9196994
UR - http://hdl.handle.net/10044/1/83952
ER -
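The abstract above describes a network that maps a depth image to a predefined grasping point on the garment. As a rough illustration only (this is not the authors' architecture; all layer shapes, kernel sizes, and names here are invented assumptions), a depth-image-to-pixel-coordinate regressor can be sketched as a small convolutional forward pass:

```python
# Toy sketch (NOT the paper's model): a tiny convolutional regressor
# mapping a single-channel depth image to a predicted (u, v) grasping
# point in pixel coordinates. Layer sizes are illustrative assumptions.
import numpy as np

def conv2d(x, w, stride=2):
    """Valid 2-D convolution of an HxW image with a kxk kernel."""
    k = w.shape[0]
    h_out = (x.shape[0] - k) // stride + 1
    w_out = (x.shape[1] - k) // stride + 1
    out = np.empty((h_out, w_out))
    for i in range(h_out):
        for j in range(w_out):
            patch = x[i*stride:i*stride+k, j*stride:j*stride+k]
            out[i, j] = np.sum(patch * w)
    return out

def predict_grasp_point(depth, params):
    """Forward pass: conv -> ReLU -> conv -> ReLU -> flatten -> linear."""
    h = np.maximum(conv2d(depth, params["w1"]), 0.0)   # 64x64 -> 30x30
    h = np.maximum(conv2d(h, params["w2"]), 0.0)       # 30x30 -> 13x13
    flat = h.reshape(-1)                               # 169 features
    return params["w3"] @ flat + params["b3"]          # (u, v)

rng = np.random.default_rng(0)
depth = rng.random((64, 64))        # stand-in for a real depth image
params = {
    "w1": rng.standard_normal((5, 5)) * 0.1,
    "w2": rng.standard_normal((5, 5)) * 0.1,
    "w3": rng.standard_normal((2, 13 * 13)) * 0.01,
    "b3": np.zeros(2),
}
uv = predict_grasp_point(depth, params)
print(uv.shape)  # (2,)
```

In the paper such a model is trained supervised on labeled depth images, with large synthetic datasets supplementing a limited amount of real data; the random weights here merely stand in for learned parameters.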