Personalized Robot-Assisted Dressing
RESEARCH SUMMARY: Assistive robots have the potential to provide tremendous support for disabled and elderly people in their daily dressing activities. Recent studies on robot-assisted dressing usually simplify the initial robot configuration by manually attaching the garment to the robot's end-effector and positioning it close to the user's arm. A fundamental challenge in automating this process is computing suitable grasping points on garments that facilitate robotic manipulation. In this paper, we address this problem by introducing a supervised deep neural network that locates a pre-defined grasping point on the garment from depth images, chosen for their invariance to color and texture. To reduce the amount of real data required, which is costly to collect, we leverage simulation to produce large amounts of labeled data, and train the network jointly on synthetic depth images and a limited amount of real data. We introduce a robot-assisted dressing system that combines this grasping point prediction method with a grasping and manipulation strategy that accounts for grasping orientation and robot-garment collision avoidance. The experimental results demonstrate that our method yields accurate grasping point estimates. The proposed dressing system enables the Baxter robot to autonomously grasp a hospital gown hung on a rail, bring it close to the user, and dress the user's upper body.
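The idea of predicting a labeled grasping point from depth images, trained on a large synthetic set mixed with a small real set, can be illustrated with a minimal NumPy sketch. Everything here is a stand-in: the "depth images" are noisy 8x8 arrays with one salient pixel marking the grasp point (the paper renders garments in simulation), and a linear least-squares regressor replaces the deep network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "depth images" (8x8): low-amplitude noise plus one salient pixel
# that plays the role of the labeled grasping point.
def make_batch(n, size=8):
    imgs = rng.random((n, size, size)) * 0.1
    peaks = rng.integers(0, size, size=(n, 2))
    for img, (r, c) in zip(imgs, peaks):
        img[r, c] = 1.0
    return imgs.reshape(n, -1), peaks / size   # normalized (row, col) labels

# Joint training set: many "synthetic" images plus a small "real" set.
X_syn, y_syn = make_batch(800)
X_real, y_real = make_batch(80)
X = np.vstack([X_syn, X_real])
y = np.vstack([y_syn, y_real])

# Linear least-squares regressor stands in for the deep network.
A = np.hstack([X, np.ones((len(X), 1))])       # add a bias column
W, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(X_flat):
    """Predict normalized (row, col) grasping points for flat images."""
    return np.hstack([X_flat, np.ones((len(X_flat), 1))]) @ W

X_test, y_test = make_batch(100)
err = np.abs(predict(X_test) - y_test).mean()  # mean abs error, image units
```

The mixing step mirrors the paper's joint training: the large synthetic pool dominates the fit, while the small real batch anchors it to real-sensor statistics.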
RESEARCH SUMMARY: Robotic solutions to dressing assistance have the potential to provide tremendous support for elderly and disabled people. However, unexpected user movements may lead to dressing failures or even pose a risk to the user, and tracking such movements with vision sensors is challenging due to the severe visual occlusions created by the robot and the clothes. We propose a probabilistic tracking method using Bayesian networks in latent spaces, which fuses robot end-effector positions and force information to enable camera-less, real-time estimation of user postures during dressing. The latent spaces are created before dressing by modeling the user's movements with a Gaussian Process Latent Variable Model, taking the user's movement limitations into account. We introduce a robot-assisted dressing system that combines our tracking method with hierarchical multi-task control to minimize the force between the user and the robot. The experimental results demonstrate the robustness and accuracy of our tracking method, which enables the Baxter robot to provide personalized dressing assistance in putting on a sleeveless jacket for users with (simulated) upper-body impairments.
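The fusion of end-effector position and force readings for camera-less posture tracking can be sketched as a small particle filter over a 2-D latent posture space. This is only an illustration under stated assumptions: the linear maps `A_pos` and `w_frc` are hypothetical stand-ins for the learned GP-LVM decoders, and the simulated sinusoidal latent trajectory stands in for real arm motion.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear decoders standing in for the GP-LVM mappings from the
# 2-D latent posture z to observations: 3-D end-effector position and a
# scalar contact-force reading.
A_pos = rng.normal(size=(3, 2))
w_frc = np.array([0.8, -0.5])
POS_STD, FRC_STD, PROC_STD = 0.05, 0.1, 0.1

def observe(z):
    """Simulate noisy position and force measurements for latent posture z."""
    pos = A_pos @ z + rng.normal(scale=POS_STD, size=3)
    frc = w_frc @ z + rng.normal(scale=FRC_STD)
    return pos, frc

# Particle filter over the latent space, fusing both modalities.
n_p = 500
particles = rng.normal(size=(n_p, 2))

def step(particles, pos, frc):
    # Propagate with a random-walk motion model.
    particles = particles + rng.normal(scale=PROC_STD, size=particles.shape)
    pred_pos = particles @ A_pos.T               # (n_p, 3)
    pred_frc = particles @ w_frc                 # (n_p,)
    # Weight by the Gaussian likelihood of both observations.
    logw = -0.5 * (((pred_pos - pos) ** 2).sum(1) / POS_STD**2
                   + (pred_frc - frc) ** 2 / FRC_STD**2)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    z_est = (w[:, None] * particles).sum(0)      # weighted posture estimate
    return particles[rng.choice(n_p, size=n_p, p=w)], z_est

# Track a slow sinusoidal latent "arm movement".
errs = []
for t in range(50):
    z_true = np.array([np.sin(0.1 * t), np.cos(0.1 * t)])
    pos, frc = observe(z_true)
    particles, z_est = step(particles, pos, frc)
    errs.append(np.linalg.norm(z_est - z_true))
final_err = np.mean(errs[-10:])
```

Because the latent space is low-dimensional, a modest number of particles suffices, which is what makes real-time, camera-less tracking plausible in this setting.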
- Zhang F, Demiris Y (2020). Data-Efficient Garment Grasping and Manipulation for Robot-Assisted Dressing. International Conference on Robotics and Automation (ICRA 2020).
- Zhang F, Cully A, Demiris Y (2019). Probabilistic Real-Time User Posture Tracking for Personalized Robot-Assisted Dressing. IEEE Transactions on Robotics, 35(4): 873-888.
- Zhang F, Cully A, Demiris Y (2017). Personalized Robot-assisted Dressing using User Modeling in Latent Spaces. International Conference on Intelligent Robots and Systems (IROS 2017), pp. 3603-3610.
- Gao Y, Chang HJ, Demiris Y (2016). Iterative Path Optimisation for Personalised Dressing Assistance using Vision and Force Information. IEEE International Conference on Intelligent Robots and Systems (IROS 2016), pp. 4398-4403.
- Gao Y, Chang HJ, Demiris Y (2015). User Modelling for Personalised Dressing Assistance by Humanoid Robots. IEEE International Conference on Intelligent Robots and Systems (IROS 2015), pp. 1840-1845.