Haptic communication between humans and with robots

Physical assistance enabled by haptic interaction is a fundamental means of improving motor abilities, from a parent guiding their child's first steps to a therapist supporting a patient. In research carried out with Ganesh Gowrishankar, Atsushi Takagi, Yanan Li and other colleagues, we investigate how humans communicate through haptic cues when, for example, moving objects together, and how robots should help human users carry out such tasks.

Atsushi first showed that social factors influence force perception (Scientific Reports 2016). He further demonstrated that joint reaching movements are not appropriate for studying haptic communication, as both partners can largely use feedforward control to carry them out (PLoS ONE 2016). By examining the behaviours of two individuals whose right hands are physically connected, we revealed how physical interaction with a partner changes one's own motor behaviour. In particular, Ganesh and I showed that one improves when interacting with a better and even with a worse partner (Scientific Reports 2014), which suggests advantages of interactive paradigms for sports training and physical rehabilitation.

Furthermore, with Atsushi we showed through computational modelling that haptic information provided by touch and proprioception enables one to estimate the partner's movement plan and use it to improve one's own motor performance (Nature Human Behaviour 2017). We experimentally verified our model by embodying it in a robot partner, and checked that it induces the same improvements in motor performance in a human individual as interacting with a human partner. Atsushi further elucidated how the interaction mechanics influence haptic communication (PLoS Computational Biology 2018) and illustrated the advantages of haptic communication with multiple partners (eLife 2019).
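The core computation behind such partner-goal estimation can be illustrated with a minimal cue-combination sketch. This is a simplification under assumed numbers, not the published model: one's own noisy estimate of the target is fused with a haptically derived estimate of the partner's plan by inverse-variance weighting, which always yields a lower-variance estimate than either cue alone.

```python
import numpy as np

def fuse_estimates(own_est, own_var, haptic_est, haptic_var):
    """Inverse-variance weighting of two independent noisy estimates
    of the same target position (standard cue-combination rule)."""
    w = haptic_var / (own_var + haptic_var)        # weight on one's own cue
    fused = w * own_est + (1 - w) * haptic_est
    fused_var = (own_var * haptic_var) / (own_var + haptic_var)
    return fused, fused_var

rng = np.random.default_rng(0)
target = 0.1                        # true target position (m), illustrative
own_var, haptic_var = 4e-4, 9e-4    # assumed sensory noise variances (m^2)

own_est = target + rng.normal(0, np.sqrt(own_var))
haptic_est = target + rng.normal(0, np.sqrt(haptic_var))
fused, fused_var = fuse_estimates(own_est, own_var, haptic_est, haptic_var)

# The fused variance is below both single-cue variances,
# i.e. the haptic cue improves the estimate even from a noisier partner.
assert fused_var < min(own_var, haptic_var)
```

This variance reduction is one way to understand why interacting even with a worse partner can improve performance: any independent cue, however noisy, lowers the fused estimate's variance.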

With Yanan Li, we are currently developing a game-theoretic framework to explain these results and how humans modulate their haptic interaction (PLoS ONE 2012, Nature Machine Intelligence 2019). Altogether, these results provide important new insights into the neural mechanisms of physical interaction in humans, and promise versatile collaborative robot systems for human-like assistance.
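The game-theoretic idea can be sketched in a toy scalar form (an assumed discrete-time example with made-up weights, not the formulation of the Nature Machine Intelligence 2019 paper): two agents share control of one state, each minimises its own quadratic cost, and iterating best-response LQR solutions settles on a stable pair of feedback gains.

```python
import numpy as np

def dlqr_gain(A, B, Q, R, iters=500):
    """Discrete-time LQR gain for a scalar system via Riccati iteration."""
    P = Q
    for _ in range(iters):
        K = (A * B * P) / (R + B * B * P)
        P = Q + A * A * P - A * B * P * K
    return K

# Hypothetical shared task: object position x, jointly driven by
# human input u_h and robot input u_r:  x[k+1] = A x[k] + B u_h + B u_r.
A, B = 1.0, 0.1
Q_h, R_h = 1.0, 0.1     # assumed human cost weights (cares about the target)
Q_r, R_r = 0.5, 0.1     # assumed robot cost weights

K_h = K_r = 0.0
for _ in range(100):    # iterate best responses toward a feedback Nash pair:
    # each agent solves an LQR treating the other's gain as part of the plant
    K_h = dlqr_gain(A - B * K_r, B, Q_h, R_h)
    K_r = dlqr_gain(A - B * K_h, B, Q_r, R_r)

closed_loop = A - B * K_h - B * K_r
assert abs(closed_loop) < 1.0   # the joint controller stabilises the task
```

In this spirit, a robot can identify the human's effective cost weights from the measured interaction forces and adapt its own gain accordingly, rather than assuming a fixed role of leader or follower.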

Related publications

  • A Melendez, L Bagutti, B Pedrono and E Burdet (2011), A versatile dual-wrist device to study human-human interaction and bimanual control. Proc IEEE/RSJ Int Conf on Intelligent Robots and Systems (IROS) 2578-83.
  • N Jarrasse, T Charalambous and E Burdet (2012), A framework to describe, analyze and generate interactive motor behaviors. PLoS ONE 7(11): e49945.
  • G Ganesh, A Takagi, R Osu, T Yoshioka, M Kawato and E Burdet (2014), Two is better than one: Physical interactions improve motor performance in humans. Scientific Reports 4: 3824.
  • A Melendez-Calderon, V Komisar and E Burdet (2015), Interpersonal strategies for disturbance attenuation during a rhythmic joint motor action. Physiology and Behavior 147: 348-58.
  • A Takagi, C Bagnato and E Burdet (2016), Facing the partner influences tit-for-tat exchanges in force. Scientific Reports 6: 35397.
  • A Takagi, N Beckers and E Burdet (2016), Motion plan changes predictably in dyadic reaching. PLoS ONE 11(12).
  • A Takagi, G Ganesh, T Yoshioka, M Kawato and E Burdet (2017), Physically interacting individuals estimate the partner's goal to enhance their movements. Nature Human Behaviour 1: 0054.
  • A Takagi, F Usai, G Ganesh, V Sanguineti and E Burdet (2018), Haptic communication between humans is tuned by the hard or soft mechanics of interaction. PLoS Computational Biology 14(3): e1005971.
  • Y Li, G Carboni, F Gonzalez, D Campolo and E Burdet (2019), How a robot can understand and adapt to human action - differential game theory for versatile physical interaction. Nature Machine Intelligence 1(1): 36-43.
  • A Takagi, M Hirashima, D Nozaki and E Burdet (2019), Individuals physically interacting in a group rapidly coordinate their movement by estimating the collective goal. eLife 8: e41328.