Imperial College London

Professor Etienne Burdet

Faculty of Engineering, Department of Bioengineering

Professor of Human Robotics

Contact

 

e.burdet

Location

 

4.05 Royal School of Mines, South Kensington Campus


Publications

238 results found

Mehring C, Akselrod M, Bashford L, Mace M, Choi H, Blüher M, Buschhoff A-S, Pistohl T, Salomon R, Cheah A, Blanke O, Serino A, Burdet E et al., 2019, Augmented manipulation ability in humans with six-fingered hands, Nature Communications, Vol: 10, Pages: 2401-2401, ISSN: 2041-1723

Neurotechnology attempts to develop supernumerary limbs, but can the human brain deal with the complexity of controlling an extra limb and yield advantages from it? Here, we analyzed the neuromechanics and manipulation abilities of two polydactyly subjects who each possess six fingers on their hands. Anatomical MRI of the supernumerary finger (SF) revealed that it is actuated by extra muscles and nerves, and fMRI identified a distinct cortical representation of the SF. In both subjects, the SF was able to move independently from the other fingers. Polydactyly subjects were able to coordinate the SF with their other fingers for more complex movements than five-fingered subjects, and so carry out with only one hand tasks normally requiring two hands. These results demonstrate that a body with significantly more degrees-of-freedom can be controlled by the human nervous system without causing motor deficits or impairments and can instead provide superior manipulation abilities.

Journal article

Farkhatdinov I, Ebert J, van Oort G, Vlutters M, van Asseldonk E, Burdet E et al., 2019, Assisting human balance in standing with a robotic exoskeleton, IEEE Robotics and Automation Letters, Vol: 4, Pages: 414-421, ISSN: 2377-3766

This letter presents an experimental study on balance recovery control with a lower limb exoskeleton robot. Four participants were subjected to a perturbation during standing, a forward force impulse applied to their pelvis that forced them to step forward with the right leg for balance recovery. Trials with and without exoskeleton assistance to move the stepping leg's thigh were conducted to investigate the influence of the exoskeleton's control assistance on balancing performance and a potential adaptation. Analysis of the body kinematics and muscle activation demonstrates that robotic assistance: first, was easy to use, did not require learning and did not inhibit the healthy stepping behavior; second, modified the stepping leg trajectories by increasing hip and knee movement; third, increased reaction speed and decreased the step duration; and finally, generally increased biceps femoris and rectus femoris muscle activity.

Journal article

Takagi A, Hiroshima M, Nozaki D, Burdet E et al., 2019, Individuals physically interacting in a group rapidly coordinate their movement by estimating the collective goal, eLife, Vol: 8, ISSN: 2050-084X

How can a human collective coordinate, for example to move a banquet table, when each person is influenced by the inertia of others who may be inferior at the task? We hypothesized that large groups cannot coordinate through touch alone, accruing to a zero-sum scenario where individuals inferior at the task hinder superior ones. We tested this hypothesis by examining how dyads, triads and tetrads, whose right hands were physically coupled together, followed a common moving target. Surprisingly, superior individuals followed the target accurately even when coupled to an inferior group, and the interaction benefits increased with the group size. A computational model shows that these benefits arose as each individual uses their respective interaction force to infer the collective's target and enhance their movement planning, which permitted coordination in seconds independent of the collective's size. By estimating the collective's movement goal, its individuals make physical interaction beneficial, swift and scalable.

Journal article

Li Y, Carboni G, Gonzalez F, Campolo D, Burdet E et al., 2019, Differential game theory for versatile physical human–robot interaction, Nature Machine Intelligence, Vol: 1, Pages: 36-43, ISSN: 2522-5839

The last decades have seen a surge of robots working in contact with humans. However, until now these contact robots have made little use of the opportunities offered by physical interaction and lack a systematic methodology to produce versatile behaviours. Here, we develop an interactive robot controller able to understand the control strategy of the human user and react optimally to their movements. We demonstrate that combining an observer with a differential game theory controller can induce a stable interaction between the two partners, precisely identify each other’s control law, and allow them to successfully perform the task with minimum effort. Simulations and experiments with human subjects demonstrate these properties and illustrate how this controller can induce different representative interaction strategies.
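
For readers unfamiliar with the approach, the sketch below shows how the feedback Nash gains of a two-player linear-quadratic differential game can be computed by iterated best responses on a toy 1-DoF point-mass plant. This is a minimal illustration of the game-theoretic idea rather than the paper's controller; the plant, the cost weights and the omission of the observer are all simplifying assumptions.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Shared point-mass plant: state x = [position error, velocity]; both agents push on it.
m = 1.0
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0 / m]])                        # same input matrix for both agents

Q_r = np.diag([100.0, 1.0]); R_r = np.array([[0.1]])    # robot cost weights (assumed)
Q_h = np.diag([50.0, 1.0]);  R_h = np.array([[0.1]])    # human cost weights (assumed)

# Iterated best responses: each player solves a Riccati equation against the other's
# current feedback law; a fixed point of this loop is a feedback Nash equilibrium.
K_r = np.zeros((1, 2)); K_h = np.zeros((1, 2))
for _ in range(200):
    P_r = solve_continuous_are(A - B @ K_h, B, Q_r, R_r)
    K_r_new = np.linalg.solve(R_r, B.T @ P_r)
    P_h = solve_continuous_are(A - B @ K_r_new, B, Q_h, R_h)
    K_h_new = np.linalg.solve(R_h, B.T @ P_h)
    if np.allclose(K_r_new, K_r) and np.allclose(K_h_new, K_h):
        break
    K_r, K_h = K_r_new, K_h_new

print("robot feedback gain:", K_r)
print("human feedback gain:", K_h)
```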

Journal article

Mutalib SA, Mace M, Burdet E, 2019, Bimanual coordination during a physically coupled task in unilateral spastic cerebral palsy children, Journal of NeuroEngineering and Rehabilitation, Vol: 16, ISSN: 1743-0003

Background: Single-object bimanual manipulation, or physically-coupled bimanual tasks, are ubiquitous in daily life. However, the predominant focus of previous studies has been on uncoupled bimanual actions, where the two hands act independently to manipulate two disconnected objects. In this paper, we explore interlimb coordination among children with unilateral spastic cerebral palsy (USCP), by investigating upper limb motor control during a single object bimanual lifting task. Methods: 15 children with USCP and 17 typically developing (TD) children performed a simple single-object bimanual lifting task. The object was an instrumented cube that can record the contact force on each of its faces alongside estimating its trajectory during a prescribed two-handed lifting motion. The subject's performance was measured in terms of the duration of individual phases, linearity and monotonicity of the grasp-to-load force synergy, interlimb force asymmetry, and movement smoothness. Results: Similar to their TD counterparts, USCP subjects were able to produce a linear grasp-to-load force synergy. However, they demonstrated difficulties in producing monotonic forces and generating smooth movements. No impairment of anticipatory control was observed within the USCP subjects. However, our analysis showed that the USCP subjects shifted the weight of the cube onto their more-abled side, potentially to minimise the load on the impaired side, which suggests a developed strategy of compensating for inter-limb asymmetries, such as muscle strength. Conclusion: Bimanual interaction with a single mutual object has the potential to facilitate anticipation and sequencing of force control in USCP children unlike previous studies which showed deficits during uncoupled bimanual actions. We suggest that this difference could be partly due to the provision of adequate cutaneous and kinaesthetic information gathered from the dynamic exchange of forces between the two hands, mediated through the phy

Journal article

van der Kooij H, van Asseldonk E, van Oort G, Sluiter V, Emmens A, Witteveen H, Tagliamonte NL, Tamburella F, Pisotta I, Masciullo M, Arquilla M, Molinari M, Wu A, Ijspeert A, Dzeladini FF, Thorsteinsson F, Arami A, Burdet E, Huang HY, Gregoor W, Meijneke C et al., 2019, Symbitron: Symbiotic man-machine interactions in wearable exoskeletons to enhance mobility for paraplegics, Biosystems and Biorobotics, Pages: 361-364

The main goal of the Symbitron project was to develop a safe, bio-inspired, personalized wearable exoskeleton that enables SCI patients to walk without additional assistance, by complementing their remaining motor function. Here we give an overview of the major achievements of the project.

Book chapter

Balasubramanian S, Garcia-Cossio E, Birbaumer N, Burdet E, Ramos-Murguialday A et al., 2018, Is EMG a viable alternative to BCI for detecting movement intention in severe stroke?, IEEE Transactions on Biomedical Engineering, Vol: 65, Pages: 2790-2797, ISSN: 0018-9294

Objective: In light of the shortcomings of current restorative brain-computer interfaces (BCI), this study investigated the possibility of using EMG to detect hand/wrist extension movement intention to trigger robot-assisted training in individuals without residual movements. Methods: We compared movement intention detection using an EMG detector with a sensorimotor rhythm based EEG-BCI using only ipsilesional activity. This was carried out on data of 30 severely affected chronic stroke patients from a randomized control trial using an EEG-BCI for robot-assisted training. Results: The results indicate the feasibility of using EMG to detect movement intention in this severely handicapped population; the probability of detecting EMG when patients attempted to move was higher (p < 0.001) than at rest. Interestingly, 22 out of 30 (or 73%) patients had sufficiently strong EMG in their finger/wrist extensors. Furthermore, in patients with detectable EMG, there was poor agreement between the EEG and EMG intent detectors, which indicates that these modalities may detect different processes. Conclusion: A substantial segment of severely affected stroke patients may benefit from EMG-based assisted therapy. Compared to EEG, a surface EMG interface requires less preparation time, is easier to don/doff, and is more compact. Significance: This study shows that a large proportion of severely affected stroke patients have residual EMG, which yields a direct and practical way to trigger robot-assisted training.
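
As a rough illustration of EMG-based intention detection (not the detector used in the study), the sketch below rectifies and low-pass filters a simulated surface EMG signal and flags movement intention when the envelope exceeds a baseline-derived threshold; the sampling rate, filter settings and 3-SD criterion are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(0)
fs = 1000.0                                                   # sampling rate in Hz (assumed)
t = np.arange(0, 4, 1 / fs)
emg = 0.02 * rng.standard_normal(t.size)                      # resting-level noise
emg[t >= 2.0] += 0.2 * rng.standard_normal((t >= 2.0).sum())  # attempted extension at t = 2 s

# Rectify and low-pass filter (5 Hz) to obtain a smooth envelope.
b, a = butter(2, 5 / (fs / 2), btype="low")
envelope = filtfilt(b, a, np.abs(emg))

# Threshold = resting mean + 3 SD over the first second (an assumed criterion).
baseline = envelope[t < 1.0]
threshold = baseline.mean() + 3 * baseline.std()
onset_idx = int(np.argmax(envelope > threshold))
print("movement intention detected at t = %.2f s" % t[onset_idx])
```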

Journal article

Donadio A, Whitehead K, Gonzalez F, Wilhelm E, Formica D, Meek J, Fabrizi L, Burdet E et al., 2018, A novel sensor design for accurate measurement of facial somatosensation in pre-term infants, PLoS ONE, Vol: 13, ISSN: 1932-6203

Facial somatosensory feedback is critical for breastfeeding in the first days of life. However, its development has never been investigated in humans. Here we develop a new interface to measure facial somatosensation in newborn infants. The novel system allows neuronal responses to touch on the subject's face to be measured by synchronously recording scalp electroencephalography (EEG) and the force applied by the experimenter. This is based on a dedicated force transducer that can be worn on the finger underneath a clinical nitrile glove and linked to a commercial EEG acquisition system. The calibrated device measures the pressure applied by the investigator when tapping the skin concurrently with the resulting brain response. With this system, we were able to demonstrate that taps of 192 mN (mean) reliably elicited facial somatosensory responses in 7 pre-term infants. These responses had a time course similar to those following limb stimulation, but a more lateral topographical distribution consistent with body representations in primary somatosensory areas. The method introduced can therefore be used to reliably measure facial somatosensory responses in vulnerable infants.

Journal article

Borzelli D, Cesqui B, Berger DJ, Burdet E, d'Avella A et al., 2018, Muscle patterns underlying voluntary modulation of co-contraction, PLoS ONE, Vol: 13, ISSN: 1932-6203

Manipulative actions involving unstable interactions with the environment require controlling mechanical impedance through muscle co-contraction. While much research has focused on how the central nervous system (CNS) selects the muscle patterns underlying a desired movement or end-point force, the coordination strategies used to achieve a desired end-point impedance have received considerably less attention. We recorded isometric forces at the hand and electromyographic (EMG) signals in subjects performing a reaching task with an external disturbance. In a virtual environment, subjects displaced a cursor by applying isometric forces and were instructed to reach targets in 20 spatial locations. The motion of the cursor was then perturbed by disturbances whose effects could be attenuated by increasing co-contraction. All subjects could voluntarily modulate co-contraction when disturbances of different magnitudes were applied. For most muscles, activation was modulated by target direction according to a cosine tuning function with an offset and an amplitude increasing with disturbance magnitude. Co-contraction was characterized by projecting the muscle activation vector onto the null space of the EMG-to-force mapping. Even in the baseline the magnitude of the null space projection was larger than the minimum magnitude required for non-negative muscle activations. Moreover, the increase in co-contraction was not obtained by scaling the baseline null space projection, scaling the difference between the null space projections in any block and the projection of the non-negative minimum-norm muscle vector, or scaling the difference between the null space projections in the perturbed blocks and the baseline null space projection. However, the null space projections in the perturbed blocks were obtained by linear combination of the baseline null space projection and the muscle activation used to increase co-contraction without generating any force. The failure of scaling rul
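
The two analysis steps described above, projecting a muscle-activation vector onto the null space of the EMG-to-force mapping and fitting a cosine tuning function with an offset, can be sketched as below on synthetic data; the linear map H, the number of muscles and the fake tuning curve are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_muscles = 8
H = rng.standard_normal((2, n_muscles))         # toy linear EMG-to-force map: force = H @ m

# Null-space projector: P_null @ m keeps only the part of m that produces no end-point force.
P_null = np.eye(n_muscles) - np.linalg.pinv(H) @ H

m_act = rng.random(n_muscles)                   # a non-negative muscle activation vector
co_contraction = P_null @ m_act
print("null-space (co-contraction) magnitude: %.3f" % np.linalg.norm(co_contraction))

# Cosine tuning with offset: a(theta) ~ b0 + b1*cos(theta) + b2*sin(theta),
# fitted by least squares over 20 target directions (synthetic activations here).
thetas = np.linspace(0, 2 * np.pi, 20, endpoint=False)
a = 0.3 + 0.2 * np.cos(thetas - 0.8) + 0.05 * rng.standard_normal(20)
X = np.column_stack([np.ones_like(thetas), np.cos(thetas), np.sin(thetas)])
b, *_ = np.linalg.lstsq(X, a, rcond=None)
offset, amplitude = b[0], np.hypot(b[1], b[2])
preferred_direction = np.arctan2(b[2], b[1])
print("offset %.2f, amplitude %.2f, preferred direction %.2f rad"
      % (offset, amplitude, preferred_direction))
```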

Journal article

Li Y, Ganesh G, Jarrasse N, Haddadin S, Albu-Schaeffer A, Burdet E et al., 2018, Force, impedance, and trajectory learning for contact tooling and haptic identification, IEEE Transactions on Robotics, Vol: 34, Pages: 1170-1182, ISSN: 1552-3098

Humans can skilfully use tools and interact with the environment by adapting their movement trajectory, contact force, and impedance. Motivated by this human versatility, we develop here a robot controller that concurrently adapts feedforward force, impedance, and reference trajectory when interacting with an unknown environment. In particular, the robot's reference trajectory is adapted to limit the interaction force and maintain it at a desired level, while feedforward force and impedance adaptation compensates for the interaction with the environment. An analysis of the interaction dynamics using Lyapunov theory yields the conditions for convergence of the closed-loop interaction mediated by this controller. Simulations exhibit adaptive properties similar to human motor adaptation. The implementation of this controller for typical interaction tasks including drilling, cutting, and haptic exploration shows that this controller can outperform conventional controllers in contact tooling.
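
A much-simplified sketch of error-driven concurrent adaptation of feedforward force and stiffness on a 1-DoF contact task is given below; it follows the general spirit of this line of work rather than the exact adaptation laws or convergence analysis of the paper, and all gains and the spring-like environment are assumptions.

```python
import numpy as np

dt, steps = 0.001, 5000                      # 5 s of simulated contact
m = 1.0                                      # effective mass (kg, assumed)
k_env = 200.0                                # unknown spring-like environment (N/m, assumed)
alpha_f, alpha_k, gamma = 200.0, 800.0, 2.0  # adaptation rates and forgetting (assumed)

x_ref = 0.05                                 # desired contact position (m)
x, v = 0.0, 0.0
u_ff, K, D = 0.0, 50.0, 20.0                 # adapted feedforward force and stiffness; fixed damping

for _ in range(steps):
    e = x_ref - x                                        # tracking error
    u = u_ff + K * e - D * v                             # impedance controller with feedforward
    a = (u - k_env * x) / m                              # environment pushes back
    v += a * dt
    x += v * dt
    u_ff += alpha_f * e * dt                             # error-driven feedforward adaptation
    K = max(0.0, K + (alpha_k * abs(e) - gamma) * dt)    # stiffness grows with error, decays otherwise

print("steady-state error: %.4f m, learned feedforward: %.1f N, stiffness: %.0f N/m"
      % (x_ref - x, u_ff, K))
```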

Journal article

Ogrinc M, Farkhatdinov I, Walker R, Burdet E et al., 2018, Sensory integration of apparent motion speed and vibration magnitude, IEEE Transactions on Haptics, Vol: 11, Pages: 455-463, ISSN: 1939-1412

Tactile apparent motion can display directional information in an intuitive way. It can for example be used to give directions to visually impaired individuals, or for waypoint navigation while cycling on busy streets, when vision or audition should not be loaded further. However, although humans can detect very short tactile patterns, discriminating between similar motion speeds has been shown to be difficult. Here we develop and investigate a method where the speed of tactile apparent motion around the user's wrist is coupled with vibration magnitude. This redundant coupling is used to produce tactile patterns from slow & weak to fast & strong. We compared the just noticeable difference (JND) of the coupled and the individual variables. The results show that the perception of the coupled variable can be characterised by a JND smaller than the JNDs of the individual variables. This allowed us to create short tactile patterns (tactons) for display of direction and speed, which can be distinguished significantly better than tactons based on motion alone. Additionally, most subjects were also able to identify the coupled-variable tactons better than the magnitude-based tactons.
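
As background for the JND comparison, the sketch below shows one standard way to estimate a just noticeable difference by fitting a cumulative-Gaussian psychometric function to two-interval comparison data; the simulated responses and the 75% criterion are assumptions, not the study's data or analysis.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(delta, mu, sigma):
    """Probability of judging the comparison as stronger/faster than the reference."""
    return norm.cdf(delta, loc=mu, scale=sigma)

rng = np.random.default_rng(0)
deltas = np.linspace(-0.4, 0.4, 9)                # normalised stimulus differences (assumed)
true_mu, true_sigma, n_trials = 0.0, 0.12, 40
p_resp = rng.binomial(n_trials, psychometric(deltas, true_mu, true_sigma)) / n_trials

(mu, sigma), _ = curve_fit(psychometric, deltas, p_resp,
                           p0=[0.0, 0.1], bounds=([-0.5, 0.01], [0.5, 1.0]))
jnd = sigma * norm.ppf(0.75)                      # 75%-correct criterion (assumed)
print("estimated JND: %.3f (simulated sigma %.2f)" % (jnd, true_sigma))
```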

Journal article

Dall'Orso S, Steinweg J, Allievi AG, Edwards AD, Burdet E, Arichi T et al., 2018, Somatotopic mapping of the developing sensorimotor cortex in the preterm human brain, Cerebral Cortex, Vol: 28, Pages: 2507-2515, ISSN: 1047-3211

In the mature mammalian brain, the primary somatosensory and motor cortices are known to be spatially organized such that neural activity relating to specific body parts can be somatotopically mapped onto an anatomical "homunculus". This organization creates an internal body representation which is fundamental for precise motor control, spatial awareness and social interaction. Although it is unknown when this organization develops in humans, animal studies suggest that it may emerge even before the time of normal birth. We therefore characterized the somatotopic organization of the primary sensorimotor cortices using functional MRI and a set of custom-made robotic tools in 35 healthy preterm infants aged from 31+6 to 36+3 weeks postmenstrual age. Functional responses induced by somatosensory stimulation of the wrists, ankles, and mouth had a distinct spatial organization as seen in the characteristic mature homunculus map. In comparison to the ankle, activation related to wrist stimulation was significantly larger and more commonly involved additional areas including the supplementary motor area and ipsilateral sensorimotor cortex. These results are in keeping with early intrinsic determination of a somatotopic map within the primary sensorimotor cortices. This may explain why acquired brain injury in this region during the preterm period cannot be compensated for by cortical reorganization and therefore can lead to long-lasting motor and sensory impairment.

Journal article

Ogrinc M, Farkhatdinov I, Walker R, Burdet E et al., 2018, Horseback riding therapy for a deafblind individual enabled by a haptic interface, Assistive Technology: The Official Journal of RESNA, Vol: 30, Pages: 143-150, ISSN: 1949-3614

We present a haptic interface to help deafblind people to practice horseback riding as a recreational and therapeutic activity. Horseback riding is a form of therapy which can improve self-esteem and sensation of independence. It has been shown to benefit people with various medical conditions, including autism. However, in the case of deafblind riders, an interpreter must stand by at all times to communicate with the rider by touch. We developed a simple interface that enables deafblind people to enjoy horseback riding while the instructor remotely provides cues, which improves their independence. Experiments demonstrated that an autistic deafblind individual exhibits similar responses to navigational cues as an unimpaired rider. Motivation is an important factor in therapy, and is frequently determinant of its outcome; therefore, the user attitude toward the therapy methods is key. The answers to questionnaires filled by the rider, family, and the instructor show that our technique gives the rider a greater sense of independence and more joy compared to standard riding where the instructor is walking along with the horse.

Journal article

Takagi A, Usai F, Ganesh G, Sanguineti V, Burdet E et al., 2018, Haptic communication between humans is tuned by the hard or soft mechanics of interaction, PLoS Computational Biology, Vol: 14, ISSN: 1553-734X

To move a hard table together, humans may coordinate by following the dominant partner's motion [1-4], but this strategy is unsuitable for a soft mattress where the perceived forces are small. How do partners readily coordinate in such differing interaction dynamics? To address this, we investigated how pairs tracked a target using flexion-extension of their wrists, which were coupled by a hard, medium or soft virtual elastic band. Tracking performance monotonically increased with a stiffer band for the worse partner, who had higher tracking error, at the cost of the skilled partner's muscular effort. This suggests that the worse partner followed the skilled one's lead, but simulations show that the results are better explained by a model where partners share movement goals through the forces, whilst the coupling dynamics determine the capacity of communicable information. This model elucidates the versatile mechanism by which humans can coordinate during both hard and soft physical interactions to ensure maximum performance with minimal effort.

Journal article

Bentley P, Burdet E, Rinne P, Mace M, Liardon J-L et al., 2018, A force measurement mechanism, 15544596

Patent

Abdi E, Bouri M, Burdet E, Bleuler H et al., 2018, Development and Comparison of Foot Interfaces for Controlling a Robotic Arm in Surgery, IEEE International Conference on Robotics and Biomimetics (ROBIO), Publisher: IEEE, Pages: 414-420

Conference paper

Mace M, Kinany N, Rinne P, Rayner A, Bentley P, Burdet E et al., 2017, Balancing the playing field: collaborative gaming for physical training, Journal of NeuroEngineering and Rehabilitation, Vol: 14, ISSN: 1743-0003

BACKGROUND: Multiplayer video games promoting exercise-based rehabilitation may facilitate motor learning, by increasing motivation through social interaction. However, a major design challenge is to enable meaningful inter-subject interaction, whilst allowing for significant skill differences between players. We present a novel motor-training paradigm that allows real-time collaboration and performance enhancement, across a wide range of inter-subject skill mismatches, including disabled vs. able-bodied partnerships. METHODS: A virtual task, consisting of a dynamic ball on a beam, is controlled at each end using independent digital force-sensing handgrips. Interaction is mediated through simulated physical coupling and locally-redundant control. Game performance was measured in 16 healthy-healthy and 16 patient-expert dyads, where patients were hemiparetic stroke survivors using their impaired arm. Dual-player was compared to single-player performance in terms of score, target tracking, stability, effort and smoothness, and with questionnaires probing user-experience and engagement. RESULTS: Performance of less-able subjects (as ranked from single-player ability) was enhanced by dual-player mode, by an amount proportionate to the partnership's mismatch. The more-abled partners' performances decreased by a similar amount. Such zero-sum interactions were observed for both healthy-healthy and patient-expert interactions. Dual-player was preferred by the majority of players independent of baseline ability and subject group; healthy subjects also felt more challenged, and patients more skilled. CONCLUSION: This is the first demonstration of implicit skill balancing in a truly collaborative virtual training task leading to heightened engagement, across both healthy subjects and stroke patients.

Journal article

Zhou S-H, Tan Y, Oetomo D, Freeman C, Burdet E, Mareels I et al., 2017, Modeling of Endpoint Feedback Learning Implemented Through Point-to-Point Learning Control, IEEE Transactions on Control Systems Technology, Vol: 25, Pages: 1576-1585, ISSN: 1063-6536

Journal article

Farkhatdinov I, Roehri N, Burdet E, 2017, Anticipatory detection of turning in humans for intuitive control of robotic mobility assistance, Bioinspiration and Biomimetics, Vol: 12, ISSN: 1748-3182

Many wearable lower-limb robots for walking assistance have been developed in recent years. However, it remains unclear how they can be commanded in an intuitive and efficient way by their user. In particular, providing robotic assistance to neurologically impaired individuals in turning remains a significant challenge. The control should be safe for the users and their environment, yet yield sufficient performance and enable natural human-machine interaction. Here, we propose using the head and trunk anticipatory behaviour in order to detect the intention to turn in a natural, non-intrusive way, and use it for triggering turning movement in a robot for walking assistance. We therefore study head and trunk orientation during locomotion of healthy adults, and investigate upper body anticipatory behaviour during turning. The collected walking and turning kinematics data are clustered using the k-means algorithm, and cross-validation tests and the k-nearest neighbours method are used to evaluate the performance of turning detection during locomotion. Tests with seven subjects exhibited accurate turning detection. The head anticipated turning by more than 400–500 ms on average across all subjects. Overall, the proposed method detected turning 300 ms after its initiation and 1230 ms before the turning movement was completed. Using head anticipatory behaviour enabled turning to be detected about 100 ms faster than detection using only pelvis orientation measurements. Finally, it was demonstrated that the proposed turning detection can improve the quality of human–robot interaction by improving the control accuracy and transparency.
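
A minimal sketch of such a turning-detection pipeline, k-means clustering of head and pelvis yaw features followed by k-NN classification under cross-validation, is shown below on synthetic data; the chosen features and class statistics are assumptions, not the recorded kinematics.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
# One feature vector per gait window: [head yaw, pelvis yaw, head-pelvis yaw difference] in deg (assumed)
straight = rng.normal([0.0, 0.0, 0.0], [2.0, 2.0, 1.0], size=(200, 3))
turning = rng.normal([25.0, 10.0, 15.0], [8.0, 5.0, 5.0], size=(200, 3))
X = np.vstack([straight, turning])
y = np.array([0] * 200 + [1] * 200)               # 0 = straight walking, 1 = turning

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
agreement = max((clusters == y).mean(), (clusters != y).mean())   # account for label permutation
print("k-means cluster/label agreement: %.2f" % agreement)

scores = cross_val_score(KNeighborsClassifier(n_neighbors=5), X, y, cv=5)
print("cross-validated k-NN turning detection accuracy: %.2f +/- %.2f"
      % (scores.mean(), scores.std()))
```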

Journal article

Hussain A, Balasubramanian S, Roach N, Klein J, Jarrassé N, Mace M, David A, Guy S, Burdet E et al., 2017, SITAR: a system for independent task-oriented assessment and rehabilitation, Journal of Rehabilitation and Assistive Technologies Engineering, Vol: 4, Pages: 2055668317729637-2055668317729637, ISSN: 2055-6683

Introduction: Over recent years, task-oriented training has emerged as a dominant approach in neurorehabilitation. This article presents a novel, sensor-based system for independent task-oriented assessment and rehabilitation (SITAR) of the upper limb. Methods: The SITAR is an ecosystem of interactive devices including a touch and force-sensitive tabletop and a set of intelligent objects enabling functional interaction. In contrast to most existing sensor-based systems, SITAR provides natural training of visuomotor coordination through collocated visual and haptic workspaces alongside multimodal feedback, facilitating learning and its transfer to real tasks. We illustrate the possibilities offered by the SITAR for sensorimotor assessment and therapy through pilot assessment and usability studies. Results: The pilot data from the assessment study demonstrates how the system can be used to assess different aspects of upper limb reaching, pick-and-place and sensory tactile resolution tasks. The pilot usability study indicates that patients are able to train arm-reaching movements independently using the SITAR with minimal involvement of the therapist and that they were motivated to pursue the SITAR-based therapy. Conclusion: SITAR is a versatile, non-robotic tool that can be used to implement a range of therapeutic exercises and assessments for different types of patients, which is particularly well-suited for task-oriented training.

Journal article

Abdi E, Bouri M, Burdet E, Himidan S, Bleuler H et al., 2017, Positioning the endoscope in laparoscopic surgery by foot: Influential factors on surgeons' performance in virtual trainer, 39th Annual International Conference of the IEEE-Engineering-in-Medicine-and-Biology-Society (EMBC), Publisher: IEEE, Pages: 3944-3948, ISSN: 1094-687X

Conference paper

Arami A, Tagliamonte NL, Tamburella F, Huang H-Y, Molinari M, Burdet E et al., 2017, A simple tool to measure spasticity in spinal cord injury subjects, 2017 International Conference on Rehabilitation Robotics (ICORR), Publisher: IEEE

This work presents a wearable device and the algorithms for quantitative modelling of joint spasticity and its application in a pilot group of subjects with different levels of spinal cord injury. The device comprises light-weight instrumented handles to measure the interaction force between the subject and the physical therapist performing the tests, EMG sensors and inertial measurement units to measure muscle activity and joint kinematics. Experimental tests included the passive movement of different body segments, where the spasticity was expected, at different velocities. Tonic stretch reflex thresholds and their velocity modulation factor are computed, as a quantitative index of spasticity, by using the kinematics data at the onset of spasm detected through thresholding the EMG data. This technique was applied to two spinal cord injury subjects. The proposed method allowed the analysis of spasticity at muscle and joint levels. The obtained results are in line with the expert diagnosis and qualitative spasticity characterisation on each individual.

Conference paper

Mace M, Guy S, Hussain A, Playford ED, Ward N, Balasubramanian S, Burdet E et al., 2017, Validity of a sensor-based table-top platform to measure upper limb function, International Conference on Rehabilitation Robotics (ICORR), Publisher: IEEE, Pages: 652-657, ISSN: 1945-7898

Conference paper

Li R, Li Y, Li SE, Burdet E, Cheng B et al., 2017, Driver-Automation Indirect Shared Control of Highly Automated Vehicles with Intention-Aware Authority Transition, 28th IEEE Intelligent Vehicles Symposium (IV), Publisher: IEEE, Pages: 26-32, ISSN: 1931-0587

Conference paper

Melendez-Calderon A, Tan M, Fisher Bittmann M, Burdet E, Patton JL et al., 2017, Transfer of dynamic motor skills acquired during isometric training to free motion, Journal of Neurophysiology, Vol: 118, Pages: 219-233, ISSN: 1522-1598

Recent studies have explored the prospects of learning to move without moving, by displaying virtual arm movement related to exerted force. However, it has yet to be tested whether learning the dynamics of moving can transfer to the corresponding movement. Here we present a series of experiments that investigate this isometric training paradigm. Subjects were asked to hold a handle and generate forces as their arms were constrained to a static position. A precise simulation of reaching was used to make a graphic rendering of an arm moving realistically in response to the measured interaction forces and simulated environmental forces. This graphic rendering was displayed on a horizontal display that blocked the view of their actual (statically constrained) arm and encouraged them to believe they were moving. We studied adaptation of horizontal, planar, goal-directed arm movements in a velocity-dependent force-field. Our results show that individuals can learn to compensate for such a force-field in a virtual environment, and transfer their new skills to the actual free motion condition, with performance comparable to practice while moving. Such non-moving techniques should benefit various training conditions where moving may not be possible.
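
For concreteness, the sketch below simulates the kind of velocity-dependent (curl) force field typically used in such adaptation experiments, acting on a point-mass hand driven by a simple feedback controller with no learned compensation; the field gain, mass and controller gains are illustrative assumptions rather than the study's parameters.

```python
import numpy as np

m = 1.0                                       # hand + handle mass (kg, assumed)
B = np.array([[0.0, 13.0], [-13.0, 0.0]])     # curl-field gain matrix (N*s/m, assumed)
k_p, k_d = 400.0, 40.0                        # simple PD feedback toward the target (assumed)
dt, T = 0.002, 0.8
target = np.array([0.0, 0.15])                # 15 cm forward reach

x = np.zeros(2); v = np.zeros(2)
path = []
for _ in range(int(T / dt)):
    f_field = B @ v                           # velocity-dependent perturbation
    f_ctrl = k_p * (target - x) - k_d * v     # feedback only: no learned compensation
    a = (f_ctrl + f_field) / m
    v = v + a * dt
    x = x + v * dt
    path.append(x.copy())

path = np.array(path)
print("max lateral deviation: %.3f m" % np.abs(path[:, 0]).max())
```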

Journal article

Martin-Brevet S, Jarrasse N, Burdet E, Roby-Brami A et al., 2017, Taxonomy based analysis of force exchanges during object grasping and manipulation, PLoS ONE, Vol: 12, ISSN: 1932-6203

The flexibility of the human hand in object manipulation is essential for daily life activities, but remains relatively little explored with quantitative methods. On the one hand, recent taxonomies describe qualitatively the classes of hand postures for object grasping and manipulation. On the other hand, the quantitative analysis of hand function has been generally restricted to precision grip (with thumb and index opposition) during lifting tasks. The aim of the present study is to fill the gap between these two kinds of descriptions, by investigating quantitatively the forces exerted by the hand on an instrumented object in a set of representative manipulation tasks. The object was a parallelepiped able to measure the force exerted on each of its six faces and its acceleration. The grasping force was estimated from the lateral force and the unloading force from the bottom force. The protocol included eleven tasks with complementary constraints inspired by recent taxonomies: four tasks corresponding to lifting and holding the object with different grasp configurations, and seven to manipulating the object (rotation around each of its axes and translation). The grasping and unloading forces and object rotations were measured during the five phases of the actions: unloading, lifting, holding or manipulation, preparation to deposit, and deposit. The results confirm the tight regulation between grasping and unloading forces during lifting, and extend this to the deposit phase. In addition, they provide a precise description of the regulation of force exchanges during various manipulation tasks spanning representative actions of daily life. The timing of manipulation showed both sequential and overlapping organization of the different sub-actions, and micro-errors could be detected. This phenomenological study confirms the feasibility of using an instrumented object to investigate complex manipulative behavior in humans. This protocol will be used in the future to inves

Journal article

Takagi A, Ganesh G, Yoshioka T, Kawato M, Burdet E et al., 2017, Physically interacting individuals estimate the partner's goal to enhance their movements, Nature Human Behaviour, Vol: 1, ISSN: 2397-3374

From a parent helping to guide their child during their first steps, to a therapist supporting a patient, physical assistance enabled by haptic interaction is a fundamental modus for improving motor abilities. However, what movement information is exchanged between partners during haptic interaction, and how this information is used to coordinate and assist others, remains unclear [1]. Here, we propose a model in which haptic information, provided by touch and proprioception [2], enables interacting individuals to estimate the partner's movement goal and use it to improve their own motor performance. We use an empirical physical interaction task [3] to show that our model can explain human behaviours better than existing models of interaction in the literature [4-8]. Furthermore, we experimentally verify our model by embodying it in a robot partner and checking that it induces the same improvements in motor performance and learning in a human individual as interacting with a human partner. These results promise collaborative robots that provide human-like assistance, and suggest that movement goal exchange is the key to physical assistance.
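
A deliberately crude caricature of the goal-sharing idea is sketched below: with a known elastic coupling, the interaction force reveals the partner's position, which can be fused with one's own noisy view of a common target to improve tracking. This is not the authors' model; the coupling stiffness, noise levels and equal-weight fusion are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
K = 100.0                                     # coupling stiffness (N/m, assumed)
dt, T = 0.01, 10.0
t = np.arange(0.0, T, dt)
target = 0.05 * np.sin(2 * np.pi * 0.5 * t)   # common moving target (m)

own_noise, partner_noise = 0.015, 0.005       # the partner is the better tracker (assumed)
x_self = np.zeros_like(t)
x_partner = np.zeros_like(t)
err_alone, err_fused = [], []
for i in range(1, t.size):
    obs_self = target[i] + own_noise * rng.standard_normal()
    obs_partner = target[i] + partner_noise * rng.standard_normal()
    x_partner[i] = obs_partner                          # partner tracks its own estimate
    force = K * (x_partner[i] - x_self[i - 1])          # force felt through the coupling
    partner_pos = x_self[i - 1] + force / K             # invert the known coupling
    x_self[i] = 0.5 * (obs_self + partner_pos)          # naive equal-weight fusion
    err_alone.append(abs(obs_self - target[i]))
    err_fused.append(abs(x_self[i] - target[i]))

print("mean tracking error alone: %.4f m, with haptic fusion: %.4f m"
      % (np.mean(err_alone), np.mean(err_fused)))
```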

Journal article

Mace M, Rinne P, Liardon J-L, Uhomoibhi C, Bentley P, Burdet E et al., 2017, Elasticity improves handgrip performance and user experience during visuomotor control, Royal Society Open Science, Vol: 4, ISSN: 2054-5703

Passive rehabilitation devices, providing motivation and feedback, potentially offer an automated and low-cost therapy method, and can be used as simple human–machine interfaces. Here, we ask whether there is any advantage for a hand-training device to be elastic, as opposed to rigid, in terms of performance and preference. To address this question, we have developed a highly sensitive and portable digital handgrip, promoting independent and repetitive rehabilitation of grasp function based around a novel elastic force and position sensing structure. A usability study was performed on 66 healthy subjects to assess the effect of elastic versus rigid handgrip control during various visuomotor tracking tasks. The results indicate that, for tasks relying either on feedforward or on feedback control, novice users perform significantly better with the elastic handgrip, compared with the rigid equivalent (11% relative improvement, 9–14% mean range; p < 0.01). Furthermore, there was a threefold increase in the number of subjects who preferred elastic compared with rigid handgrip interaction. Our results suggest that device compliance is an important design consideration for grip training devices.

Journal article

Huang H-Y, Farkhatdinov I, Arami A, Burdet E et al., 2017, Modelling Neuromuscular Function of SCI Patients in Balancing, 3rd International Conference on NeuroRehabilitation (ICNR), Publisher: Springer International Publishing AG, Pages: 355-359, ISSN: 2195-3562

Conference paper

Takagi A, Beckers N, Burdet E, 2016, Motion plan changes predictably in dyadic reaching, PLoS ONE, Vol: 11, ISSN: 1932-6203

Parents can effortlessly assist their child to walk, but the mechanism behind such physical coordination is still unknown. Studies have suggested that physical coordination is achieved by interacting humans who update their movement or motion plan in response to the partner's behaviour. Here, we tested rigidly coupled pairs in a joint reaching task to observe such changes in the partners' motion plans. However, the joint reaching movements were surprisingly consistent across different trials. A computational model that we developed demonstrated that the two partners had a distinct motion plan, which did not change with time. These results suggest that rigidly coupled pairs accomplish joint reaching movements by relying on a pre-programmed motion plan that is independent of the partner's behaviour.

Journal article

