Imperial College London

Professor Etienne Burdet

Faculty of Engineering, Department of Bioengineering

Professor of Human Robotics
 
 
 

Contact

 

e.burdet

 
 

Location

 

419B, Building E - Sir Michael Uren, White City Campus


 

Publications


380 results found

Xing X, Burdet E, Si W, Yang C, Li Y et al., 2023, Impedance Learning for Human-Guided Robots in Contact With Unknown Environments, IEEE TRANSACTIONS ON ROBOTICS, Vol: 39, Pages: 3705-3721, ISSN: 1552-3098

Journal article

Hu ZJ, Wang Z, Huang Y, Sena A, Rodriguez y Baena F, Burdet E et al., 2023, Towards Human-Robot Collaborative Surgery: Trajectory and Strategy Learning in Bimanual Peg Transfer, IEEE ROBOTICS AND AUTOMATION LETTERS, Vol: 8, Pages: 4553-4560, ISSN: 2377-3766

Journal article

Sanmartin-Senent A, Pena-Perez N, Burdet E, Eden J et al., 2023, Redundancy Resolution in Trimanual vs. Bimanual Tracking Tasks., Annu Int Conf IEEE Eng Med Biol Soc, Vol: 2023, Pages: 1-5

Supernumerary limbs promise to allow users to perform complex tasks that would otherwise require the actions of teams. However, it is unclear how the user's capability for multimanual coordination compares to bimanual coordination, and how the motor system configures its limb contributions given task redundancy. We conducted bimanual and trimanual (with the foot as a third-hand controller) virtual reality visuomotor tracking experiments to study how 32 healthy participants changed their limb coordination in response to uninstructed cursor mapping changes. The task used a shared cursor mapped to the average position of different limb combinations. The results show that most participants correctly identified the different mappings during bimanual tracking, and accordingly minimized task-irrelevant motion. In contrast, during trimanual coordination, participants consistently moved all three limbs concurrently, showing weaker ipsilateral hand-foot coordination. These findings show how redundancy resolution and the resulting coordination patterns differ between similar bimanual and trimanual tasks. Further research is needed to consider the effect of learning on coordination behaviour.

Journal article

Jiang Z, Huang Y, Eden J, Ivanova E, Cheng X, Burdet E et al., 2023, A virtual reality platform to evaluate the effects of supernumerary limbs' appearance., Annu Int Conf IEEE Eng Med Biol Soc, Vol: 2023, Pages: 1-5

Supernumerary robot limbs (SL) can expand the ability of users by increasing the number of degrees of freedom that they control. While several SLs have been designed and tested on human participants, the effect of the limb's appearance on the user's acceptance, embodiment and device usage is not yet understood. We developed a virtual reality platform with a three-arm avatar that enabled us to systematically investigate the effect of the supernumerary limb's appearance on their perception and motion control performance. A pilot study with 14 participants exhibited similar performance, workload and preference in human-like or robot-like appearance with a trend of preference for the robotic appearance.

Journal article

Ivanova E, Pena-Perez N, Eden J, Yip Y, Burdet E et al., 2023, Dissociating haptic feedback from physical assistance does not improve motor performance., Annu Int Conf IEEE Eng Med Biol Soc, Vol: 2023, Pages: 1-5

In robots for motor rehabilitation and sports training, haptic assistance typically provides both mechanical guidance and task-relevant information. Given the natural human tendency to minimise metabolic cost, mechanical guidance may however prevent efficient short-term learning and retention. In this work, we explore the effect of providing haptic feedback to the non-active hand during a tracking task. We test four types of haptic feedback: task- or error-related information, no information and irrelevant information. The results show that feedback provided to the hand not carrying out the tracking task did not improve task performance. However, information irrelevant to the task worsened performance, and negatively influenced the participants' perception of helpfulness, assistance, likability and predictability.

Journal article

Uttayopas P, Cheng X, Eden J, Burdet E et al., 2023, Object Recognition Using Mechanical Impact, Viscoelasticity, and Surface Friction During Interaction, IEEE TRANSACTIONS ON HAPTICS, Vol: 16, Pages: 251-260, ISSN: 1939-1412

Journal article

Wang Z, Lam H-K, Guo Y, Xiao B, Li Y, Su X, Yeatman EM, Burdet E et al., 2023, Adaptive Event-Triggered Control for Nonlinear Systems With Asymmetric State Constraints: A Prescribed-Time Approach, IEEE TRANSACTIONS ON AUTOMATIC CONTROL, Vol: 68, Pages: 3625-3632, ISSN: 0018-9286

Journal article

Mastria G, Scaliti E, Mehring C, Burdet E, Becchio C, Serino A, Akselrod M et al., 2023, Morphology, connectivity, and encoding features of tactile and motor representations of the fingers in the human precentral and postcentral gyrus, The Journal of Neuroscience, Vol: 43, Pages: 1572-1589, ISSN: 0270-6474

Despite the tight coupling between sensory and motor processing for fine manipulation in humans, it is not yet totally clear which specific properties of the fingers are mapped in the precentral and postcentral gyrus. We used fMRI to compare the morphology, connectivity, and encoding of the motor and tactile finger representations (FRs) in the precentral and postcentral gyrus of 25 5-fingered participants (8 females). Multivoxel pattern and structural and functional connectivity analyses demonstrated the existence of distinct motor and tactile FRs within both the precentral and postcentral gyrus, integrating finger-specific motor and tactile information. Using representational similarity analysis, we found that the motor and tactile FRs in the sensorimotor cortex were described by the perceived structure of the hand better than by the actual hand anatomy or other functional models (finger kinematics, muscles synergies). We then studied a polydactyly individual (i.e., with a congenital 6-fingered hand) showing superior manipulation abilities and divergent anatomic-functional hand properties. The perceived hand model was still the best model for tactile representations in the precentral and postcentral gyrus, while finger kinematics better described motor representations in the precentral gyrus. We suggest that, under normal conditions (i.e., in subjects with a standard hand anatomy), the sensorimotor representations of the 5 fingers in humans converge toward a model of perceived hand anatomy, deviating from the real hand structure, as the best synthesis between functional and structural features of the hand.

SIGNIFICANCE STATEMENT Distinct motor and tactile finger representations exist in both the precentral and postcentral gyrus, supported by a finger-specific pattern of anatomic and functional connectivity across modalities. At the representational level, finger representations reflect the perceived structure of the hand, which might result from an adapting process

Journal article

Farina D, Burdet E, Mehring C, Ibanez J, Philpot C et al., 2023, Roboticists Want to Give You a Third Arm: Unused Bandwidth in Neurons Can be Tapped to Control Extra Limbs, IEEE SPECTRUM, Vol: 60, Pages: 22+, ISSN: 0018-9235

Journal article

Börner H, Carboni G, Cheng X, Takagi A, Hirche S, Endo S, Burdet E et al., 2023, Physically interacting humans regulate muscle coactivation to improve visuo-haptic perception., Journal of Neurophysiology, Vol: 129, Pages: 494-499, ISSN: 0022-3077

When moving a piano or dancing tango with a partner, how should I control my arm muscles to sense their movements and follow or guide them smoothly? Here we observe how physically connected pairs tracking a moving target with the arm modify muscle coactivation with their visual acuity and the partner's performance. They coactivate muscles to stiffen the arm when the partner's performance is worse and relax with blurry visual feedback. Computational modeling shows that this adaptive sensing property cannot be explained by the minimization of movement error hypothesis that has previously explained adaptation in dynamic environments. Instead, individuals skillfully control the stiffness to guide the arm toward the planned motion while minimizing effort and extracting useful information from the partner's movement. The central nervous system regulates muscle activation to guide motion with accurate task information from vision and haptics while minimizing the metabolic cost. As a consequence, the partner with the most accurate target information leads the movement.

NEW & NOTEWORTHY Our results reveal that interacting humans inconspicuously modulate muscle activation to extract accurate information about the common target while considering their own and the partner's sensorimotor noise. A novel computational model was developed to decipher the underlying mechanism: muscle coactivation is adapted to combine haptic information from the interaction with the partner and own visual information in a stochastically optimal manner. This improves the prediction of the target position with minimal metabolic cost in each partner, resulting in the lead of the partner with the most accurate visual information.

Journal article
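The "stochastically optimal" combination of visual and haptic cues referred to in the abstract above is commonly formalized as minimum-variance cue fusion. A sketch of that standard model (the textbook formulation, not necessarily the paper's exact one):

```latex
\hat{x} \;=\; \frac{\sigma_h^2\, x_v + \sigma_v^2\, x_h}{\sigma_v^2 + \sigma_h^2},
\qquad
\sigma_{\hat{x}}^2 \;=\; \left(\frac{1}{\sigma_v^2} + \frac{1}{\sigma_h^2}\right)^{-1}
\;\le\; \min\!\left(\sigma_v^2,\, \sigma_h^2\right),
```

where $x_v$ and $x_h$ are the visual and haptic estimates of the target with variances $\sigma_v^2$ and $\sigma_h^2$. Each cue is weighted inversely to its variance, so the fused estimate is never worse than the best single cue; the partner with the smaller visual variance obtains the better fused estimate, consistent with that partner leading the movement.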

Cheng Y, Huang Y, Wang Z, Burdet E et al., 2023, Foot gestures to control the grasping of a surgical robot, Pages: 6844-6850, ISSN: 1050-4729

Many surgical tasks require three or more tools working together, where a hands-free interface could extend a surgeon's actions to control a third surgical tool. However, most current interfaces do not allow skilled control of grasping, which is critical to robotic manipulation. Here we first present a systematic study to identify efficient and intuitive interaction strategies to control grasping of a surgical tool. A series of experiments was conducted to evaluate six foot pressure-based gestures. Based on the results, three novel modular foot-machine interfaces were developed, which can be integrated with other motion control interfaces. The identified interaction strategies were implemented to control a laparoscopic tool in a surgical simulator, and evaluated in a user study. The results illustrate how naive participants can control grasping, yielding smooth pick & place operation.

Conference paper

Wang Z, Fei H, Huang Y, Rouxel Q, Xiao B, Li Z, Burdet E et al., 2023, Learning to Assist Bimanual Teleoperation using Interval Type-2 Polynomial Fuzzy Inference, IEEE Transactions on Cognitive and Developmental Systems, ISSN: 2379-8920

Assisting humans in collaborative tasks is a promising application for robots; however, effective assistance remains challenging. In this paper, we propose a method for providing intuitive robotic assistance based on learning from natural human limb coordination. To encode the coupling between multiple-limb motions, we use a novel interval type-2 (IT2) polynomial fuzzy inference for modeling trajectory adaptation. The associated polynomial coefficients are estimated using a modified recursive least-squares with a dynamic forgetting factor. We propose to employ a Gaussian process to produce robust human motion predictions, and thus address the uncertainty and measurement noise of the system caused by interactive environments. Experimental results on two types of interaction tasks demonstrate the effectiveness of this approach, which achieves high accuracy in predicting assistive limb motion and enables humans to perform bimanual tasks using only one limb.

Journal article
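The "modified recursive least-squares with a dynamic forgetting factor" mentioned in the abstract above is a variant of standard RLS. A minimal sketch of plain RLS with a fixed forgetting factor `lam`, on illustrative streaming data (not the paper's IT2-fuzzy coefficient estimator), looks like this:

```python
import numpy as np

def rls_update(theta, P, x, y, lam):
    """One recursive least-squares step with forgetting factor lam (0 < lam <= 1)."""
    x = x.reshape(-1, 1)
    K = P @ x / (lam + x.T @ P @ x)   # gain vector
    err = y - float(x.T @ theta)      # prediction error on the new sample
    theta = theta + K * err           # coefficient update
    P = (P - K @ x.T @ P) / lam       # covariance update; old data is down-weighted
    return theta, P

# Illustrative use: identify y = 2*x1 - x2 online from streaming samples.
rng = np.random.default_rng(0)
theta = np.zeros((2, 1))
P = np.eye(2) * 1000.0                # large initial covariance = weak prior
for _ in range(200):
    x = rng.normal(size=2)
    y = 2.0 * x[0] - 1.0 * x[1]
    theta, P = rls_update(theta, P, x, y, lam=0.98)
print(theta.ravel())                  # ≈ [ 2. -1.]
```

A forgetting factor below 1 lets the estimate track coefficients that drift over time; making `lam` itself data-dependent, as the paper's "dynamic" variant does, trades tracking speed against noise sensitivity.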

Dutta A, Burdet E, Kaboli M, 2023, Push to Know! - Visuo-Tactile Based Active Object Parameter Inference with Dual Differentiable Filtering, Pages: 3137-3144, ISSN: 2153-0858

For robotic systems to interact with objects in dynamic environments, it is essential to perceive physical properties of the objects such as shape, friction coefficient, mass, center of mass, and inertia. This not only eases selecting manipulation actions but also ensures the task is performed as desired. However, estimating the physical properties of novel objects in particular is a challenging problem, using either vision or tactile sensing. In this work, we propose a novel framework to estimate key object parameters through non-prehensile manipulation using vision and tactile sensing. Our active dual differentiable filtering (ADDF) approach learns the object-robot interaction during non-prehensile object pushes to infer the object's parameters, enabling the robotic system to interactively explore a novel object. The proposed N-step active formulation within the differentiable filtering facilitates efficient learning of the object-robot interaction model and, during inference, selects the next best exploratory push actions (where to push, and how to push). We extensively evaluated our framework in simulated and real-robot scenarios, yielding superior performance to the state-of-the-art baseline.

Conference paper

Pena-Perez N, Eden J, Ivanova E, Farkhatdinov I, Burdet E et al., 2023, How virtual and mechanical coupling impact bimanual tracking, Journal of Neurophysiology, Vol: 129, Pages: 102-114, ISSN: 0022-3077

Bilateral training systems look to promote the paretic hand’s use in individuals with hemiplegia. Although this is normally achieved using mechanical coupling (i.e., a physical connection between the hands), a virtual reality system relying on virtual coupling (i.e., through a shared virtual object) would be simpler to use and prevent slacking. However, it is not clear whether different coupling modes differently impact task performance and effort distribution between the hands. We explored how 18 healthy right-handed participants changed their motor behaviors in response to the uninstructed addition of mechanical coupling, and virtual coupling using a shared cursor mapped to the average hands’ position. In a second experiment, we then studied the impact of connection stiffness on performance, perception, and effort imbalance. The results indicated that both coupling types can induce the hands to actively contribute to the task. However, the task asymmetry introduced by using a cursor mapped to either the left or right hand only modulated the hands’ contribution when not mechanically coupled. The tracking performance was similar for all coupling types, independent of the connection stiffness, although the mechanical coupling was preferred and induced the hands to move with greater correlation. These findings suggest that virtual coupling can induce the hands to actively contribute to a task in healthy participants without hindering their performance. Further investigation on the coupling types’ impact on the performance and hands’ effort distribution in patients with hemiplegia could allow for the design of simpler training systems that promote the affected hand’s use.

Journal article

Devillard A, Ramasamy A, Faux D, Hayward V, Burdet E et al., 2023, Concurrent Haptic, Audio, and Visual Data Set during Bare Finger Interaction with Textured Surfaces, Pages: 101-106

Perceptual processes are frequently multi-modal, and haptic perception is no exception. Data sets of visual and haptic sensory signals have been compiled in the past, especially for the exploration of textured surfaces. These data sets were intended for use in natural and artificial perception studies and as training data for machine learning research, and were typically acquired with rigid probes or artificial robotic fingers. Here, we collected the visual, auditory, and haptic signals acquired when a human finger explored textured surfaces. We assessed the data set via machine learning classification techniques. Interestingly, multimodal classification performance reached 97%, whereas haptic classification alone was around 80%.

Conference paper

Uttayopas P, Cheng X, Burdet E, 2023, Active Haptic Exploration based on Dual-Stage Perception for Object Recognition, IEEE World Haptics Conference (WHC), Publisher: IEEE, Pages: 347-353, ISSN: 2835-9518

Conference paper

Cazenave L, Yurkewich A, Hohler C, Keller T, Krewer C, Jahn K, Hirche S, Endo S, Burdet E et al., 2023, Hybrid Functional Electrical Stimulation and Robotic Assistance for Wrist Motion Training after Stroke: Preliminary Results, 2023 INTERNATIONAL CONFERENCE ON REHABILITATION ROBOTICS, ICORR, ISSN: 1945-7898

Journal article

Nehrujee A, Ivanova E, Srinivasan S, Balasubramanian S, Burdet E et al., 2023, Increasing the Motivation to Train Through Haptic Social Interaction - Pilot study, 2023 INTERNATIONAL CONFERENCE ON REHABILITATION ROBOTICS, ICORR, ISSN: 1945-7898

Journal article

Kunavar T, Cheng X, Franklin DW, Burdet E, Babič J et al., 2023, Explicit learning based on reward prediction error facilitates agile motor adaptations., PLoS One, Vol: 18

Error based motor learning can be driven by both sensory prediction error and reward prediction error. Learning based on sensory prediction error is termed sensorimotor adaptation, while learning based on reward prediction error is termed reward learning. To investigate the characteristics and differences between sensorimotor adaptation and reward learning, we adapted a visuomotor paradigm where subjects performed arm movements while presented with either the sensory prediction error, signed end-point error, or binary reward. Before each trial, perturbation indicators in the form of visual cues were presented to inform the subjects of the presence and direction of the perturbation. To analyse the interconnection between sensorimotor adaptation and reward learning, we designed a computational model that distinguishes between the two prediction errors. Our results indicate that subjects adapted to novel perturbations irrespective of the type of prediction error they received during learning, and they converged towards the same movement patterns. Sensorimotor adaptations led to a pronounced aftereffect, while adaptation based on reward consequences produced smaller aftereffects suggesting that reward learning does not alter the internal model to the same degree as sensorimotor adaptation. Even though all subjects had learned to counteract two different perturbations separately, only those who relied on explicit learning using reward prediction error could timely adapt to the randomly changing perturbation. The results from the computational model suggest that sensorimotor and reward learning operate through distinct adaptation processes and that only sensorimotor adaptation changes the internal model, whereas reward learning employs explicit strategies that do not result in aftereffects. Additionally, we demonstrate that when humans learn motor tasks, they utilize both learning processes to successfully adapt to the new environments.

Journal article

Takagi A, Bagnato C, Melendez-Calderon A, Jarrasse N, Ganesh G, Burdet E et al., 2023, Competition Increases the Effort Put Into a Physical Interaction Task., IEEE Trans Haptics, Vol: 16, Pages: 719-725

Physical interaction can enhance motor learning, but it remains unclear what type of interaction is best suited to increasing the active effort put into a task, which should support learning. Here, we used the same interactive tracking task with different instructions to induce three training conditions: competition, collaboration, and self-improvement, where partners improve their own performance while interacting haptically with each other. The effort was gauged by measuring the total normalized muscle activity. Feedback of task performance and the haptic dynamics were identical in all three training conditions, so the effort needed to complete the task was the same. Only the instructions to 'compete with the partner', 'improve your and your partner's accuracy' and 'improve your accuracy' were different among the competition, collaboration, and self-improvement conditions, respectively. Despite having the same goal of maximizing self-performance during competition and self-improvement, participants exerted significantly more effort during competition, and their tracking accuracy was highest during competitive practice. Least effort was put into collaboration but tracking accuracy during collaboration was comparable to self-improvement. Our results suggest that interactive haptic competition can induce higher active drive or effort than either collaborative training or self-focused practice.

Journal article

Huang Y, Eden J, Ivanova E, Burdet E et al., 2023, Can Training Make Three Arms Better Than Two Heads for Trimanual Coordination?, IEEE OPEN JOURNAL OF ENGINEERING IN MEDICINE AND BIOLOGY, Vol: 4, Pages: 148-155

Journal article

Pena-Perez N, Mutalib SA, Eden J, Farkhatdinov I, Burdet E et al., 2023, The Impact of Stiffness in Bimanual Versus Dyadic Interactions Requiring Force Exchange., IEEE Trans Haptics, Vol: 16, Pages: 609-615

During daily activities, humans routinely manipulate objects bimanually or with the help of a partner. This work explored how bimanual and dyadic coordination modes are impacted by the object's stiffness, which conditions inter-limb haptic communication. For this, we recruited 20 healthy participants who performed a virtual task inspired by object handling, where we looked at the initiation of force exchange and its continued maintenance while tracking. Our findings suggest that while individuals and dyads displayed different motor behaviours, which may stem from the dyad members' need to estimate their partner's actions, they exhibited similar tracking accuracy. For both coordination modes, increased stiffness resulted in better tracking accuracy and more correlated motions, but required a larger effort through increased average torque. These results suggest that stiffness may be a key consideration in applications such as rehabilitation, where bimanual or external physical assistance is often provided.

Journal article

Cazenave L, Einenkel M, Yurkewich A, Endo S, Hirche S, Burdet E et al., 2023, Hybrid robotic and electrical stimulation assistance can enhance performance and reduce mental demand, IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol: 31, Pages: 4063-4072, ISSN: 1534-4320

Combining functional electrical stimulation (FES) and robotics may enhance recovery after stroke, by providing neural feedback with the former while improving quality of motion and minimizing muscular fatigue with the latter. Here, we explored whether and how FES, robot assistance and their combination affect users' performance, effort, fatigue and user experience. 15 healthy participants performed a wrist flexion/extension tracking task with FES and/or robotic assistance. Tracking performance improved during the hybrid FES-robot and the robot-only assistance conditions in comparison to no assistance, but no improvement was observed when only FES was used. Fatigue, muscular and voluntary effort were estimated from electromyographic recordings. Total muscle contraction and volitional activity were lowest with robotic assistance, whereas fatigue levels did not change between the conditions. The NASA Task Load Index answers indicate that participants found the task less mentally demanding during the hybrid and robot conditions than the FES condition. The addition of robotic assistance to FES training might thus facilitate increased user engagement compared to robot training, and allow longer motor training sessions than with FES assistance.

Journal article

Carboni G, Nanayakkara T, Takagi A, Burdet E et al., 2022, Adapting the visuo-haptic perception through muscle coactivation (vol 11, 21986, 2021), SCIENTIFIC REPORTS, Vol: 12, ISSN: 2045-2322

Journal article

Yurkewich A, Ortega S, Sanchez J, Wang RH, Burdet E et al., 2022, Integrating hand exoskeletons into goal-oriented clinic and home stroke and spinal cord injury rehabilitation, Journal of Rehabilitation and Assistive Technologies Engineering, Vol: 9, Pages: 1-11, ISSN: 2055-6683

Introduction: Robotic exoskeletons are emerging as rehabilitation and assistive technologies that simultaneously restore function and enable independence for people with disabilities. Aim: We investigated the feasibility and orthotic and restorative effects of an exoskeleton-supported goal-directed rehabilitation program for people with hand impairments after stroke or Spinal Cord Injury (SCI). Method: A single-arm case-series feasibility study was conducted using a wearable untethered hand exoskeleton during goal-directed therapy programs with in-clinic and at-home components. Therapists trained stroke and SCI patients to use a hand exoskeleton during rehabilitation exercises, activities of daily living and patient-selected goals. Each patient received a 1-hour in-clinic training session on five consecutive days, then took the exoskeleton home for two consecutive days to perform therapist-recommended tasks. Goal Attainment Scaling (GAS) and the Box and Block Test (BBT) were administered at baseline, after in-clinic therapy and after home use, with and again without wearing the exoskeleton. The System Usability Scale (SUS), Motor Activity Log, and Fugl-Meyer Assessment were also administered to assess the intervention's acceptability, adherence, usability and effectiveness. Results: Four stroke patients (Chedoke McMaster Stage of Hand 2–4) and one SCI patient (ASIA C8 Motor Stage 1) 23 ± 19 months post-injury wore the hand exoskeleton to perform 280 ± 23 exercise repetitions in the clinic and additional goal-oriented tasks at home. The patients performed their own goals and the dexterity task with higher performance following the 7-day therapy program in comparison to baseline, both with the exoskeleton (ΔGAS: 18 ± 10, ΔBBT: 1 ± 5) and without it (ΔGAS: 14 ± 14, ΔBBT: 3 ± 4). Therapists and patients provided ‘good’ SUS ratings of 78 ± 6 and no harmful events were reported.

Journal article

Ivanova E, Eden J, Carboni G, Krueger J, Burdet E et al., 2022, Interaction with a reactive partner improves learning in contrast to passive guidance, SCIENTIFIC REPORTS, Vol: 12, ISSN: 2045-2322

Journal article

Takagi A, Gomi H, Burdet E, Koike Y et al., 2022, A model predictive control strategy to regulate movements and interactions

Humans are adept at moving the arm to interact with objects and surfaces. The brain is thought to regulate motion and interactions using two different controllers, one specialized for movements and the other for force regulation. However, it remains unclear whether different control mechanisms are necessary. Here we show that the brain can employ a single high-level control strategy for both movement and interaction control. The Model Predictive Control (MPC) strategy introduced in this paper uses an internal model of the environment to plan the arm’s muscle activity whilst updating its predictions using periodic feedback. Computer simulations demonstrate MPC’s ability to produce human-like movements and after-effects in free and force field environments. It can simultaneously regulate both force and stiffness during interactions, and can accomplish motor tasks demanding transitions between motion and interaction control. Model Predictive Control promises to be an important tool to test ideas of motor control as it can handle nonlinear dynamics with changing environments and goals without having to specify the movement duration.

Journal article
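The receding-horizon idea behind the MPC strategy in the abstract above can be illustrated with a minimal textbook sketch: a discrete double integrator stands in for the arm, and at every step a finite-horizon quadratic cost is re-optimized but only the first planned control is applied. This is a generic formulation with illustrative gains, not the paper's muscle-space model:

```python
import numpy as np

# Double-integrator "arm": state x = [position, velocity], control u = force.
dt = 0.05
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])

def mpc_step(x, target, N=20, r=1e-4):
    """Plan N steps ahead, return only the first control (receding horizon)."""
    # Predicted positions over the horizon: pos = F @ x + G @ U
    F = np.vstack([np.linalg.matrix_power(A, k + 1)[0:1] for k in range(N)])
    G = np.zeros((N, N))
    for k in range(N):
        for j in range(k + 1):
            G[k, j] = (np.linalg.matrix_power(A, k - j) @ B)[0, 0]
    # Minimise ||G U + F x - target||^2 + r ||U||^2  (ridge least squares)
    H = G.T @ G + r * np.eye(N)
    U = np.linalg.solve(H, G.T @ (target - F @ x))
    return U[0]

x = np.array([0.0, 0.0])
for _ in range(100):          # closed loop: re-plan at every step
    u = mpc_step(x, target=1.0)
    x = A @ x + B.flatten() * u
print(round(x[0], 3))         # position settles close to the target 1.0
```

Because the plan is recomputed from the measured state at every step, the same loop handles disturbances, changing targets, and (with a nonlinear internal model) transitions between free motion and contact, without a pre-specified movement duration.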

Li Y, Sena A, Wang Z, Xing X, Babic J, van Asseldonk E, Burdet E et al., 2022, A review on interaction control for contact robots through intent detection, Progress in Biomedical Engineering, Vol: 4, Pages: 1-21, ISSN: 2516-1091

Interaction control presents opportunities for contact robots physically interacting with their human user, such as assistance targeted to each human user, communication of goals to enable effective teamwork, and task-directed motion resistance in physical training and rehabilitation contexts. Here we review the burgeoning field of interaction control in the control theory and machine learning communities, by analysing the exchange of haptic information between the robot and its human user, and how they share the task effort. We first review estimation and learning methods to predict the human user's intent despite the large uncertainty, variability and noise, and the limited observation of human motion. Based on this motion-intent core, typical interaction control strategies are described using a homotopy of shared control parameters. Recent methods of haptic communication and game theory are then presented to consider the co-adaptation of human and robot control and yield versatile interactive control as observed between humans. Finally, the limitations of the presented state of the art are discussed and directions for future research are outlined.

Journal article

Huang Y, Eden J, Ivanova E, Burdet E et al., 2022, Human Performance of Three Hands in Unimanual, Bimanual and Trimanual Tasks., Annu Int Conf IEEE Eng Med Biol Soc, Vol: 2022, Pages: 1493-1497

Trimanual operation using a robotic supernumerary limb is a new and challenging mechanism for human operators that could enable a single user to perform tasks requiring more than two hands. Foot-controlled interfaces have previously proven to be intuitive to control, enabling simple tasks to be performed. However, the effect of going from unimanual to bimanual and then to trimanual tasks on subjects' performance and coordination is not well understood. In this paper, unimanual, bimanual and trimanual teleoperation tasks were performed in a virtual reality scene to evaluate the impact of extending to trimanual actions. 15 participants were required to move their limbs together in a coordinated reaching activity. The results show that the addition of another hand resulted in an increase in operating time, which grew when going from unimanual to bimanual operation and increased further from bimanual to trimanual. Moreover, the success rate for performing bimanual and trimanual tasks was strongly influenced by the subject's performance in ipsilateral hand-foot activities, where the ipsilateral combination had a lower success rate than contralateral limbs. The addition of a hand did not affect two-hand coordination rates and in some cases even reduced coordination deviations. Clinical relevance - This work can contribute to building an efficient training and learning framework for human multi-limb motion control and coordination, for both rehabilitation and augmentation.

Journal article

Farkhatdinov I, Garnier A, Arichi T, Bleuler H, Burdet E et al., 2022, Evaluation of a Portable fMRI Compatible Robotic Wrist Interface., Annu Int Conf IEEE Eng Med Biol Soc, Vol: 2022, Pages: 2535-2539

This paper presents the evaluation of a portable fMRI-compatible haptic interface to study the brain correlates of sensorimotor control during wrist motion. The interface is actuated by a shielded DC motor located more than 2 m away from the 3T MR scanner's bore. The achievable wrist torque of the interface is up to 2 Nm, and the interface provides sufficient bandwidth for human motor control experiments. Ergonomic and fMRI compatibility testing with a 3T MR scanner showed that the interface is MR safe, compatible with a strong static magnetic field and radio frequency emission, and that its operation does not affect the quality of the acquired images. Clinical Relevance - We present and evaluate an fMRI-compatible robotic interface to study human wrist joint motor function.

Journal article

This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.
