Ratcliffe J, Soave F, Bryan-Kinns N, et al., 2021, Extended reality (XR) remote research: A survey of drawbacks and opportunities, Conference on Human Factors in Computing Systems - Proceedings
Extended Reality (XR) technology, such as virtual and augmented reality, is now widely used in Human-Computer Interaction (HCI), social science and psychology experimentation. However, these experiments are predominantly deployed in-lab with a co-present researcher. Remote experiments, without co-present researchers, have not flourished, despite the success of remote approaches for non-XR investigations. This paper summarises findings from a 30-item survey of 46 XR researchers to understand perceived limitations and benefits of remote XR experimentation. Our thematic analysis identifies concerns common with non-XR remote research, such as participant recruitment, as well as XR-specific issues, including safety and hardware variability. We identify potential positive affordances of XR technology, including leveraging data collection functionalities built into HMDs (e.g. hand, gaze tracking) and the portability and reproducibility of an experimental setting. We suggest that XR technology could be conceptualised as an interactive technology and a capable data-collection device suited for remote experimentation.
Sun F, Zang W, Huang H, et al., 2021, Accelerometer-Based Key Generation and Distribution Method for Wearable IoT Devices, IEEE INTERNET OF THINGS JOURNAL, Vol: 8, Pages: 1636-1650, ISSN: 2327-4662
Huang HY, Farkhatdinov I, Arami A, et al., 2021, Cable-driven robotic interface for lower limb neuromechanics identification, IEEE Transactions on Biomedical Engineering, Vol: 68, Pages: 461-469, ISSN: 0018-9294
This paper presents a versatile cable-driven robotic interface to investigate the single-joint neuromechanics of the hip, knee and ankle in the sagittal plane. This endpoint-based interface offers highly dynamic interaction and accurate position control (as is typically required for neuromechanics identification), and provides measurements of position, interaction force and EMG of leg muscles. It can be used with the subject upright, corresponding to a natural posture during walking or standing, and does not impose kinematic constraints on a joint, in contrast to existing interfaces. Mechanical evaluations demonstrated that the interface yields a rigidity above 500 N/m with low viscosity. Tests with a rigid dummy leg and linear springs show that it can identify the mechanical impedance of a limb accurately. A smooth perturbation is developed and tested with a human subject, which can be used to estimate the hip neuromechanics.
Din AR, Althoefer K, Farkhatdinov I, et al., 2021, Innovation in the time of SARS-CoV-2: A collaborative journey between NHS clinicians, engineers, academics and industry., Surgeon, ISSN: 1479-666X
During the pandemic healthcare faced great pressure on the availability of protective equipment. This paper describes the entire novel innovative process of design optimisation, production and deployment of face-visors to NHS frontline workers during SARS-CoV-2 pandemic. The described innovative journey spans collaboration between clinicians and academic colleagues for design to the implementation with industry partners of a face-visor for use in a healthcare setting. It identifies the enablers and barriers to development along with the strategies employed to produce a certified reusable, adjustable, high volume and locally produced face-visor. The article also explores aspects of value, scalability, spread and sustainability all of which are essential features of innovation.
Palermo F, Konstantinova J, Althoefer K, et al., 2020, Automatic Fracture Characterization Using Tactile and Proximity Optical Sensing, FRONTIERS IN ROBOTICS AND AI, Vol: 7, ISSN: 2296-9144
Huang HY, Arami A, Farkhatdinov I, et al., 2020, The Influence of Posture, Applied Force and Perturbation Direction on Hip Joint Viscoelasticity, IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, Vol: 28, Pages: 1138-1145, ISSN: 1534-4320
Palermo F, Konstantinova J, Althoefer K, et al., 2020, Implementing Tactile and Proximity Sensing for Crack Detection, Pages: 632-637, ISSN: 1050-4729
Remote characterisation of the environment during physical robot-environment interaction is an important task commonly accomplished in telerobotics. This paper demonstrates how tactile and proximity sensing can be efficiently used to perform automatic crack detection. A custom-designed integrated tactile and proximity sensor is implemented. It measures the deformation of its body when interacting with the physical environment and distance to the environment's objects with the help of fibre optics. This sensor was used to slide across different surfaces and the data recorded during the experiments was used to detect and classify cracks, bumps and undulations. The proposed method uses machine learning techniques (mean absolute value as feature and random forest as classifier) to detect cracks and determine their width. An average crack detection accuracy of 86.46% and width classification accuracy of 57.30% is achieved. Kruskal-Wallis results (p<0.001) indicate statistically significant differences among results obtained when analysing only force data, only proximity data and both force and proximity data. In contrast to previous techniques, which mainly rely on visual modality, the proposed approach based on optical fibres is suitable for operation in extreme environments, such as nuclear facilities in which nuclear radiation may damage the electronic components of video cameras.
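The feature-and-classifier pipeline described in this abstract could be sketched roughly as follows. This is an illustrative sketch only, not the authors' code: the mean-absolute-value (MAV) feature step is shown in plain Python, and a simple fixed threshold stands in for the random forest the paper actually trains. All data and parameter values below are made up.

```python
# Illustrative sketch (not the authors' code): mean absolute value (MAV)
# of each sensing channel as the feature, with a threshold stand-in for
# the paper's random-forest classifier.

def mav(window):
    """Mean absolute value of one channel's samples over a sliding window."""
    return sum(abs(x) for x in window) / len(window)

def extract_features(force_window, proximity_window):
    """One MAV feature per modality, matching the force + proximity fusion idea."""
    return (mav(force_window), mav(proximity_window))

def classify(features, threshold=0.5):
    """Stand-in classifier: flag a crack when the force MAV jumps.
    The paper trains a random forest instead of this fixed threshold."""
    return "crack" if features[0] > threshold else "no crack"

force = [0.1, -0.9, 1.2, -1.1]    # synthetic force samples
prox = [0.2, 0.3, 0.25, 0.28]     # synthetic proximity samples
feats = extract_features(force, prox)
print(classify(feats))            # -> crack (force MAV = 0.825 > 0.5)
```

In the paper, combining both modalities' features (rather than force or proximity alone) is what yields the statistically significant accuracy difference.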
Soave F, Bryan-Kinns N, Farkhatdinov I, 2020, A preliminary study on full-body haptic stimulation on modulating self-motion perception in virtual reality, Pages: 461-469, ISSN: 0302-9743
We introduce a novel experimental system to explore the role of vibrotactile haptic feedback in Virtual Reality (VR) to induce the self-motion illusion. Self-motion (also called vection) has mostly been studied through visual and auditory stimuli, and little is known about how the illusion can be modulated by the addition of vibrotactile feedback. Our study focuses on whole-body haptic feedback in which the vibration is dynamically generated from the sound signal of the Virtual Environment (VE). We performed a preliminary study and found that audio and haptic modalities generally increase the intensity of vection over a visual-only stimulus. We observe higher ratings of self-motion intensity when the vibrotactile stimulus is added to the virtual scene. We also analyzed data obtained with the igroup presence questionnaire (IPQ), which shows that haptic feedback has a generally positive effect on presence in the virtual environment, and conducted a qualitative survey that revealed interesting and often overlooked aspects, such as the implications of using a joystick to collect data in perception studies and the concept of vection in relation to people's experience and cognitive interpretation of self-motion.
Brown JP, Farkhatdinov I, 2020, Soft Haptic Interface based on Vibration and Particle Jamming, IEEE Haptics Symposium (HAPTICS), Publisher: IEEE, Pages: 1-6, ISSN: 2324-7347
Junput B, Farkhatdinov I, Jamone L, 2020, Touch It, Rub It, Feel It! Haptic Rendering of Physical Textures with a Low Cost Wearable System, Pages: 274-286, ISSN: 0302-9743
Information about the texture of an object’s surface is crucial for its recognition and robust manipulation. During robotic teleoperation or interaction with Virtual Reality, it is important to feed back such information to the human user. However, most available solutions for haptic feedback are expensive and/or cumbersome. In this paper we propose a low-cost and wearable system that allows users to feel the texture of physical objects by virtually rubbing them. Our main contributions are: i) a system for encoding a virtual representation of the texture of physical materials; ii) a system to haptically render such virtual representation on the user’s fingertips; iii) an experimental validation of the combined system in an object recognition task. We show that users can successfully recognize physical objects with different textures by virtually rubbing their surfaces using the proposed system.
Omarali B, Denoun B, Althoefer K, et al., 2020, Virtual Reality based Telerobotics Framework with Depth Cameras, 29th IEEE International Conference on Robot and Human Interactive Communication (IEEE RO-MAN), Publisher: IEEE, Pages: 1217-1222, ISSN: 1944-9445
Perez NP, Tokarchuk L, Burdet E, et al., 2019, Exploring user motor behaviour in bimanual interactive video games, ISSN: 2325-4270
Video games have proved very valuable in rehabilitation technologies. They guide therapy and keep patients engaged and motivated. However, in order to realize their full potential, a good understanding is required of the players' motor control. In particular, little is known regarding player behaviour in tasks demanding bimanual interaction. In this work, an experiment was designed to improve the understanding of such tasks. A driving game was developed in which players were asked to guide a differential wheeled robot (depicted as a rocket) along a trajectory. The rocket could be manipulated by using an Xbox controller's triggers, each supplying torque to the corresponding side of the robot. Such a task is redundant, i.e. there exists an infinite number of input combinations to yield a given outcome. This allows the player to strategize according to their own preference. 10 participants were recruited to play this game and their input data was logged for subsequent analysis. Two different motor strategies were identified: an "intermittent" input pattern versus a "continuous" one. It is hypothesized that the choice of behaviour depends on motor skill and minimization of effort and error. Further testing is necessary to determine the exact relationship between these aspects.
Farkhatdinov I, Michalska H, Berthoz A, et al., 2019, Idiothetic Verticality Estimation Through Head Stabilization Strategy, IEEE ROBOTICS AND AUTOMATION LETTERS, Vol: 4, Pages: 2677-2682, ISSN: 2377-3766
Farkhatdinov I, Ebert J, van Oort G, et al., 2019, Assisting human balance in standing with a robotic exoskeleton, IEEE Robotics and Automation Letters, Vol: 4, Pages: 414-421, ISSN: 2377-3766
This letter presents an experimental study on balance recovery control with a lower limb exoskeleton robot. Four participants were subjected to a perturbation during standing, a forward force impulse applied to their pelvis that forced them to step forward with the right leg for balance recovery. Trials with and without exoskeleton assistance to move the stepping leg's thigh were conducted to investigate the influence of the exoskeleton's control assistance on balancing performance and a potential adaptation. Analysis of the body kinematics and muscle activation demonstrates that robotic assistance: first, was easy to use and did not require learning, nor inhibited the healthy stepping behavior; second, it modified the stepping leg trajectories by increasing hip and knee movement; third, increased reaction speed and decreased the step duration; and finally, generally increased biceps femoris and rectus femoris muscle activity.
Farkhatdinov I, Michalska H, Berthoz A, et al., 2019, Gravito-inertial ambiguity resolved through head stabilization, PROCEEDINGS OF THE ROYAL SOCIETY A-MATHEMATICAL PHYSICAL AND ENGINEERING SCIENCES, Vol: 475, ISSN: 1364-5021
Otaran A, Farkhatdinov I, 2019, Modeling and Control of Ankle Actuation Platform for Human-Robot Interaction, Pages: 338-348, ISSN: 0302-9743
We present the design of a one-degree-of-freedom ankle actuation platform for human-robot interaction. The platform is actuated with a DC motor through a capstan drive mechanism. The results for platform dynamics identification including friction characterisation are presented. Control experiments demonstrate that a linear regulator with gravity compensation can be used to control the inclination of the platform efficiently.
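The control scheme named in this abstract, a linear regulator with gravity compensation, could be sketched as follows. This is an illustrative sketch under assumed dynamics, not the authors' controller: the mass, centre-of-mass distance and PD gains below are placeholder values.

```python
import math

# Illustrative sketch (not the authors' controller): a PD regulator with
# feedforward gravity compensation for a one-DoF inclined platform.
# M, L, KP, KD are placeholder values, not identified parameters.

M, L, G = 2.0, 0.3, 9.81   # platform mass [kg], CoM distance [m], gravity
KP, KD = 40.0, 4.0         # PD gains (placeholder values)

def control_torque(theta, theta_dot, theta_ref):
    """PD feedback plus cancellation of the gravity torque M*G*L*sin(theta)."""
    gravity = M * G * L * math.sin(theta)
    return KP * (theta_ref - theta) - KD * theta_dot + gravity

# At the reference inclination with zero velocity, only the gravity
# compensation term remains:
print(round(control_torque(0.1, 0.0, 0.1), 4))   # -> 0.5876
```

Cancelling the gravity torque lets the linear PD terms regulate the inclination as if the platform were unloaded, which is why a simple linear regulator suffices.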
Omarali B, Palermo F, Valle M, et al., 2019, Position and Velocity Control for Telemanipulation with Interoperability Protocol, Pages: 316-324, ISSN: 0302-9743
In this paper we describe how a generic interoperability telerobotics protocol can be applied to master-slave robotic systems operating in position-position, position-speed and hybrid control modes. The interoperability protocol allows robust and efficient data exchange for teleoperation systems; however, it had not been shown how it can accommodate switching between position and rate control modes. Here we propose a general framework for hybrid position and rate control modes with the interoperability protocol. Furthermore, we demonstrate experimentally that the framework is suitable for robotic teleoperation systems in which a human operator can switch between position-position and position-speed mappings of the master and slave robots’ workspaces.
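The distinction between the two mapping modes in this abstract can be made concrete with a small sketch. This is illustrative only, not the interoperability protocol itself: the gains, time step and mode names are placeholder choices.

```python
# Illustrative sketch (not the interoperability protocol): switching a
# master-slave mapping between position and rate control modes.
# k_pos, k_rate and dt are placeholder values.

def slave_command(master_pos, slave_pos, mode, dt=0.01, k_pos=1.0, k_rate=0.5):
    """Map the master position to a slave position command.

    position mode: the slave tracks the master's position directly.
    rate mode:     the master displacement sets the slave's velocity, so
                   the slave position integrates k_rate * master_pos.
    """
    if mode == "position":
        return k_pos * master_pos
    elif mode == "rate":
        return slave_pos + k_rate * master_pos * dt
    raise ValueError(mode)

# Position mode follows the master instantly; rate mode keeps moving
# while the master is held at a constant offset.
print(slave_command(0.2, 0.0, "position"))   # -> 0.2
s = 0.0
for _ in range(100):                         # hold the master at 0.2 for 1 s
    s = slave_command(0.2, s, "rate")
print(round(s, 3))                           # -> 0.1  (0.5 * 0.2 * 1.0 s)
```

Rate mode is what lets a small master workspace command an unbounded slave workspace, which is why teleoperation systems benefit from switching between the two.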
Farkhatdinov I, Michalska H, Berthoz A, et al., 2019, Idiothetic Verticality Estimation through Head Stabilization Strategy, 15th Conference on Robotics - Science and Systems, Publisher: MIT PRESS
Danabek D, Otaran A, Althoefer K, et al., 2019, Mobile robot trajectory analysis with the help of vision system, Pages: 273-279, ISSN: 0302-9743
We present a vision-based motion analysis method for single and multiple mobile robots which allows quantifying the robots' behaviour. The method determines how often and by how much each robot turns and moves straight. The motion analysis relies on the robot trajectories acquired online or offline by an external camera, and the algorithm is based on iteratively performed linear regression to detect straight and curved paths for each robot. The method is experimentally validated with an indoor mobile robotic system. Potential applications include remote robot inspection, rescue robotics and multi-robot system coordination.
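The core idea of the abstract above, iterated linear regression over trajectory segments to separate straight paths from turns, could be sketched as follows. This is an illustrative sketch, not the authors' algorithm: the window size, threshold and sample trajectories are made up.

```python
# Illustrative sketch (not the authors' algorithm): fit a line to a sliding
# window of trajectory points and use the residual to label each segment
# as straight or turning. Window size and threshold are placeholders.

def fit_line_residual(points):
    """Least-squares line fit y = a*x + b; returns the mean squared residual."""
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return sum((p[1] - (a * p[0] + b)) ** 2 for p in points) / n

def label_segments(traj, window=4, threshold=1e-3):
    """Slide a window over the trajectory; low residual => straight segment."""
    labels = []
    for i in range(len(traj) - window + 1):
        r = fit_line_residual(traj[i:i + window])
        labels.append("straight" if r < threshold else "turn")
    return labels

straight = [(x, 2 * x + 1) for x in range(4)]   # collinear points
curve = [(4, 9.5), (5, 12.0), (6, 15.5)]        # bends away from the line
print(label_segments(straight + curve))         # -> ['straight', 'turn', 'turn', 'turn']
```

Counting the labelled segments per robot then directly yields the "how often and by how much" statistics the method reports.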
Farkhatdinov I, Michalska H, Berthoz A, et al., 2018, Review of Anthropomorphic Head Stabilisation and Verticality Estimation in Robots, Biomechanics of Anthropomorphic Systems
Ogrinc M, Farkhatdinov I, Walker R, et al., 2018, Sensory integration of apparent motion speed and vibration magnitude, IEEE Transactions on Haptics, Vol: 11, Pages: 455-463, ISSN: 1939-1412
Tactile apparent motion can display directional information in an intuitive way. It can for example be used to give directions to visually impaired individuals, or for waypoint navigation while cycling on busy streets, when vision or audition should not be loaded further. However, although humans can detect very short tactile patterns, discriminating between similar motion speeds has been shown to be difficult. Here we develop and investigate a method where the speed of tactile apparent motion around the user's wrist is coupled with vibration magnitude. This redundant coupling is used to produce tactile patterns from slow & weak to fast & strong. We compared the just noticeable difference (JND) of the coupled and the individual variables. The results show that the perception of the coupled variable can be characterised by a JND smaller than the JNDs of the individual variables. This allowed us to create short tactile patterns (tactons) for display of direction and speed, which can be distinguished significantly better than tactons based on motion alone. Additionally, most subjects were also able to identify the coupled-variable tactons better than the magnitude-based tactons.
Ogrinc M, Farkhatdinov I, Walker R, et al., 2018, Horseback riding therapy for a deafblind individual enabled by a haptic interface, Assistive Technology: The Official Journal of RESNA, Vol: 30, Pages: 143-150, ISSN: 1949-3614
We present a haptic interface to help deafblind people to practice horseback riding as a recreational and therapeutic activity. Horseback riding is a form of therapy which can improve self-esteem and the sensation of independence. It has been shown to benefit people with various medical conditions, including autism. However, in the case of deafblind riders, an interpreter must stand by at all times to communicate with the rider by touch. We developed a simple interface that enables deafblind people to enjoy horseback riding while the instructor remotely provides cues, which improves their independence. Experiments demonstrated that an autistic deafblind individual exhibits similar responses to navigational cues as an unimpaired rider. Motivation is an important factor in therapy, and is frequently determinant of its outcome; therefore, the user's attitude toward the therapy methods is key. The answers to questionnaires filled in by the rider, family, and the instructor show that our technique gives the rider a greater sense of independence and more joy compared to standard riding where the instructor is walking along with the horse.
Pruks V, Farkhatdinov I, Ryu J-H, 2018, Preliminary Study on Real-Time Interactive Virtual Fixture Generation Method for Shared Teleoperation in Unstructured Environments, 11th International Conference on Haptics - Science, Technology, and Applications (EuroHaptics), Publisher: SPRINGER INTERNATIONAL PUBLISHING AG, Pages: 648-659, ISSN: 0302-9743
Duvernoy B, Farkhatdinov I, Topp S, et al., 2018, Electromagnetic Actuator for Tactile Communication, 11th International Conference on Haptics - Science, Technology, and Applications (EuroHaptics), Publisher: SPRINGER INTERNATIONAL PUBLISHING AG, Pages: 14-24, ISSN: 0302-9743
Farkhatdinov I, Roehri N, Burdet E, 2017, Anticipatory detection of turning in humans for intuitive control of robotic mobility assistance, Bioinspiration and Biomimetics, Vol: 12, ISSN: 1748-3182
Many wearable lower-limb robots for walking assistance have been developed in recent years. However, it remains unclear how they can be commanded in an intuitive and efficient way by their user. In particular, providing robotic assistance to neurologically impaired individuals in turning remains a significant challenge. The control should be safe to the users and their environment, yet yield sufficient performance and enable natural human-machine interaction. Here, we propose using the head and trunk anticipatory behaviour in order to detect the intention to turn in a natural, non-intrusive way, and use it for triggering turning movement in a robot for walking assistance. We therefore study head and trunk orientation during locomotion of healthy adults, and investigate upper body anticipatory behaviour during turning. The collected walking and turning kinematics data are clustered using the k-means algorithm, and cross-validation tests and the k-nearest neighbours method are used to evaluate the performance of turning detection during locomotion. Tests with seven subjects exhibited accurate turning detection. The head anticipated turning by 400–500 ms on average across all subjects. Overall, the proposed method detected turning 300 ms after its initiation and 1230 ms before the turning movement was completed. Using head anticipatory behaviour enabled turning to be detected about 100 ms faster, compared to turning detection using only pelvis orientation measurements. Finally, it was demonstrated that the proposed turning detection can improve the quality of human–robot interaction by improving the control accuracy and transparency.
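The classification step named in this abstract, k-nearest neighbours over clustered kinematics, can be illustrated with a minimal 1-NN example. This is a sketch only, not the authors' pipeline: the (head yaw rate, pelvis yaw rate) features and all values below are made up to show the head-leads-pelvis idea.

```python
# Illustrative sketch (not the authors' pipeline): 1-nearest-neighbour
# classification on made-up (head yaw rate, pelvis yaw rate) features.
# The paper clusters kinematics with k-means and evaluates detection
# with k-NN under cross-validation; here only the 1-NN step is shown.

def nearest_label(sample, training):
    """Return the label of the closest training sample (1-NN, squared L2)."""
    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(training, key=lambda t: sq_dist(sample, t[0]))[1]

training = [
    ((0.05, 0.02), "walk"),   # small yaw rates: walking straight
    ((0.02, 0.01), "walk"),
    ((0.90, 0.30), "turn"),   # head yaw leads pelvis yaw before a turn
    ((0.80, 0.25), "turn"),
]

print(nearest_label((0.85, 0.28), training))   # -> turn
print(nearest_label((0.03, 0.02), training))   # -> walk
```

Including the head channel in the feature vector is what buys the earlier detection: the head reorients toward the turn before the pelvis does, so "turn"-like samples appear in feature space sooner.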
Stone A, Farkhatdinov I, 2017, Robotics Education for Children at Secondary School Level and Above, 18th Annual Conference on Towards Autonomous Robotics (TAROS), Publisher: SPRINGER INTERNATIONAL PUBLISHING AG, Pages: 576-585, ISSN: 0302-9743
Huang H-Y, Farkhatdinov I, Arami A, et al., 2017, Modelling Neuromuscular Function of SCI Patients in Balancing, 3rd International Conference on NeuroRehabilitation (ICNR), Publisher: SPRINGER INTERNATIONAL PUBLISHING AG, Pages: 355-359, ISSN: 2195-3562
Wilhelm E, Mace M, Takagi A, et al., 2016, Investigating Tactile Sensation in the Hand Using a Robot-Based Tactile Assessment Tool, 10th International Conference on Haptics - Perception, Devices, Control, and Applications (EuroHaptics), Publisher: SPRINGER INTERNATIONAL PUBLISHING AG, Pages: 17-24, ISSN: 0302-9743
Ogrinc M, Farkhatdinov I, Walker R, et al., 2016, Deaf-Blind Can Practise Horse Riding with the Help of Haptics, 10th International Conference on Haptics - Perception, Devices, Control, and Applications (EuroHaptics), Publisher: SPRINGER INTERNATIONAL PUBLISHING AG, Pages: 452-461, ISSN: 0302-9743
This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.