Imperial College London

Dr Ildar Farkhatdinov

Faculty of Engineering, Department of Bioengineering

Honorary Lecturer

Contact

i.farkhatdinov Website

Location

Royal School of Mines, South Kensington Campus

Publications

63 results found

Vitanov I, Farkhatdinov I, Denoun B, Palermo F, Otaran A, Brown J, Omarali B, Abrar T, Hansard M, Oh C, Poslad S, Liu C, Godaba H, Zhang K, Jamone L, Althoefer K et al., 2021, A Suite of Robotic Solutions for Nuclear Waste Decommissioning, ROBOTICS, Vol: 10

Journal article

Din AR, Althoefer K, Farkhatdinov I, Brown J, Morgan C, Shahdad S et al., 2021, Innovation in the time of SARS-CoV-2: A collaborative journey between NHS clinicians, engineers, academics and industry, SURGEON-JOURNAL OF THE ROYAL COLLEGES OF SURGEONS OF EDINBURGH AND IRELAND, Vol: 19, Pages: E281-E288, ISSN: 1479-666X

Journal article

Otaran A, Farkhatdinov I, 2021, Haptic Ankle Platform for Interactive Walking in Virtual Reality., IEEE Trans Vis Comput Graph, Vol: PP

This paper presents an impedance-type ankle haptic interface for providing users with an immersive navigation experience in virtual reality (VR). The ankle platform, actuated by an electric motor with feedback control, enables the use of foot-tapping gestures to create a walking experience similar to real walking and to haptically render different types of walking terrain. Experimental studies demonstrated that the interface can be easily used to generate virtual walking and is capable of rendering terrains such as hard and soft surfaces as well as multi-layer complex dynamic terrains. The designed system is a seated VR locomotion interface, allowing its user to maintain a stable seated posture while comfortably navigating a virtual scene.

Journal article
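
The abstract above describes rendering walking terrains with an impedance-type actuated platform. As a rough illustration only, here is a minimal Python sketch of a generic spring-damper (impedance) terrain model computed once per control cycle; the function name, stiffness and damping values are assumptions for illustration and are not taken from the paper.

```python
# Minimal impedance-type terrain rendering sketch (illustrative assumptions only).
# A spring-damper force is computed from the foot plate's penetration into a
# virtual surface; stiffness and damping values are arbitrary examples.

def terrain_force(position, velocity, surface_height=0.0,
                  stiffness=2000.0, damping=20.0):
    """Return a motor force command (N) for one control cycle."""
    penetration = surface_height - position   # > 0 when the plate is below the surface
    if penetration <= 0.0:
        return 0.0                             # no contact, no force
    # Spring-damper (impedance) model of the virtual terrain.
    force = stiffness * penetration - damping * velocity
    return max(force, 0.0)                     # the surface can only push back


# A "soft" terrain is rendered simply by lowering the stiffness.
print(terrain_force(position=-0.01, velocity=-0.05))                   # hard surface
print(terrain_force(position=-0.01, velocity=-0.05, stiffness=300.0))  # soft surface
```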

Slonina Z, Bonzini AA, Brown J, Wang S, Farkhatdinov I, Althoefer K, Jamone L, Versace E et al., 2021, Using RoboChick to Identify the Behavioral Features Promoting Social Interactions

Studies of social behaviors in animals are faced with various methodological difficulties, which can be addressed by using controlled artificial social agents. Previous studies have shown that various animal species interact with passive replicas or interactive robots that mimic their conspecifics. In the case of chickens, filial attachment (imprinting) to robots is observed in young chicks. However, the features and functions of the robots that maximize the efficiency of chicken-robot attachment have not yet been identified. Therefore, we designed RoboChick, a simple robot that can be easily customized with different features. Further, we developed a protocol for assessing the attractiveness of each feature. In the current study, we tested the attractiveness of two RoboChick features during robot-chick interactions: the presence of flashing lights and vocalizations in response to chick interactions. Our proposed protocol proved suitable for assessing the efficacy of the features. RoboChick, which is open and modular, can be easily reproduced by other research groups and adapted to test different features in different experimental conditions.

Conference paper
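
The RoboChick abstract describes a robot whose features (flashing lights, vocalisations) are triggered by chick interactions. The sketch below is a purely hypothetical event-loop illustration of that stimulus-response idea; the class, sensor and actuator names are invented and do not reflect the actual RoboChick implementation.

```python
# Hypothetical stimulus-response loop with switchable features; the real
# RoboChick hardware and software are not described in the abstract.
import random
import time

class RoboChickSketch:
    def __init__(self, flash_lights=True, vocalise=True):
        self.flash_lights = flash_lights    # feature flags under test
        self.vocalise = vocalise

    def chick_detected(self):
        # Placeholder for a proximity or contact sensor reading.
        return random.random() < 0.3

    def respond(self):
        if self.flash_lights:
            print("LED: flash")
        if self.vocalise:
            print("Speaker: play call")

    def run(self, cycles=20, period_s=0.05):
        for _ in range(cycles):
            if self.chick_detected():
                self.respond()
            time.sleep(period_s)

# Test a single feature by enabling it in isolation.
RoboChickSketch(flash_lights=True, vocalise=False).run()
```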

Palermo F, Rincon-Ardila L, Oh C, Althoefer K, Poslad S, Venture G, Farkhatdinov I et al., 2021, Multi-modal robotic visual-tactile localisation and detection of surface cracks, Pages: 1806-1811, ISSN: 2161-8070

We present and validate a method to detect surface cracks with visual and tactile sensing. The proposed algorithm localises cracks in remote environments through videos/photos taken by an on-board robot camera. The identified areas of interest are then explored by a robot with a tactile sensor. Faster R-CNN object detection is used to identify the locations of potential cracks, and a random forest classifier is used for tactile identification of the cracks to confirm their presence. Offline and online experiments comparing vision-only and combined vision-and-tactile crack detection are demonstrated. Two experiments test the efficiency of the multi-modal approach: online detection accuracy and the time required to explore a surface and localise a crack. Exploring a cracked surface using combined visual and tactile modalities required a quarter of the time needed with the tactile modality alone. The accuracy of detection was also improved by combining the two modalities. This approach may also be implemented in extreme environments, since gamma radiation does not interfere with the sensing mechanism of fibre optic-based sensors.

Conference paper
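
The pipeline above uses a visual detector to propose candidate regions and a random forest on tactile data to confirm cracks. The following sketch illustrates only the tactile confirmation stage with scikit-learn on synthetic data; the feature extraction and signal statistics are assumptions made for the example, not the features used in the paper.

```python
# Sketch of a tactile confirmation stage: a random forest classifies
# fixed-length feature vectors extracted from tactile scans of candidate
# regions flagged by a visual detector. All data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def tactile_features(signal):
    """Toy feature extraction: mean, standard deviation and peak-to-peak."""
    return [signal.mean(), signal.std(), signal.max() - signal.min()]

# Synthetic training set: "crack" scans have larger variance than "no crack".
no_crack = rng.normal(0.0, 0.05, size=(50, 200))
crack = rng.normal(0.0, 0.30, size=(50, 200))
X = np.array([tactile_features(s) for s in np.vstack([no_crack, crack])])
y = np.array([0] * 50 + [1] * 50)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# At run time, each region proposed by the visual detector (e.g. Faster R-CNN)
# would be explored with the tactile sensor and confirmed like this:
new_scan = rng.normal(0.0, 0.28, size=200)
print("crack confirmed" if clf.predict([tactile_features(new_scan)])[0] else "no crack")
```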

Ratcliffe J, Soave F, Bryan-Kinns N, Tokarchuk L, Farkhatdinov I et al., 2021, Extended reality (XR) remote research: a survey of drawbacks and opportunities, CHI '21: CHI Conference on Human Factors in Computing Systems, Publisher: ACM, Pages: 1-13

Extended Reality (XR) technology - such as virtual and augmented reality - is now widely used in Human Computer Interaction (HCI), social science and psychology experimentation. However, these experiments are predominantly deployed in-lab with a co-present researcher. Remote experiments, without co-present researchers, have not flourished, despite the success of remote approaches for non-XR investigations. This paper summarises findings from a 30-item survey of 46 XR researchers to understand perceived limitations and benefits of remote XR experimentation. Our thematic analysis identifies concerns common to non-XR remote research, such as participant recruitment, as well as XR-specific issues, including safety and hardware variability. We identify potential positive affordances of XR technology, including leveraging data-collection functionalities built into HMDs (e.g. hand and gaze tracking) and the portability and reproducibility of an experimental setting. We suggest that XR technology could be conceptualised as an interactive technology and a capable data-collection device suited for remote experimentation.

Conference paper

Sun F, Zang W, Huang H, Farkhatdinov I, Li Y et al., 2021, Accelerometer-Based Key Generation and Distribution Method for Wearable IoT Devices, IEEE INTERNET OF THINGS JOURNAL, Vol: 8, Pages: 1636-1650, ISSN: 2327-4662

Journal article

Huang HY, Farkhatdinov I, Arami A, Bouri M, Burdet E et al., 2021, Cable-driven robotic interface for lower limb neuromechanics identification, IEEE Transactions on Biomedical Engineering, Vol: 68, Pages: 461-469, ISSN: 0018-9294

This paper presents a versatile cable-driven robotic interface to investigate single-joint neuromechanics of the hip, knee and ankle in the sagittal plane. This endpoint-based interface offers highly dynamic interaction and accurate position control (as is typically required for neuromechanics identification), and provides measurements of position, interaction force and EMG of leg muscles. It can be used with the subject upright, corresponding to a natural posture during walking or standing, and does not impose kinematic constraints on a joint, in contrast to existing interfaces. Mechanical evaluations demonstrated that the interface yields a rigidity above 500 N/m with low viscosity. Tests with a rigid dummy leg and linear springs show that it can identify the mechanical impedance of a limb accurately. A smooth perturbation, which can be used to estimate hip neuromechanics, is developed and tested with a human subject.

Journal article
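
The abstract mentions identifying the mechanical impedance of a limb from imposed perturbations. A standard way to do this is to fit a second-order model force = K*x + B*dx + M*ddx by least squares; the sketch below demonstrates the idea on synthetic signals, with all parameter values chosen arbitrarily rather than taken from the paper.

```python
# Impedance identification sketch: fit force = K*x + B*dx + M*ddx by ordinary
# least squares. The perturbation signal and "true" parameters are synthetic.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 2.0, 2000)
x = 0.01 * np.sin(2 * np.pi * 3 * t)       # imposed position perturbation (m)
dx = np.gradient(x, t)                      # velocity
ddx = np.gradient(dx, t)                    # acceleration

K_true, B_true, M_true = 500.0, 5.0, 2.0    # example stiffness, damping, inertia
force = K_true * x + B_true * dx + M_true * ddx + rng.normal(0.0, 0.1, t.size)

A = np.column_stack([x, dx, ddx])
(K_est, B_est, M_est), *_ = np.linalg.lstsq(A, force, rcond=None)
print(f"K = {K_est:.1f} N/m, B = {B_est:.2f} Ns/m, M = {M_est:.2f} kg")
```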

Otaran A, Farkhatdinov I, 2021, A Short Description of an Ankle-Actuated Seated VR Locomotion Interface, 28th IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR), Publisher: IEEE, Pages: 64-66

Conference paper

Soave F, Farkhatdinov I, Bryan-Kinns N, 2021, Multisensory Teleportation in Virtual Reality Applications, 28th IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR), Publisher: IEEE, Pages: 377-379

Conference paper

Soave F, Kumar AP, Bryan-Kinns N, Farkhatdinov I et al., 2021, Exploring Terminology for Perception of Motion in Virtual Reality, ACM Designing Interactive Systems Conference (DIS), Publisher: ASSOC COMPUTING MACHINERY, Pages: 171-179

Conference paper

Brown J, Farkhatdinov I, 2021, Shape-Changing Touch Pad based on Particle Jamming and Vibration, IEEE World Haptics Conference (WHC), Publisher: IEEE, Pages: 337-337

Conference paper

Brown J, Farkhatdinov I, 2021, A Soft, Vibrotactile, Shape-Changing Joystick for Telerobotics, IEEE World Haptics Conference (WHC), Publisher: IEEE, Pages: 1158-1158

Conference paper

Otaran A, Farkhatdinov I, 2021, A Cable-Driven Walking Interface with Haptic Feedback for Seated VR, IEEE World Haptics Conference (WHC), Publisher: IEEE, Pages: 592-592

Conference paper

Ratcliffe J, Soave F, Hoover M, Ortega FR, Bryan-Kinns N, Tokarchuk L, Farkhatdinov I et al., 2021, Remote XR Studies: Exploring Three Key Challenges of Remote XR Experimentation, CHI Conference on Human Factors in Computing Systems, Publisher: ASSOC COMPUTING MACHINERY

Conference paper

Palermo F, Konstantinova J, Althoefer K, Poslad S, Farkhatdinov I et al., 2020, Automatic Fracture Characterization Using Tactile and Proximity Optical Sensing, FRONTIERS IN ROBOTICS AND AI, Vol: 7, ISSN: 2296-9144

Journal article

Huang HY, Arami A, Farkhatdinov I, Formica D, Burdet E et al., 2020, The Influence of Posture, Applied Force and Perturbation Direction on Hip Joint Viscoelasticity, IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, Vol: 28, Pages: 1138-1145, ISSN: 1534-4320

Journal article

Palermo F, Konstantinova J, Althoefer K, Poslad S, Farkhatdinov I et al., 2020, Implementing Tactile and Proximity Sensing for Crack Detection, IEEE International Conference on Robotics and Automation (ICRA), Publisher: IEEE, Pages: 632-637, ISSN: 1050-4729

Conference paper

Junput B, Farkhatdinov I, Jamone L, 2020, Touch It, Rub It, Feel It! Haptic Rendering of Physical Textures with a Low Cost Wearable System, Pages: 274-286, ISSN: 0302-9743

Information about the texture of an object's surface is crucial for its recognition and robust manipulation. During robotic teleoperation or interaction with virtual reality, it is important to feed back such information to the human user. However, most available solutions for haptic feedback are expensive and/or cumbersome. In this paper we propose a low-cost, wearable system that allows users to feel the texture of physical objects by virtually rubbing them. Our main contributions are: i) a system for encoding a virtual representation of the texture of physical materials; ii) a system to haptically render such virtual representations on the user's fingertips; iii) an experimental validation of the combined system in an object recognition task. We show that users can successfully recognize physical objects with different textures by virtually rubbing their surfaces using the proposed system.

Conference paper
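
The contributions above include encoding a texture representation and rendering it on the fingertips. One common rendering rule makes the vibration frequency depend on sliding speed and a stored spatial period; the sketch below illustrates that idea only, and the texture entries and actuator limit are assumptions, not parameters from the paper.

```python
# Sketch of speed-dependent vibrotactile texture rendering: each texture is
# stored as a spatial period and a roughness amplitude, and the vibration
# frequency scales with sliding speed. All values are illustrative.
TEXTURES = {
    "fabric":    {"spatial_period_m": 0.0020, "amplitude": 0.3},
    "sandpaper": {"spatial_period_m": 0.0005, "amplitude": 0.9},
}

def vibration_command(texture_name, sliding_speed_m_s):
    tex = TEXTURES[texture_name]
    freq_hz = sliding_speed_m_s / tex["spatial_period_m"]   # temporal frequency
    freq_hz = min(freq_hz, 300.0)                            # assumed actuator limit
    return freq_hz, tex["amplitude"]

print(vibration_command("fabric", 0.05))     # slow rub over a coarse texture
print(vibration_command("sandpaper", 0.05))  # same speed over a finer texture
```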

Brown JP, Farkhatdinov I, 2020, Soft Haptic Interface based on Vibration and Particle Jamming, IEEE Haptics Symposium (HAPTICS), Publisher: IEEE, Pages: 1-6, ISSN: 2324-7347

Conference paper

Omarali B, Denoun B, Althoefer K, Jamone L, Valle M, Farkhatdinov I et al., 2020, Virtual Reality based Telerobotics Framework with Depth Cameras, 29th IEEE International Conference on Robot and Human Interactive Communication (IEEE RO-MAN), Publisher: IEEE, Pages: 1217-1222, ISSN: 1944-9445

Conference paper

Soave F, Bryan-Kinns N, Farkhatdinov I, 2020, A Preliminary Study on Full-Body Haptic Stimulation on Modulating Self-motion Perception in Virtual Reality, 7th International Conference on Augmented Reality, Virtual Reality and Computer Graphics (SALENTO AVR), Publisher: SPRINGER INTERNATIONAL PUBLISHING AG, Pages: 461-469, ISSN: 0302-9743

Conference paper

Perez NP, Tokarchuk L, Burdet E, Farkhatdinov I et al., 2019, Exploring user motor behaviour in bimanual interactive video games, ISSN: 2325-4270

Video games have proved very valuable in rehabilitation technologies. They guide therapy and keep patients engaged and motivated. However, in order to realize their full potential, a good understanding of the players' motor control is required. In particular, little is known about player behaviour in tasks demanding bimanual interaction. In this work, an experiment was designed to improve the understanding of such tasks. A driving game was developed in which players were asked to guide a differential wheeled robot (depicted as a rocket) along a trajectory. The rocket was manipulated using an Xbox controller's triggers, each supplying torque to the corresponding side of the robot. Such a task is redundant, i.e. an infinite number of input combinations yield a given outcome, which allows players to strategize according to their own preference. Ten participants were recruited to play the game and their input data were logged for subsequent analysis. Two different motor strategies were identified: an "intermittent" input pattern versus a "continuous" one. It is hypothesized that the choice of behaviour depends on motor skill and the minimization of effort and error. Further testing is necessary to determine the exact relationship between these aspects.

Conference paper
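
The game described above maps two triggers onto the two sides of a differential wheeled robot, making the task redundant. The sketch below shows the standard differential-drive kinematics behind that redundancy; the gain, wheel base and time step are illustrative assumptions, not values from the study.

```python
# Redundant bimanual mapping sketch: each trigger drives one wheel of a
# differential-drive "rocket", so many left/right combinations give the same
# forward motion. Constants are illustrative.
import math

def step_pose(x, y, theta, left_trigger, right_trigger,
              dt=0.02, gain=1.0, wheel_base=0.2):
    """Integrate unicycle kinematics for one time step (triggers in [0, 1])."""
    v_left = gain * left_trigger                 # left wheel speed (m/s)
    v_right = gain * right_trigger               # right wheel speed (m/s)
    v = 0.5 * (v_left + v_right)                 # forward speed
    omega = (v_right - v_left) / wheel_base      # turning rate
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Equal triggers drive straight; unequal triggers produce a turn.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = step_pose(*pose, left_trigger=0.8, right_trigger=0.6)
print(pose)
```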

Farkhatdinov I, Michalska H, Berthoz A, Hayward V et al., 2019, Idiothetic Verticality Estimation Through Head Stabilization Strategy, IEEE ROBOTICS AND AUTOMATION LETTERS, Vol: 4, Pages: 2677-2682, ISSN: 2377-3766

Journal article

Farkhatdinov I, Ebert J, van Oort G, Vlutters M, van Asseldonk E, Burdet E et al., 2019, Assisting human balance in standing with a robotic exoskeleton, IEEE Robotics and Automation Letters, Vol: 4, Pages: 414-421, ISSN: 2377-3766

This letter presents an experimental study on balance recovery control with a lower limb exoskeleton robot. Four participants were subjected to a perturbation during standing, a forward force impulse applied to their pelvis that forced them to step forward with the right leg for balance recovery. Trials with and without exoskeleton assistance to move the stepping leg's thigh were conducted to investigate the influence of the exoskeleton's control assistance on balancing performance and potential adaptation. Analysis of the body kinematics and muscle activation demonstrates that robotic assistance: first, was easy to use, did not require learning and did not inhibit the healthy stepping behaviour; second, modified the stepping leg trajectories by increasing hip and knee movement; third, increased reaction speed and decreased the step duration; and finally, generally increased biceps femoris and rectus femoris muscle activity.

Journal article

Farkhatdinov I, Michalska H, Berthoz A, Hayward V et al., 2019, Gravito-inertial ambiguity resolved through head stabilization, PROCEEDINGS OF THE ROYAL SOCIETY A-MATHEMATICAL PHYSICAL AND ENGINEERING SCIENCES, Vol: 475, ISSN: 1364-5021

Journal article

Omarali B, Palermo F, Valle M, Poslad S, Althoefer K, Farkhatdinov I et al., 2019, Position and Velocity Control for Telemanipulation with Interoperability Protocol, Pages: 316-324, ISSN: 0302-9743

In this paper we describe how a generic interoperability telerobotics protocol can be applied to master-slave robotic systems operating in position-position, position-speed and hybrid control modes. The interoperability protocol allows robust and efficient data exchange for teleoperation systems; however, it had not been shown how it accommodates switching between position and rate control modes. Here we propose a general framework for hybrid position and rate control modes with the interoperability protocol. Furthermore, we demonstrate experimentally that the framework is suitable for robotic teleoperation systems in which a human operator can switch between position-position and position-speed mapping of the master and slave robots' workspaces.

Conference paper
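
The framework above lets the operator switch between position-position and position-rate workspace mapping under one protocol. The sketch below illustrates one simple way such switching can be kept continuous by re-anchoring an offset; the class, gains and mode names are assumptions for illustration and do not describe the interoperability protocol itself.

```python
# Sketch of switching between position-position and position-rate mapping in a
# master-slave loop; the offset update keeps the slave command continuous at
# the switch. Names and gains are illustrative assumptions.

class HybridMapping:
    def __init__(self, scale=1.0, rate_gain=0.5, dt=0.01):
        self.scale, self.rate_gain, self.dt = scale, rate_gain, dt
        self.mode = "position"      # or "rate"
        self.offset = 0.0           # slave offset used for continuity
        self.slave = 0.0            # last slave position command

    def switch_mode(self, new_mode, master_pos):
        if new_mode == "position":
            # Re-anchor so the current slave pose maps to the current master pose.
            self.offset = self.slave - self.scale * master_pos
        self.mode = new_mode

    def update(self, master_pos):
        if self.mode == "position":
            self.slave = self.scale * master_pos + self.offset
        else:  # rate mode: master displacement commands slave velocity
            self.slave += self.rate_gain * master_pos * self.dt
        return self.slave

m = HybridMapping()
print(m.update(0.1))                    # position-position mapping
m.switch_mode("rate", master_pos=0.1)
print(m.update(0.1))                    # position-rate mapping: slave keeps moving
```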

Farkhatdinov I, Michalska H, Berthoz A, Hayward Vet al., 2019, Idiothetic Verticality Estimation through Head Stabilization Strategy, 15th Conference on Robotics - Science and Systems, Publisher: MIT PRESS

Conference paper

Danabek D, Otaran A, Althoefer K, Farkhatdinov I et al., 2019, Mobile robot trajectory analysis with the help of vision system, Pages: 273-279, ISSN: 0302-9743

We present a vision-based motion analysis method for single and multiple mobile robots which allows the robots' behaviour to be quantified. The method determines how often and by how much each robot turns and moves straight. The motion analysis relies on robot trajectories acquired online or offline by an external camera, and the algorithm is based on iteratively performed linear regression to detect straight and curved paths for each robot. The method is experimentally validated with an indoor mobile robotic system. Potential applications include remote robot inspection, rescue robotics and multi-robot system coordination.

Conference paper
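
The abstract above describes detecting straight and curved path segments by iteratively applied linear regression. The sketch below shows one plausible windowed version of that idea on a synthetic trajectory; the window size and residual threshold are assumptions, not the parameters used in the paper.

```python
# Straight-vs-curved segmentation sketch: fit a line to each window of
# trajectory points and label the segment "straight" when the residual is
# small. Window size and threshold are illustrative.
import numpy as np

def label_segments(xs, ys, window=10, residual_thresh=0.01):
    labels = []
    for i in range(0, len(xs) - window + 1, window):
        x = np.asarray(xs[i:i + window])
        y = np.asarray(ys[i:i + window])
        a, b = np.polyfit(x, y, 1)                       # fit y = a*x + b
        rms = np.sqrt(np.mean((y - (a * x + b)) ** 2))   # RMS residual
        labels.append("straight" if rms < residual_thresh else "curved")
    return labels

# Synthetic trajectory: a straight line followed by a circular arc.
t = np.linspace(0.0, 1.0, 40)
xs = np.concatenate([t[:20], np.cos(3 * t[20:])])
ys = np.concatenate([0.5 * t[:20], np.sin(3 * t[20:])])
print(label_segments(xs, ys))
```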

This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.
