Themes of Work

Our research centres on the body and how technology can improve the way the body exists in and interacts with its surrounding environment. We focus on haptic and aural modalities, using textiles as the physical medium for building wearable computational systems. Some of our research projects focus exclusively on textile sensing and interfaces, while others focus solely on how auditory displays can be improved for users. A growing area of our work looks towards how these two complementary technologies can be brought together in novel applications.

Below is a non-exhaustive list of some of the research we have undertaken.


  • Conference paper
Zhou B, Liu M, Bian S, Geißler D, Lukowicz P, Miranda J, Dan J, Atienza D, Riahi MA, Wehn N, Torah R, Yong S, Liu J, Beeby S, Kohler M, Greinke B, Yu J, Nierstrasz V, Sheldrick L, Stewart R, Nieri T, Maccanti M, Spinelli D et al., 2025,

    Multi-partner project: Sustainable Textile Electronics (STELEC)

    , DATE 2025, Publisher: IEEE, Pages: 1-5

    E-textiles are rapidly emerging as an important area of electronic circuit applications. They also facilitate many socially important applications such as personalized health, elderly care, and smart agriculture. However, the environmental impact and sustainability of e-textiles remain very problematic. STELEC, short for Sustainable Textile ELECtronics, is an interdisciplinary research project funded by the European Innovation Council (EIC) under the Pathfinder programme on the responsible electronics topic seeking cutting-edge innovation. STELEC started in September 2024 and is in its initial stage. The project is a multinational collaboration of research institutes, universities and companies across Europe. It aims to develop next-generation textile-based electronics in applications spanning sensing, processing and AI, with a commitment to full lifecycle sustainability.

  • Journal article
    Zhou Y, Sun Y, Li Y, Shen C, Lou Z, Min X, Stewart R et al., 2024,

    A highly durable and UV‐resistant graphene‐based knitted textile sensing sleeve for human joint angle monitoring and gesture differentiation

    , Advanced Intelligent Systems, Vol: 6, ISSN: 2640-4567

    Flexible strain sensors based on textiles have attracted extensive attention owing to their light weight, flexibility, and wearing comfort. However, challenges in integrating textile strain sensors into wearable sensing devices include the need for outstanding sensing performance, long-term monitoring stability, and fast, convenient integration processes to achieve comprehensive monitoring. The scalable fabrication technique presented here addresses these challenges by incorporating customizable graphene-based sensing networks into knitted structures, thus creating sensing sleeves for precise motion detection and differentiation. The performance and real-world application potential of the sensing sleeve are evaluated by its precision in angle estimation and complex joint motion recognition during intra- and inter-subject studies. For intra-subject analysis, the sensing sleeve exhibits only a 2.34° angle error in five different knee activities among 20 participants, and the sensing sleeves show up to 94.1% and 96.1% accuracy in gesture classification of the knee and elbow, respectively. For inter-subject analysis, the sensing sleeve demonstrates a 4.21° angle error, and it shows up to 79.9% and 85.5% accuracy in gesture classification of the knee and elbow, respectively. An activity-guided user interface compatible with the sensing sleeves for human motion monitoring in home healthcare applications is presented to illustrate the potential applications.

  • Journal article
    Lou Z, Min X, Li G, Avery J, Stewart R et al., 2024,

    Advancing sensing resolution of impedance hand gesture recognition devices

    , IEEE Journal of Biomedical and Health Informatics, Vol: 28, Pages: 5855-5864, ISSN: 2168-2194

    Gestures are composed of motion information (e.g. movements of fingers) and force information (e.g. the force exerted on fingers when interacting with other objects). Current hand gesture recognition solutions such as cameras and strain sensors primarily focus on correlating hand gestures with motion information, and force information is seldom addressed. Here we propose a bio-impedance wearable that can recognize hand gestures utilizing both motion information and force information. Compared with previous impedance-based gesture recognition devices that can only recognize a few multi-degrees-of-freedom gestures, the proposed device can recognize 6 single-degree-of-freedom gestures and 20 multiple-degrees-of-freedom gestures, including 8 gestures at 2 force levels. The device uses textile electrodes, is benchmarked over a selected frequency spectrum, and uses a new drive pattern. Experimental results show that 179 kHz achieves the highest signal-to-noise ratio (SNR) and reveals the most distinct features. By analyzing the 49,920 samples from 6 participants, the device is demonstrated to have an average recognition accuracy of 98.96%. As a comparison, medical electrodes achieved an accuracy of 98.05%.

  • Conference paper
    Wang M, Zhou Y, Stewart R, 2024,

    Soft wearable robotics: innovative knitting-integrated approaches for pneumatic actuators design

    , DIS '24: Designing Interactive Systems Conference, Publisher: ACM, Pages: 234-238

    Soft wearable robotics presents an opportunity to bridge robotics and textiles, offering lightweight, flexible, and ergonomic solutions for human-robot interaction, but previous studies on wearable soft robotics primarily focus on actuator performance without also considering wearability and interactivity. A rudimentary attachment method is usually adopted, using external fixation devices such as straps to attach actuators to the user’s body, resulting in a poor wearing experience. This study focuses on compatible and compact textile architectures that allow actuators to be seamlessly integrated into daily wear. It presents a research-through-design method to propose innovative knitting-integrated approaches for pneumatic actuator design, providing soft wearable robots with both aesthetic and functional value. Through a series of tests in which various knitting techniques and parameters are used to create sleeves that house silicone actuators, it explores design possibilities and examines the complex relationships between textiles and actuators. The findings contribute to advancing soft wearable robotics by offering practical solutions for integrating pneumatic actuators seamlessly into wearable textiles, thereby unlocking new possibilities for human-centered robotic systems.

  • Conference paper
    Li Y, Zhou Y, Shen C, Stewart R et al., 2024,

    E-textile sleeve with graphene strain sensors for arm gesture classification of mid-air interactions

    , TEI '24: Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction, Publisher: ACM, Pages: 1-10

    Arm gestures play a pivotal role in facilitating natural mid-air interactions. While computer vision techniques aim to detect these gestures, they encounter obstacles like obfuscation and lighting conditions. Alternatively, wearable devices have leveraged interactive textiles to recognize arm gestures. However, these methods predominantly emphasize textile deformation-based interactions, like twisting or grasping the sleeve, rather than tracking natural body movement. This study bridges this gap by introducing an e-textile sleeve system that integrates multiple ultra-sensitive graphene e-textile strain sensors, in an arrangement that captures bending and twisting, along with an inertial measurement unit into a sports sleeve. This paper documents a comprehensive overview of the sensor design, fabrication process, seamless interconnection method, and detachable hardware implementation that allows for reconfiguring the processing unit to other body parts. A user study with ten participants demonstrated that the system could classify six different fundamental arm gestures with over 90% accuracy.

  • Conference paper
    Dave RJ, Min X, Lou Z, Stewart R et al., 2024,

    Investigating construction and integration techniques of dry silver-based textile electrodes on electromyography of biceps brachii muscle

    , 5th International Conference on the Challenges, Opportunities, Innovations and Applications in Electronic Textiles, Publisher: MDPI, ISSN: 2673-4591

    This research paper recommends an electrode construction and integration technique for dry silver-based textile electrodes capturing electromyographic (EMG) signals. Three integration methods with two different conductive textiles were compared using two analysis methods; analysis was also conducted before and after six washing cycles. Six wearable arm bands with each of the design parameter combinations were worn on the biceps brachii muscle to capture EMG signals from three users under a controlled task both before any washing of the bands occurred and after four washing cycles were completed. Additionally, impedance measurements over six frequency bands were recorded after each washing cycle. Textile electrodes made of Shieldex Techniktex P180B using an extended electrode integration method were found to perform best.

  • Conference paper
    Zhang M, Stewart R, Bryan-Kinns N, 2024,

    Empowering textile and fashion designers with e-textiles for creative expression

    , 5th International Conference on the Challenges, Opportunities, Innovations and Applications in Electronic Textiles, Publisher: MDPI, ISSN: 2673-4591

    In the field of textile and fashion design, there is a growing desire to integrate interactive technologies into creative work. Traditional design education typically lacks support for material-oriented designers to develop electronic skills alongside their expertise in materials. There is a need to develop proper support for these designers to enter the world of electronic textiles (e-textiles). Our previous work introduced a material-centred e-textile learning approach through the development of a toolkit. This paper offers a glimpse into a design project made by our students, where digital functionality intertwines with physical design. It serves as a testament to the effectiveness of our approach in merging interactive technology concepts with material expertise, thereby aiding these designers in their creative endeavours.

  • Journal article
    Aziz N, Stockman T, Stewart R, 2022,

    Planning your journey in audio: design and evaluation of auditory route overviews

    , ACM Transactions on Accessible Computing, Vol: 15, Pages: 1-48, ISSN: 1936-7228

    Auditory overviews of routes can provide routing and map information to blind users, enabling them to preview route maps before embarking on a journey. This paper investigates the usefulness of a system designed to do this through a preliminary survey, followed by a design study to gather the design requirements, development of a prototype, and evaluation through a usability study. The design was developed in two stages with 8 audio designers and 8 potential blind users. The auditory route overview is sequential and automatically generated as integrated audio. It comprises auditory icons to represent points of interest, earcons acting as auditory brackets encapsulating repeating points of interest, and speech for directions. A prototype based on this design was developed and evaluated with 22 sighted and 8 blind participants. The software architecture of the prototype, including route information retrieval and its mapping onto audio, is also described. The findings show that both groups perform well in route reconstruction and recognition tasks. Moreover, the functional route information and auditory icons are effectively designed and useful in forming a mental model of the route, which improves over time. However, the design of auditory brackets needs further improvement and testing. At all stages of the system development, input was acquired from the end-user population and the design adapted accordingly.

  • Journal article
    Zhou Y, Stewart R, 2022,

    Highly flexible, durable, UV resistant, and electrically conductive graphene based TPU/textile composite sensor

    , Polymers for Advanced Technologies, Vol: 33, Pages: 4250-4264, ISSN: 1042-7147

    Flexible strain sensors have attracted considerable attention due to their applications in wearable monitoring fields such as human-computer interaction systems, athletic training, and health systems. Textiles are a desirable substrate for fabricating wearable flexible sensors due to their light weight, comfort, and flexibility. However, the compatibility between textiles and conductive materials still faces critical challenges, especially for wearable sensors that must achieve high sensitivity and a wide sensing range simultaneously with long-term monitoring stability, reliability, and wearing comfort. In this study, we propose a graphene-based TPU/textile composite sensor that can be produced using small-scale manufacturing techniques, combining laser cutting with film coating and thermal transfer processes, and further explore its mechanical, electrical, and sensing properties. Since the human body exhibits different magnitudes of motion, and fabric sensors integrated into clothing face multiple challenges in real-world usage, e.g. repetitive wear, sweat and sunlight exposure, we performed sensitivity, reliability and durability tests to further evaluate real-world usage of the fabric sensors. The developed composite sensor exhibits a high sensitivity (GF = 498) and wide sensing range (0%–293%), with excellent reliability and stability, showing only 5% deviation after 10,000 cycles of stretching under 5% strain. In addition, the graphene-based textile composite sensor thermally laminated with TPU film can also maintain high stability after long-term UV irradiation and multiple washing cycles. When integrated into various wearable devices, our composite sensor can detect a wide range of human body motions accurately, as well as subtle physiological signals, exhibiting great potential for incorporation into wearable monitoring devices.

  • Journal article
    Mao A, Giraudet CSE, Liu K, De Almeida Nolasco I, Xie Z, Xie Z, Gao Y, Theobald J, Bhatta D, Stewart R, McElligott AG et al., 2022,

    Automated identification of chicken distress vocalizations using deep learning models

    , Journal of the Royal Society Interface, Vol: 19, Pages: 1-11, ISSN: 1742-5662

    The annual global production of chickens exceeds 25 billion birds, which are often housed in very large groups, numbering thousands. Distress calling triggered by various sources of stress has been suggested as an 'iceberg indicator' of chicken welfare. However, to date, the identification of distress calls largely relies on manual annotation, which is very labour-intensive and time-consuming. Thus, a novel convolutional neural network-based model, light-VGG11, was developed to automatically identify chicken distress calls using recordings (3363 distress calls and 1973 natural barn sounds) collected on an intensive farm. The light-VGG11 was modified from VGG11 with significantly fewer parameters (9.3 million versus 128 million) and 55.88% faster detection speed while displaying comparable performance, i.e. precision (94.58%), recall (94.89%), F1-score (94.73%) and accuracy (95.07%), making it more useful for model deployment in practice. To further improve light-VGG11's performance, we investigated the impacts of different data augmentation techniques (i.e. time masking, frequency masking, mixed spectrograms of the same class and Gaussian noise) and found that they could improve distress call detection by up to 1.52%. Our distress call detection demonstration on continuous audio recordings shows the potential for developing technologies to monitor the output of this call type in large, commercial chicken flocks.
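    The time-masking and frequency-masking augmentations named in the abstract above can be sketched in a few lines of NumPy. This is a minimal illustration of the general technique, not the paper's own code; the function names and the toy spectrogram are ours.

    ```python
    import numpy as np

    def time_mask(spec, max_width, rng):
        """Zero out a random band of consecutive time frames (columns)."""
        spec = spec.copy()
        n_frames = spec.shape[1]
        width = rng.integers(1, max_width + 1)
        start = rng.integers(0, n_frames - width + 1)
        spec[:, start:start + width] = 0.0
        return spec

    def freq_mask(spec, max_width, rng):
        """Zero out a random band of consecutive frequency bins (rows)."""
        spec = spec.copy()
        n_bins = spec.shape[0]
        width = rng.integers(1, max_width + 1)
        start = rng.integers(0, n_bins - width + 1)
        spec[start:start + width, :] = 0.0
        return spec

    rng = np.random.default_rng(0)
    spec = np.ones((64, 128))  # toy (frequency x time) spectrogram
    augmented = time_mask(freq_mask(spec, 8, rng), 8, rng)
    print(augmented.shape)     # shape is unchanged; two bands are zeroed
    ```

    Because masking only zeroes bands rather than shifting or resampling the spectrogram, the augmented example keeps the original shape and label, which is what makes it cheap to apply during training.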

This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.
