Human-centred design and fabrication of a wearable multimodal visual assistance system

Jian Tang, Yi Zhu, Gai Jiang, Lin Xiao, Wei Ren, Yu Zhou, Qinying Gu, Biao Yan, Jiayi Zhang, Hengchang Bi, Xing Wu, Zhiyong Fan, Leilei Gu

Research output: Contribution to journal › Article › peer-review

5 Scopus citations

Abstract

Artificial intelligence-powered wearable electronic systems offer promising solutions for non-invasive visual assistance. However, state-of-the-art systems have not sufficiently considered human adaptation, resulting in a low adoption rate among blind people. Here we present a human-centred, multimodal wearable system that advances usability by blending software and hardware innovations. For software, we customize the artificial intelligence algorithm to match the requirements of the application scenario and human behaviours. For hardware, we improve wearability by developing stretchable sensory-motor artificial skins to complement the audio feedback and visual tasks. Self-powered triboelectric smart insoles align real users with virtual avatars, supporting effective training in carefully designed scenarios. The harmonious cooperation of visual, audio and haptic senses enables significant improvements in navigation and post-navigation tasks, as experimentally evidenced by humanoid robots and participants with visual impairment in both virtual and real environments. Post-experiment surveys highlight the system’s reliable functionality and high usability. This research paves the way for user-friendly visual assistance systems, offering alternative avenues to enhance the quality of life for people with visual impairment.

Original language: English
Article number: eaat2516
Pages (from-to): 627-638
Number of pages: 12
Journal: Nature Machine Intelligence
Volume: 7
Issue number: 4
DOIs
State: Published - Apr 2025
