TY - JOUR
T1 - Human-centred design and fabrication of a wearable multimodal visual assistance system
AU - Tang, Jian
AU - Zhu, Yi
AU - Jiang, Gai
AU - Xiao, Lin
AU - Ren, Wei
AU - Zhou, Yu
AU - Gu, Qinying
AU - Yan, Biao
AU - Zhang, Jiayi
AU - Bi, Hengchang
AU - Wu, Xing
AU - Fan, Zhiyong
AU - Gu, Leilei
N1 - Publisher Copyright:
© The Author(s), under exclusive licence to Springer Nature Limited 2025.
PY - 2025/4
Y1 - 2025/4
N2 - Artificial intelligence-powered wearable electronic systems offer promising solutions for non-invasive visual assistance. However, state-of-the-art systems have not sufficiently considered human adaptation, resulting in a low adoption rate among blind people. Here we present a human-centred, multimodal wearable system that advances usability by blending software and hardware innovations. For software, we customize the artificial intelligence algorithm to match the requirements of the application scenario and human behaviours. For hardware, we improve wearability by developing stretchable sensory-motor artificial skins that complement the audio feedback and visual tasks. Self-powered triboelectric smart insoles align real users with virtual avatars, supporting effective training in carefully designed scenarios. The harmonious cooperation of the visual, audio and haptic senses enables significant improvements in navigation and post-navigation tasks, as experimentally evidenced by humanoid robots and participants with visual impairment in both virtual and real environments. Post-experiment surveys highlight the system’s reliable functionality and high usability. This research paves the way for user-friendly visual assistance systems, offering alternative avenues to enhance the quality of life for people with visual impairment.
AB - Artificial intelligence-powered wearable electronic systems offer promising solutions for non-invasive visual assistance. However, state-of-the-art systems have not sufficiently considered human adaptation, resulting in a low adoption rate among blind people. Here we present a human-centred, multimodal wearable system that advances usability by blending software and hardware innovations. For software, we customize the artificial intelligence algorithm to match the requirements of the application scenario and human behaviours. For hardware, we improve wearability by developing stretchable sensory-motor artificial skins that complement the audio feedback and visual tasks. Self-powered triboelectric smart insoles align real users with virtual avatars, supporting effective training in carefully designed scenarios. The harmonious cooperation of the visual, audio and haptic senses enables significant improvements in navigation and post-navigation tasks, as experimentally evidenced by humanoid robots and participants with visual impairment in both virtual and real environments. Post-experiment surveys highlight the system’s reliable functionality and high usability. This research paves the way for user-friendly visual assistance systems, offering alternative avenues to enhance the quality of life for people with visual impairment.
UR - https://www.scopus.com/pages/publications/105002463629
U2 - 10.1038/s42256-025-01018-6
DO - 10.1038/s42256-025-01018-6
M3 - Article
AN - SCOPUS:105002463629
SN - 2522-5839
VL - 7
SP - 627
EP - 638
JO - Nature Machine Intelligence
JF - Nature Machine Intelligence
IS - 4
ER -