SonicASL: An Acoustic-based Sign Language Gesture Recognizer Using Earphones

Yincheng Jin, Yang Gao, Yanjun Zhu, Wei Wang, Jiyang Li, Seokmin Choi, Zhangyu Li, Jagmohan Chauhan, Anind K. Dey, Zhanpeng Jin*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

49 Scopus citations

Abstract

We propose SonicASL, a real-time gesture recognition system that can recognize sign language gestures on the fly, leveraging front-facing microphones and speakers added to commodity earphones worn by someone facing the person making the gestures. In a user study (N=8), we evaluate the recognition performance of various sign language gestures at both the word and sentence levels. Given 42 frequently used individual words and 30 meaningful sentences, SonicASL can achieve an accuracy of 93.8% and 90.6% for word-level and sentence-level recognition, respectively. The proposed system is tested in two real-world scenarios: indoor (apartment, office, and corridor) and outdoor (sidewalk) environments with pedestrians walking nearby. The results show that our system can provide users with an effective gesture recognition tool with high reliability against environmental factors such as ambient noise and nearby pedestrians.

Original language: English
Article number: 3463519
Journal: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Volume: 5
Issue number: 2
DOIs
State: Published - Jun 2021
Externally published: Yes

Keywords

  • Acoustic sensing
  • earphones
  • sign language gesture recognition
