EchoTouch: Low-power Face-touching Behavior Recognition Using Active Acoustic Sensing on Glasses

  • Kaiyi Guo
  • Tianyu Wu
  • Yang Gao
  • Qian Zhang
  • Dong Wang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

Accurately recognizing face-touching behavior at any time and place can help prevent potential health risks and improve personal habits. However, there remains a lack of effective methods that can be applied in real-world scenarios. In this paper, we propose EchoTouch, a low-power, unobtrusive active acoustic sensing system for monitoring face-touching behavior. EchoTouch captures features from both sides of the face by emitting and receiving orthogonal ultrasound signals through two pairs of microphones and speakers mounted along the lower frame of the glasses. A lightweight, multi-task deep learning framework then identifies the touched area and determines whether the behavior is intrusive, so that such actions can be prevented. Finally, a two-stage irrelevant-action filtering mechanism effectively handles various interferences. We evaluate EchoTouch on 20 individuals across 11 different face-touching areas. EchoTouch achieves an average accuracy of 92.9% in recognizing the touched area, and 87.2% accuracy in determining whether the behavior is intrusive. In-the-wild evaluations further validate the robustness of EchoTouch. We believe that EchoTouch can serve as an unobtrusive and reliable way to monitor and prevent intrusive face-touching behavior.
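The abstract's key sensing idea is that the two speaker/microphone pairs transmit orthogonal ultrasound signals, so each receiver can separate the two channels. The paper does not specify the waveforms; as a minimal sketch (assuming linear chirps in disjoint ultrasonic bands, with hypothetical band edges and sample rate), near-orthogonality can be illustrated by checking that the zero-lag normalized correlation between the two transmit signals is close to zero:

```python
import numpy as np

FS = 48_000          # assumed audio sample rate (Hz)
DUR = 0.01           # assumed chirp duration per sensing frame (s)
t = np.arange(int(FS * DUR)) / FS

def chirp(f0, f1):
    # linear frequency sweep from f0 to f1 over DUR seconds
    return np.sin(2 * np.pi * (f0 * t + (f1 - f0) / (2 * DUR) * t**2))

# hypothetical disjoint ultrasonic bands for the left/right transducer pairs
left = chirp(17_000, 19_000)
right = chirp(19_500, 21_500)

def ncc(a, b):
    # normalized zero-lag cross-correlation
    return abs(np.dot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b))

print(f"cross-band correlation: {ncc(left, right):.3f}")  # small
print(f"self correlation:       {ncc(left, left):.3f}")   # 1.000
```

Because the bands barely overlap, each microphone can attribute received energy to its own transmitter, which is one standard way to realize the "orthogonal ultrasound signals" the abstract describes; the actual EchoTouch waveform design may differ.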

Original language: English
Article number: 31
Journal: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Volume: 9
Issue number: 2
DOIs
State: Published - 18 Jun 2025

Keywords

  • Active acoustic sensing
  • Eye-mounted Wearable
  • Face-touching behavior recognition
  • Low-power
