TY - JOUR
T1 - EchoTouch
T2 - Low-power Face-touching Behavior Recognition Using Active Acoustic Sensing on Glasses
AU - Guo, Kaiyi
AU - Wu, Tianyu
AU - Gao, Yang
AU - Zhang, Qian
AU - Wang, Dong
N1 - Publisher Copyright:
© 2025 ACM.
PY - 2025/6/18
Y1 - 2025/6/18
AB - Accurately recognizing face-touching behavior at any time and place can help prevent potential health risks and improve personal habits. However, there remains a lack of effective methods that can be applied in real-world scenarios. In this paper, we propose EchoTouch, a low-power, unobtrusive active acoustic sensing system for monitoring face-touching behavior. EchoTouch captures features from both sides of the face by emitting and receiving orthogonal ultrasound signals through two pairs of microphones and speakers mounted along the lower frame of a pair of glasses. A lightweight, multi-task deep learning framework then identifies the touched area and determines whether the behavior is intrusive so that such actions can be prevented. Finally, a two-stage irrelevant-action filtering mechanism effectively handles various interferences. We evaluate EchoTouch on 20 individuals across 11 different face-touching areas. EchoTouch achieves an average accuracy of 92.9%, with 87.2% accuracy in determining whether a touch is intrusive. Additionally, in-the-wild evaluations further validate the robustness of EchoTouch. We believe EchoTouch can serve as an unobtrusive and reliable way to monitor and prevent intrusive face-touching behavior.
KW - Active acoustic sensing
KW - Eye-mounted Wearable
KW - Face-touching behavior recognition
KW - Low-power
UR - https://www.scopus.com/pages/publications/105008552349
U2 - 10.1145/3729481
DO - 10.1145/3729481
M3 - Article
AN - SCOPUS:105008552349
SN - 2474-9567
VL - 9
JO - Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
JF - Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
IS - 2
M1 - 31
ER -