
Title: ActSonic: Everyday Activity Recognition on Smart Glasses using Active Acoustic Sensing

Abstract: In this paper, we introduce ActSonic, an intelligent, low-power active acoustic sensing system integrated into eyeglasses. ActSonic recognizes 27 everyday activities (e.g., eating, drinking, toothbrushing) using only a miniature speaker and microphone mounted on each hinge of the eyeglasses. The speakers emit ultrasonic waves that create an acoustic aura around the body; as body parts move, they reflect these signals in unique patterns that the microphones capture. A customized self-supervised deep learning framework then analyzes the reflections to infer the activity being performed. We deployed ActSonic in a user study with 19 participants across 19 households to evaluate its efficacy. Without requiring any training data from a new user (leave-one-participant-out evaluation), ActSonic detected the 27 activities at an inference resolution of 1 second, achieving an average F1-score of 86.6% in an unconstrained setting and 93.4% in a prompted setting.
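The evaluation protocol named in the abstract, leave-one-participant-out cross-validation scored with an averaged F1, can be sketched in a few lines. This is an illustrative sketch only, not the authors' code: the helper names (`macro_f1`, `leave_one_participant_out`) and the toy data layout are assumptions, and the real ActSonic pipeline uses a deep learning model rather than the placeholder classifier shown in the usage note.

```python
# Illustrative sketch (NOT the ActSonic implementation): leave-one-participant-out
# evaluation scored with a macro-averaged F1, the general scheme behind the
# cross-user numbers reported in the abstract. Helper names are hypothetical.
from collections import defaultdict


def macro_f1(y_true, y_pred):
    """Per-class F1, averaged over every class present in y_true."""
    scores = []
    for c in sorted(set(y_true)):
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        denom = precision + recall
        scores.append(2 * precision * recall / denom if denom else 0.0)
    return sum(scores) / len(scores)


def leave_one_participant_out(samples, train_and_predict):
    """samples: list of (participant_id, features, label) tuples.
    train_and_predict: callable(train_pairs, test_features) -> predicted labels.
    Returns {participant_id: macro F1 on that held-out participant}."""
    by_participant = defaultdict(list)
    for pid, x, y in samples:
        by_participant[pid].append((x, y))
    fold_scores = {}
    for held_out in by_participant:
        # Train on every other participant; test on the held-out one.
        train = [s for pid, xs in by_participant.items()
                 if pid != held_out for s in xs]
        test = by_participant[held_out]
        preds = train_and_predict(train, [x for x, _ in test])
        fold_scores[held_out] = macro_f1([y for _, y in test], preds)
    return fold_scores
```

In practice, `train_and_predict` would wrap training and inference of the activity-recognition model; averaging `fold_scores.values()` yields a single cross-user F1 of the kind quoted in the abstract.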
Comments: 27 pages, 11 figures
Subjects: Human-Computer Interaction (cs.HC); Emerging Technologies (cs.ET)
Cite as: arXiv:2404.13924 [cs.HC]
  (or arXiv:2404.13924v1 [cs.HC] for this version)

Submission history

From: Saif Mahmud [view email]
[v1] Mon, 22 Apr 2024 07:01:19 GMT (37145kb,D)
