EHTrack: Earphone-Based Head Tracking via Only Acoustic Signals

Linfei Ge, Qian Zhang*, Jin Zhang*, Huangxun Chen

*Corresponding author for this work

Research output: Contribution to journal › Journal Article › peer-review

3 Citations (Scopus)

Abstract

Head tracking is a technique that allows for the measurement and analysis of human focus and attention, thus enhancing the experience of human-computer interaction (HCI). Nevertheless, current solutions relying on vision and motion sensors exhibit limitations in accuracy, user-friendliness, and compatibility with the majority of commercial off-the-shelf (COTS) devices. To overcome these limitations, we present EHTrack, an earphone-based system that achieves head tracking exclusively through acoustic signals. EHTrack employs acoustic sensing to measure the movement of a pair of earphones, thereby enabling precise head tracking. In particular, a pair of speakers generates a periodically fluctuating sound field, which the user's two earphones detect. By measuring changes in distance and angle between the earphones and the speakers, we propose a model that determines the user's head movement and orientation. Our evaluation results indicate a high degree of accuracy in both head movement tracking, with an average tracking error of 2.98 cm, and head orientation tracking, with an average error of 1.83°. Furthermore, in a deployed exhibition scenario, we attained an accuracy of 89.2% in estimating the user's focus direction.
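The abstract does not detail the geometric model, so the sketch below is only an illustration of how head pose could, in principle, be recovered from per-earphone distance estimates to two speakers at known positions: 2-D trilateration of each earphone, then a midpoint for head position and the perpendicular to the inter-ear axis for orientation. The function names (`trilaterate_2d`, `head_pose`), the speaker coordinates, and the example distances are assumptions made for illustration, not the paper's actual algorithm.

```python
import numpy as np

def trilaterate_2d(p1, p2, d1, d2):
    """Locate a point in 2-D from its distances d1, d2 to two anchors p1, p2
    (circle intersection); of the two mirror solutions, the one on the
    positive side of the anchor axis is returned (listener assumed to be
    in front of the speakers)."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d = np.linalg.norm(p2 - p1)                 # speaker separation
    ex = (p2 - p1) / d                          # unit vector along the speaker axis
    ey = np.array([-ex[1], ex[0]])              # unit vector perpendicular to it
    x = (d1**2 - d2**2 + d**2) / (2 * d)        # coordinate along the speaker axis
    y = np.sqrt(max(d1**2 - x**2, 0.0))         # clamp small negatives from noise
    return p1 + x * ex + y * ey

def head_pose(dl1, dl2, dr1, dr2, spk1=(0.0, 0.0), spk2=(0.5, 0.0)):
    """Toy head-pose estimate from earphone-to-speaker distances:
    dl1/dl2 are the left earphone's distances to speakers 1 and 2,
    dr1/dr2 the right earphone's.  Returns (head_center, yaw_degrees)."""
    left = trilaterate_2d(spk1, spk2, dl1, dl2)
    right = trilaterate_2d(spk1, spk2, dr1, dr2)
    center = (left + right) / 2.0               # head position = midpoint of the ears
    ear_axis = right - left                     # vector from left ear to right ear
    facing = np.array([-ear_axis[1], ear_axis[0]])   # 90° CCW rotation = facing direction
    yaw = np.degrees(np.arctan2(facing[1], facing[0]))
    return center, yaw

if __name__ == "__main__":
    # Example: head roughly 1 m in front of and centred between the speakers,
    # facing them (yaw of about -90° in this coordinate frame).
    center, yaw = head_pose(1.053, 1.014, 1.014, 1.053)
    print(f"head center ~ {center}, yaw ~ {yaw:.1f} deg")
```

In practice the per-earphone distances themselves would come from acoustic ranging (e.g., phase or time-of-flight changes of the speaker signals), which is where the periodically fluctuating sound field described above comes in.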

Original language: English
Pages (from-to): 4063-4075
Number of pages: 13
Journal: IEEE Internet of Things Journal
Volume: 11
Issue number: 3
DOIs
Publication status: Published - 1 Feb 2024

Bibliographical note

Publisher Copyright:
© 2014 IEEE.

Keywords

  • Acoustic signal processing
  • human computer interaction
  • signal processing
  • systems
  • user interfaces

