Earable Computing: Mobile Computing around the Ear and Head


We believe that "earables" are the next significant milestone in wearable computing. With sensing, signal processing, and communication converging into these devices, we envision a host of new possibilities within the next 5 years. The leap from today’s earphones to "earables" would mimic the transformation from basic phones to smartphones. Today’s smartphones are hardly calling devices anymore, much as tomorrow’s earables will hardly be wireless speakers or microphones (see our vision slides here).

Ongoing Work



Earable Computing: A New Area to Think About
Romit Roy Choudhury
ACM HotMobile, Feb 2021.

This position paper argues that earphones hold the potential for major disruptions in mobile, wearable computing.


Personalizing Head Related Transfer Functions for Earables
Zhijian Yang, Romit Roy Choudhury
ACM SIGCOMM, Aug 2021.

UNIQ enables better spatial acoustics on earables by estimating personalized HRTFs using COTS mobile devices.
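Once a personalized HRTF is in hand, spatial rendering reduces to convolving the mono source with the left- and right-ear head-related impulse responses (HRIRs). A minimal sketch with synthetic, illustrative HRIRs (UNIQ's estimation pipeline is not reproduced here):

```python
import numpy as np

def spatialize(mono, hrir_left, hrir_right):
    """Render a mono signal binaurally by convolving with per-ear HRIRs.

    In practice the HRIRs would come from a personalized HRTF
    measurement (as UNIQ estimates); here they are just toy arrays.
    """
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return left, right

# Toy example: the right ear's response is delayed and attenuated,
# mimicking a source on the listener's left (values are illustrative).
mono = np.random.randn(1000)
hrir_l = np.array([1.0, 0.3])                 # near ear: strong, early
hrir_r = np.array([0.0, 0.0, 0.0, 0.5, 0.2])  # far ear: delayed, weaker
left, right = spatialize(mono, hrir_l, hrir_r)
```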


Angle-of-Arrival (AoA) Factorization in Multipath Channels
Yu-Lin Wei, Romit Roy Choudhury
IEEE ICASSP, June 2021.

This paper considers the problem of estimating K angles of arrival (AoAs) using an array of M > K microphones, with unknown and correlated source signals.
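For context, the classical baseline for this setup is a beamscan over candidate angles using the array's sample covariance; this is not the paper's factorization method, which targets correlated sources where such baselines degrade. A sketch for a uniform linear array, with all simulation parameters chosen for illustration:

```python
import numpy as np

def steering_vector(theta, m, d, wavelength):
    """Narrowband steering vector for a uniform linear array of m mics
    with spacing d, for a plane wave arriving at angle theta (radians)."""
    k = 2 * np.pi / wavelength
    return np.exp(-1j * k * d * np.arange(m) * np.sin(theta))

def beamscan_aoa(X, d, wavelength, grid):
    """Baseline delay-and-sum spectrum: scan candidate angles and
    return the beamformed power at each (peaks indicate AoAs)."""
    R = X @ X.conj().T / X.shape[1]          # sample covariance
    m = X.shape[0]
    return np.array([np.real(steering_vector(t, m, d, wavelength).conj()
                             @ R @ steering_vector(t, m, d, wavelength))
                     for t in grid])

# Simulate one source at 20 degrees on an 8-mic array, half-wavelength spacing.
rng = np.random.default_rng(0)
m, n, wl = 8, 200, 0.04
d = wl / 2
true_theta = np.deg2rad(20)
s = rng.standard_normal(n) + 1j * rng.standard_normal(n)
X = np.outer(steering_vector(true_theta, m, d, wl), s)
X += 0.05 * (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n)))
grid = np.deg2rad(np.linspace(-90, 90, 361))
est = np.rad2deg(grid[np.argmax(beamscan_aoa(X, d, wl, grid))])
```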


MUTE: Bringing IoT to Noise Cancellation
Sheng Shen, Nirupam Roy, Junfeng Guan, Haitham Hassanieh, Romit Roy Choudhury
ACM SIGCOMM'18, August 2018.

MUTE exploits the velocity gap between RF and sound to improve active noise cancellation.
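The core observation is easy to quantify: an RF-forwarded copy of the noise arrives almost instantly, while the sound itself travels at roughly 343 m/s, so the canceller gains a distance-dependent computation budget. A back-of-the-envelope sketch (the constants are standard physical values, not parameters from the paper):

```python
V_SOUND = 343.0      # m/s, speed of sound in air
V_RF = 3.0e8         # m/s, speed of light (RF propagation)

def lookahead_ms(distance_m):
    """Time by which an RF-forwarded reference of the noise arrives
    before the sound itself, for a noise source distance_m away."""
    return (distance_m / V_SOUND - distance_m / V_RF) * 1e3

# A noise source 3.4 m away yields roughly 10 ms of headroom,
# orders of magnitude more than a microphone-only ANC pipeline gets.
budget = lookahead_ms(3.4)
```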


Ear-AR: Indoor Acoustic Augmented Reality on Earphones
Zhijian Yang, Yu-Lin Wei, Sheng Shen, Romit Roy Choudhury
ACM MobiCom'20, September 2020.

Ear-AR enables indoor acoustic augmented reality on smart earphones by fusing the ear IMU, phone IMU, and acoustics for more accurate indoor localization.


EarSense: Earphones as a Teeth Activity Sensor
Jay Prakash, Zhijian Yang, Yu-Lin Wei, Haitham Hassanieh, Romit Roy Choudhury
ACM MobiCom'20, September 2020.

EarSense reuses COTS earphone speakers as microphones to sense and localize a set of gestures made by the teeth.


Voice Localization Using Nearby Wall Reflections
Sheng Shen, Daguan Chen, Yu-Lin Wei, Zhijian Yang, Romit Roy Choudhury
ACM MobiCom'20, September 2020.

VoLoc shows the feasibility of inferring indoor user location from acoustic signals, for smart voice assistants like Amazon Alexa and Google Home.
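The geometric intuition can be sketched in 2-D: a wall echo behaves as if emitted by an image source mirrored across the wall, so the direct-path AoA and the echo AoA together triangulate the speaker. A toy sketch assuming the microphone sits at the origin and the wall position is known (VoLoc's full pipeline also estimates the wall geometry and copes with reverberation):

```python
import math

def locate(theta_direct, theta_echo, wall_x):
    """Triangulate a voice source from the direct-path AoA and the AoA
    of its reflection off a wall at x = wall_x (mic array at origin).
    The echo appears to come from the image source mirrored across the
    wall; angles are measured from the x-axis, in radians."""
    t1, t2 = math.tan(theta_direct), math.tan(theta_echo)
    x = 2 * wall_x * t2 / (t1 + t2)   # from y = x*t1 = (2*wall_x - x)*t2
    y = x * t1
    return x, y

# Source at (1.0, 2.0), wall at x = 3.0  ->  image source at (5.0, 2.0)
th_direct = math.atan2(2.0, 1.0)
th_echo = math.atan2(2.0, 5.0)
x, y = locate(th_direct, th_echo, 3.0)
```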


STEAR: Robust Step Counting from Earables
Jay Prakash, Zhijian Yang, Yu-Lin Wei, Romit Roy Choudhury
ACM Earcomp (Workshop with Ubicomp'19), September 2019.

STEAR shows that the smart-earphone IMU is a much better sensor for step counting than phone or watch IMUs, owing to the much cleaner IMU signals at the ear.
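A simple illustration of why cleaner signals matter: with low noise, even threshold-plus-peak detection on the accelerometer magnitude counts steps reliably. A baseline sketch (not STEAR's exact algorithm; the threshold and refractory gap are assumed values):

```python
import numpy as np

def count_steps(accel, fs, thresh=1.2, min_gap_s=0.3):
    """Count steps from 3-axis accelerometer samples (n x 3, in g).
    Steps appear as peaks in the acceleration magnitude; a minimum
    gap between peaks suppresses double counting within one step."""
    mag = np.linalg.norm(accel, axis=1)
    min_gap = int(min_gap_s * fs)
    steps, last = 0, -min_gap
    for i in range(1, len(mag) - 1):
        if (mag[i] > thresh and mag[i] >= mag[i - 1]
                and mag[i] > mag[i + 1] and i - last >= min_gap):
            steps += 1
            last = i
    return steps

# Synthetic walk: 2 Hz steps on top of gravity, 5 s at 50 Hz sampling.
fs = 50
t = np.arange(0, 5.0, 1 / fs)
a_z = 1.0 + 0.5 * np.maximum(0, np.sin(2 * np.pi * 2.0 * t))  # 10 bumps
accel = np.stack([np.zeros_like(t), np.zeros_like(t), a_z], axis=1)
n_steps = count_steps(accel, fs)
```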

Dataset and Code

Coming soon...


Zhijian Yang
PhD Student
Dept. of CS

Yu-Lin (Wally) Wei
PhD Student
Dept. of ECE

Ziyue (Liz) Li
MS/PhD Student
Dept. of ECE


Jay Prakash
Visiting Scholar
SUTD, Singapore

Sheng Shen
Alumni, ECE PhD 2019
Facebook Reality Labs

Romit Roy Choudhury
Dept. of ECE & CS

We are looking for PhD students with background in sensing, (acoustic) signal processing, communications, embedded systems, and machine learning.