We believe "earables" is the next significant milestone in wearable computing. With sensing, processing, and communication converging into these devices, we envision a host of new possibilities within the next 5 years. The leap from today’s earphones to "earables" would mimic the transformation from basic phones to smartphones. Today’s smartphone is hardly just a calling device; similarly, tomorrow’s earable will be far more than a wireless speaker or microphone. See our vision slides.


We are currently working on


Invited Seminars and Talks

Invited seminar at 2020 NUS Computer Science Week, Singapore
Workshop talk on Earable Computing, ACM HotMobile, 2021
Conference talk on AoA Factorization, ICASSP 2021
Conference talk on Ear-AR, ACM MobiCom 2020
Conference talk on EarSense, ACM MobiCom 2020
Conference talk on UNIQ, ACM SIGCOMM 2020
Conference talk on VoLoc, ACM MobiCom 2020
Conference talk on MUTE, ACM SIGCOMM 2018
Invited seminar at Charles Babbage Seminar (Cambridge Univ.)
Keynote at EarComp workshop 2019
Keynote at MobiUK workshop (Oxford Univ.)


Earable Computing: A New Area to Think About

This position paper argues that earphones hold the potential to disrupt mobile, wearable computing.
ACM HotMobile, 2021.

Estimating multiple Angles of Arrival in a Steering Vector Space

This paper estimates the AoA of multiple uncorrelated and correlated signals (echoes) by analyzing them in a steering-vector sub-space.
arXiv, Sep 2021.

Personalizing Head Related Transfer Functions for Earables

UNIQ enables better spatial acoustics on earables by estimating personalized HRTFs using off-the-shelf mobile devices.
ACM SIGCOMM, 2020.

Angle-of-Arrival (AoA) Factorization in Multipath Channels

This paper aims to estimate K angles of arrival (AoA) using an array of M > K microphones, for unknown and correlated source signals.
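For context, the classical baseline for this problem is subspace AoA estimation (e.g., MUSIC), which works for uncorrelated sources but degrades when echoes are correlated; the paper's factorization targets exactly that harder case. Below is a minimal, hypothetical sketch of the uncorrelated-source baseline on a simulated uniform linear array. All parameters (mic count, spacing, snapshot count) are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative MUSIC-style baseline: M-mic uniform linear array, K sources.
M, K = 8, 2                   # assumed array size and source count (M > K)
d = 0.5                       # mic spacing in wavelengths (assumed)
true_aoas = np.deg2rad([20.0, -40.0])
rng = np.random.default_rng(0)

def steering(theta):
    # Steering vector of a ULA: linear phase progression across the M mics.
    return np.exp(-2j * np.pi * d * np.arange(M) * np.sin(theta))

# Simulate N snapshots of two *uncorrelated* sources plus noise.
N = 500
A = np.stack([steering(t) for t in true_aoas], axis=1)       # M x K
S = rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))
X = A @ S + 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))

# Noise subspace = eigenvectors of the sample covariance beyond the K largest.
R = X @ X.conj().T / N
eigvals, eigvecs = np.linalg.eigh(R)      # eigenvalues in ascending order
En = eigvecs[:, :M - K]                   # noise-subspace basis

# Pseudo-spectrum peaks where steering vectors are orthogonal to En.
grid = np.deg2rad(np.linspace(-90, 90, 1801))
spec = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(t))**2
                 for t in grid])
peaks = [i for i in range(1, spec.size - 1)
         if spec[i] > spec[i - 1] and spec[i] > spec[i + 1]]
top = sorted(peaks, key=lambda i: spec[i])[-K:]
est = sorted(float(np.rad2deg(grid[i])) for i in top)
print(est)    # close to the true AoAs of -40 and +20 degrees
```

When the echoes are correlated copies of one source, the covariance becomes rank-deficient and this baseline breaks down, which motivates the factorization approach described in the paper.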

Ear-AR: Indoor Acoustic Augmented Reality on Earphones

Ear-AR enables indoor localization and acoustic AR via sensor fusion between ear IMU, phone IMU, and acoustics.

EarSense: Earphones as a Teeth Activity Sensor

EarSense uses today’s earphone speaker and microphone to sense and localize teeth gestures, with applications in health monitoring and HCI.
ACM MobiCom, 2020.

Voice Localization Using Nearby Wall Reflections

VoLoc shows the feasibility of inferring user location from voice commands, useful for voice assistants like Amazon Alexa and Google Home.
ACM MobiCom, 2020.

STEAR: Robust Step Counting from Earables

STEAR discusses why earphone IMU serves as a much better sensor than phone or watch IMU, for motion tracking applications.
ACM EarComp (workshop with UbiComp), 2019.
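A common starting point for IMU-based step counting is peak-picking the smoothed accelerometer magnitude. The sketch below illustrates this generic baseline on synthetic walking data; the sample rate, cadence, and thresholds are assumptions for illustration, not STEAR's actual pipeline.

```python
import numpy as np

# Generic step-counting baseline: smooth the accelerometer magnitude,
# then count well-separated local maxima. All parameters are assumed.
fs = 50.0                       # IMU sample rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)    # 10 s of simulated walking
step_rate = 1.8                 # assumed cadence: 1.8 steps per second
rng = np.random.default_rng(1)
accel = 9.8 + np.sin(2 * np.pi * step_rate * t) + 0.2 * rng.standard_normal(t.size)

# Remove gravity offset and apply a short moving-average low-pass filter.
w = 5
smooth = np.convolve(accel - accel.mean(), np.ones(w) / w, mode="same")

# Count local maxima above a threshold, spaced at least 0.3 s apart.
min_gap = int(0.3 * fs)
steps, last = 0, -min_gap
for i in range(1, smooth.size - 1):
    if (smooth[i] > smooth[i - 1] and smooth[i] > smooth[i + 1]
            and smooth[i] > 0.3 and i - last >= min_gap):
        steps += 1
        last = i
print(steps)    # roughly 18 steps over 10 s at 1.8 steps/s
```

STEAR's observation is that the same pipeline is far cleaner on an earable IMU: the head bobs rigidly with each step, whereas a phone or watch mixes in hand and pocket motion that a fixed threshold like the one above cannot separate.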

MUTE: Bringing IoT to Noise Cancellation

MUTE exploits the velocity gap between RF and sound to improve active noise cancellation.
ACM SIGCOMM, 2018.
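The velocity gap is easy to quantify: a noise waveform relayed over RF arrives almost instantly, while the sound itself crawls through air at ~343 m/s, so the earphone gains milliseconds of lookahead to compute anti-noise. The sketch below is simple back-of-the-envelope arithmetic; the distances are illustrative assumptions.

```python
# Back-of-the-envelope arithmetic for MUTE's core idea: forwarding the
# noise waveform over RF buys lookahead time before the sound arrives.
SPEED_OF_SOUND = 343.0     # m/s, in air
SPEED_OF_LIGHT = 3.0e8     # m/s, RF propagation speed

def lookahead_ms(distance_m):
    """Lead time (ms) gained by relaying the noise over RF vs. through air."""
    t_sound = distance_m / SPEED_OF_SOUND
    t_rf = distance_m / SPEED_OF_LIGHT   # negligible at room scale
    return (t_sound - t_rf) * 1e3

# Illustrative distances between the noise source and the earphone.
for d in (1.0, 3.0, 10.0):
    print(f"{d:4.1f} m -> {lookahead_ms(d):6.2f} ms of lookahead")
```

Even at 3 m, this yields roughly 8.7 ms of advance notice, which is orders of magnitude more than the microseconds a conventional feedforward ANC microphone provides.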
Some of our work has been covered by the press:


Earable computing

a new research area in the making

The future of AR is

earbuds, not eyeglasses

Project EarSense

Finally, apps to sink your teeth into

Get to know us

Yu-Lin (Wally) Wei
PhD Student, UIUC
Zhijian Yang
PhD Student, UIUC
Hyungjoo Seo
PhD Student, UIUC
Rajalaxmi Rajagopalan
PhD Student, UIUC
Zhongweiyang (Alan) Xu
MS Student, UIUC
Avinash Subramaniam
MS Student, UIUC
Sahil Bhandary Karnoor
MS Student, UIUC
Eric Dong
Undergraduate Student, UIUC
Jaewook Lee
Undergraduate Student, UIUC
Bashima Islam
Postdoc, UIUC
Romit Roy Choudhury
Professor, ECE & CS, UIUC

Collaborators, past and present

Ziyue (Liz) Li
Alumni 2020, UIUC
Waymo @ Google
Sheng Shen
Alumni 2019, UIUC
Facebook Reality Labs
Jay Prakash
Visiting Scholar, SUTD
Haitham Hassanieh
Asst. Professor, ECE & CS, UIUC
Rakesh Kumar
Professor, ECE, UIUC
We are looking for PhD students with background in sensing, (acoustic) signal processing, communications, embedded systems, and machine learning.
