Auracle, a wearable eating-detection device (2016-date)


Related website: [Auracle-project.org]

Related projects: [Amanuensis], [Amulet], [SIMBA], [THaW], [TISH]

Related keywords: [mhealth], [sensors], [wearable]


Summary

In the Auracle project we developed a wearable device that can detect eating and related behaviors, and we explored mechanisms for a person to interact with a head-worn device like the Auracle. The goal was to support eating-behavior research by developing a comfortable, unobtrusive, ear-worn sensor to measure eating behavior in free-living conditions.

Our primary contribution was the development of the Auracle wearable device, which used a contact microphone as its primary sensor. In a free-living field test with 14 participants, we achieved accuracy exceeding 92.8% and F1 score exceeding 77.5% for eating detection. Moreover, Auracle successfully detected 20-24 eating episodes (depending on the metrics) out of 26 in free-living conditions. We demonstrated that our custom device could sense, process, and classify audio data in real time. Additionally, we estimated Auracle can last 28.1 hours with a 110 mAh battery while communicating its observations of eating behavior to a smartphone over Bluetooth [bi:ubicomp18]. We have a patent pending on this design [bi:auracle-patent].
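To illustrate how the reported numbers relate, here is a minimal sketch (with hypothetical per-minute labels, not the project's actual data) of how accuracy and F1 score are computed for binary eating detection, along with the average current draw implied by the 28.1-hour estimate on a 110 mAh battery:

```python
# Hypothetical per-minute ground truth and predictions (1 = eating).
truth = [0, 0, 1, 1, 1, 0, 0, 0, 1, 1, 0, 0]
pred  = [0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0]

tp = sum(t == 1 and p == 1 for t, p in zip(truth, pred))  # true positives
fp = sum(t == 0 and p == 1 for t, p in zip(truth, pred))  # false positives
fn = sum(t == 1 and p == 0 for t, p in zip(truth, pred))  # false negatives
tn = sum(t == 0 and p == 0 for t, p in zip(truth, pred))  # true negatives

accuracy  = (tp + tn) / len(truth)
precision = tp / (tp + fp)
recall    = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)

# Average current implied by the paper's battery estimate:
# 110 mAh over 28.1 hours is roughly 3.9 mA average draw.
avg_current_mA = 110 / 28.1
```

Note that accuracy and F1 can diverge substantially when eating minutes are rare relative to non-eating minutes, which is why the papers report both.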

More recently, we developed a computer-vision approach for eating detection in free-living scenarios. Using a miniature head-mounted camera, we collected data with 10 participants for about 55 hours. The camera was fixed under the brim of a cap, pointing to the mouth of the wearer and continuously recording video (but not audio) throughout their normal daily activity. We evaluated performance for eating detection using four different Convolutional Neural Network (CNN) models. The best model achieved 90.9% accuracy and 78.7% F1 score for eating detection with 1-minute resolution. Finally, we validated the feasibility of deploying the 3D CNN model in wearable or mobile platforms when considering computation, memory, and power constraints. A paper about this approach is available as a Technical Report [bi:video-tr] and as a conference paper [bi:vision].

We later adapted the Auracle for measuring children's eating behavior. We also improved the accuracy and robustness of the eating-activity detection algorithms. We used this improved prototype in a lab study with 10 children and achieved an accuracy exceeding 85.0% and an F1 score exceeding 84.2% for eating detection with a 3-second resolution, and a 95.5% accuracy and a 95.7% F1 score for eating detection with a 1-minute resolution [bi:children].
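The papers report results at both 3-second and 1-minute resolutions. One plausible way to relate the two (an illustration only, not necessarily the aggregation the papers used) is to combine fine-grained window predictions into per-minute labels by majority vote:

```python
# Hypothetical aggregation of 3-second window predictions into 1-minute
# labels by majority vote (20 windows of 3 s per minute). This sketch is
# an illustration, not the authors' published method.
def to_minute_labels(window_preds, windows_per_minute=20):
    minutes = []
    for i in range(0, len(window_preds), windows_per_minute):
        chunk = window_preds[i:i + windows_per_minute]
        # Label the minute "eating" if at least half its windows are.
        minutes.append(1 if sum(chunk) * 2 >= len(chunk) else 0)
    return minutes

# 40 windows = 2 minutes: mostly eating, then mostly not.
preds = [1] * 15 + [0] * 5 + [0] * 18 + [1] * 2
print(to_minute_labels(preds))  # -> [1, 0]
```

Coarser resolutions smooth over brief misclassifications, which is consistent with the higher accuracy and F1 the papers report at 1-minute resolution than at 3-second resolution.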

For more detail on all the above, see Shengjie Bi's dissertation [bi:thesis].

In 2022, we published an approach using an analog circuit to implement a neural network for chewing detection [odame:chewing].

For more information about the Auracle project, and a complete description of its contributions and papers (not just those including David Kotz and his students), see the Auracle website.

People

Shengjie Bi, Kelly Caine, Diane Gilbert-Diamond, Jun Gong, Ryan Halter, George Halvorsen, David Kotz, Yifang Li, Jesse Lin, Byron Lowens, Yiyang Lu, Travis Masterson, Colin Minor, Josephine Nordrum, Maria Nyamukuru, Kofi Odame, Ronald Peterson, Temi Prioleau, Ella Ryan, Gianna Schneider, Sougata Sen, Emily Sidnam, Brodrick Stigal, Jacob Sorber, Kevin Storer, Nicole Tobias, Dhanashree Vaidya, Marcelino Velasquez, Shang Wang, Peter Wang, Te-yen Wu, Xing-Dong Yang.

Some are included in the photo below, from September 2018:
group photo

Funding and acknowledgements

This research program was supported by the US National Science Foundation under award numbers CNS-1565269 and CNS-1835983 (Dartmouth) and CNS-1565268 and CNS-1835974 (Clemson).

The views and conclusions contained on this site and in its documents are those of the authors and should not be interpreted as necessarily representing the official position or policies, either expressed or implied, of the sponsor. Any mention of specific companies or products does not imply any endorsement by the authors or by the sponsor.


Papers (tagged 'auracle')

This list includes only papers with David Kotz as co-author or thesis advisor. For a complete list of Auracle papers, see the Auracle website.

[The list below is also available in BibTeX]

Papers are listed in reverse-chronological order; click an entry to pop up the abstract. For full information and pdf, please click Details link. Follow updates with RSS.

2022:
Kofi Odame, Maria Nyamukuru, Mohsen Shahghasemi, Shengjie Bi, and David Kotz. Analog Gated Recurrent Neural Network for Detecting Chewing Events. IEEE Transactions on Biomedical Circuits and Systems. December 2022. [Details]

We present a novel gated recurrent neural network to detect when a person is chewing on food. We implemented the neural network as a custom analog integrated circuit in a 0.18 μm CMOS technology. The neural network was trained on 6.4 hours of data collected from a contact microphone that was mounted on volunteers’ mastoid bones. When tested on 1.6 hours of previously-unseen data, the analog neural network identified chewing events at a 24-second time resolution. It achieved a recall of 91% and an F1-score of 94% while consuming 1.1 μW of power. A system for detecting whole eating episodes—like meals and snacks—that is based on the novel analog neural network consumes an estimated 18.8 μW of power.

Shengjie Bi and David Kotz. Eating detection with a head-mounted video camera. Proceedings of the IEEE International Conference on Healthcare Informatics. June 2022. [Details]

In this paper, we present a computer-vision based approach to detect eating. Specifically, our goal is to develop a wearable system that is effective and robust enough to automatically detect when people eat, and for how long. We collected video from a cap-mounted camera on 10 participants for about 55 hours in free-living conditions. We evaluated performance of eating detection with four different Convolutional Neural Network (CNN) models. The best model achieved accuracy 90.9% and F1 score 78.7% for eating detection with a 1-minute resolution. We also discuss the resources needed to deploy a 3D CNN model in wearable or mobile platforms, in terms of computation, memory, and power. We believe this paper is the first work to experiment with video-based (rather than image-based) eating detection in free-living scenarios.

2021:
Shengjie Bi and David Kotz. Eating detection with a head-mounted video camera. Technical Report, December 2021. [Details]

In this paper, we present a computer-vision based approach to detect eating. Specifically, our goal is to develop a wearable system that is effective and robust enough to automatically detect when people eat, and for how long. We collected video from a cap-mounted camera on 10 participants for about 55 hours in free-living conditions. We evaluated performance of eating detection with four different Convolutional Neural Network (CNN) models. The best model achieved accuracy 90.9% and F1 score 78.7% for eating detection with a 1-minute resolution. We also discuss the resources needed to deploy a 3D CNN model in wearable or mobile platforms, in terms of computation, memory, and power. We believe this paper is the first work to experiment with video-based (rather than image-based) eating detection in free-living scenarios.

Shengjie Bi. Detection of health-related behaviours using head-mounted devices. PhD thesis, May 2021. PhD Dissertation. [Details]

The detection of health-related behaviors is the basis of many mobile-sensing applications for healthcare and can trigger other inquiries or interventions. Wearable sensors have been widely used for mobile sensing due to their ever-decreasing cost, ease of deployment, and ability to provide continuous monitoring. In this dissertation, we develop a generalizable approach to sensing eating-related behavior.

First, we developed Auracle, a wearable earpiece that can automatically detect eating episodes. Using an off-the-shelf contact microphone placed behind the ear, Auracle captures the sound of a person chewing as it passes through the head. This audio data is then processed by a custom circuit board. We collected data with 14 participants for 32 hours in free-living conditions and achieved accuracy exceeding 92.8% and F1 score exceeding 77.5% for eating detection with 1-minute resolution.

Second, we adapted Auracle for measuring children’s eating behavior, and improved the accuracy and robustness of the eating-activity detection algorithms. We used this improved prototype in a laboratory study with a sample of 10 children for 60 total sessions and collected 22.3 hours of data in both meal and snack scenarios. Overall, we achieved 95.5% accuracy and 95.7% F1 score for eating detection with 1-minute resolution.

Third, we developed a computer-vision approach for eating detection in free-living scenarios. Using a miniature head-mounted camera, we collected data with 10 participants for about 55 hours. The camera was fixed under the brim of a cap, pointing to the mouth of the wearer and continuously recording video (but not audio) throughout their normal daily activity. We evaluated performance for eating detection using four different Convolutional Neural Network (CNN) models. The best model achieved 90.9% accuracy and 78.7% F1 score for eating detection with 1-minute resolution. Finally, we validated the feasibility of deploying the 3D CNN model in wearable or mobile platforms when considering computation, memory, and power constraints.


Shengjie Bi, Tao Wang, Nicole Tobias, Josephine Nordrum, Robert Halvorsen, Ron Peterson, Kelly Caine, Xing-Dong Yang, Kofi Odame, Ryan Halter, Jacob Sorber, and David Kotz. System for detecting eating with sensor mounted by the ear. U.S. Patent Application PCT/US2019/044317; Worldwide Patent Application WO2020028481A9, February 1, 2021. Priority date 2018-07-31; Filed 2019-07-31; Amended 2021-02-01; Rejected 2025-07-02. Abandoned. [Details]

A wearable device for detecting eating episodes uses a contact microphone to provide audio signals through an analog front end to an analog-to-digital converter to digitize the audio and provide digitized audio to a processor; and a processor configured with firmware in a memory to extract features from the digitized audio. A classifier determines eating episodes from the extracted features. In embodiments, messages describing the detected eating episodes are transmitted to a cell phone, insulin pump, or camera configured to record video of the wearer's mouth.

2020:
Shengjie Bi, Yiyang Lu, Nicole Tobias, Ella Ryan, Travis Masterson, Sougata Sen, Ryan Halter, Jacob Sorber, Diane Gilbert-Diamond, and David Kotz. Measuring children’s eating behavior with a wearable device. Proceedings of the IEEE International Conference on Healthcare Informatics (ICHI). December 2020. [Details]

Poor eating habits in children and teenagers can lead to obesity, eating disorders, or life-threatening health problems. Although researchers have studied children’s eating behavior for decades, the research community has had limited technology to support the observation and measurement of fine-grained details of a child’s eating behavior. In this paper, we present the feasibility of adapting the Auracle, an existing research-grade earpiece designed to automatically and unobtrusively recognize eating behavior in adults, for measuring children’s eating behavior. We identified and addressed several challenges pertaining to monitoring eating behavior in children, paying particular attention to device fit and comfort. We also improved the accuracy and robustness of the eating-activity detection algorithms. We used this improved prototype in a lab study with a sample of 10 children for 60 total sessions and collected 22.3 hours of data in both meal and snack scenarios. Overall, we achieved an accuracy exceeding 85.0% and an F1 score exceeding 84.2% for eating detection with a 3-second resolution, and a 95.5% accuracy and a 95.7% F1 score for eating detection with a 1-minute resolution.

2018:
Shengjie Bi, Tao Wang, Nicole Tobias, Josephine Nordrum, Shang Wang, George Halvorsen, Sougata Sen, Ronald Peterson, Kofi Odame, Kelly Caine, Ryan Halter, Jacob Sorber, and David Kotz. Auracle: Detecting Eating Episodes with an Ear-Mounted Sensor. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT) (Ubicomp). September 2018. [Details]

In this paper, we propose Auracle, a wearable earpiece that can automatically recognize eating behavior. More specifically, in free-living conditions, we can recognize when and for how long a person is eating. Using an off-the-shelf contact microphone placed behind the ear, Auracle captures the sound of a person chewing as it passes through the bone and tissue of the head. This audio data is then processed by a custom analog/digital circuit board. To ensure reliable (yet comfortable) contact between microphone and skin, all hardware components are incorporated into a 3D-printed behind-the-head framework. We collected field data with 14 participants for 32 hours in free-living conditions and additional eating data with 10 participants for 2 hours in a laboratory setting. We achieved accuracy exceeding 92.8% and F1 score exceeding 77.5% for eating detection. Moreover, Auracle successfully detected 20-24 eating episodes (depending on the metrics) out of 26 in free-living conditions. We demonstrate that our custom device could sense, process, and classify audio data in real time. Additionally, we estimate Auracle can last 28.1 hours with a 110 mAh battery while communicating its observations of eating behavior to a smartphone over Bluetooth.

2017:
Shengjie Bi, Ellen Davenport, Jun Gong, Ronald Peterson, Kevin Storer, Tao Wang, Kelly Caine, Ryan Halter, David Kotz, Kofi Odame, Jacob Sorber, and Xing-Dong Yang. Poster: Auracle --- A Wearable Device for Detecting and Monitoring Eating Behavior. Proceedings of the ACM International Conference on Mobile Systems, Applications, and Services (MobiSys). June 2017. [Details]

The Auracle aims to be a wearable earpiece that detects eating behavior, to be fielded by health-science researchers in their efforts to study eating behavior and ultimately to develop interventions useful to individuals striving to address chronic disease related to eating.

Shengjie Bi, Tao Wang, Ellen Davenport, Ronald Peterson, Ryan Halter, Jacob Sorber, and David Kotz. Toward a Wearable Sensor for Eating Detection. Proceedings of the ACM Workshop on Wearable Systems and Applications (WearSys). June 2017. [Details]

Researchers strive to understand eating behavior as a means to develop diets and interventions that can help people achieve and maintain a healthy weight, recover from eating disorders, or manage their diet and nutrition for personal wellness. A major challenge for eating-behavior research is to understand when, where, what, and how people eat. In this paper, we evaluate sensors and algorithms designed to detect eating activities, more specifically, when people eat. We compare two popular methods for eating recognition (based on acoustic and electromyography (EMG) sensors) individually and combined. We built a data-acquisition system using two off-the-shelf sensors and conducted a study with 20 participants. Our preliminary results show that the system we implemented can detect eating with an accuracy exceeding 90.9% while the crunchiness level of food varies. We are developing a wearable system that can capture, process, and classify sensor data to detect eating in real-time.


[Kotz research]