Related website: [Auracle-project.org]
Related projects: [Amanuensis], [Amulet], [SIMBA], [THaW], [TISH]
Related keywords: [mhealth], [sensors], [wearable]
In the Auracle project we developed a wearable device that can detect eating and related behaviors, and we explored mechanisms for a person to interact with a head-worn device like the Auracle. The goal was to support eating-behavior research by developing a comfortable, unobtrusive, ear-worn sensor to measure eating behavior in free-living conditions.
Our primary contribution was the development of the Auracle wearable device, which used a contact microphone as its primary sensor. In a free-living field test with 14 participants, we achieved accuracy exceeding 92.8% and F1 score exceeding 77.5% for eating detection. Moreover, Auracle successfully detected 20-24 eating episodes (depending on the metric) out of 26 in free-living conditions. We demonstrated that our custom device could sense, process, and classify audio data in real time. Additionally, we estimated that Auracle could last 28.1 hours on a 110 mAh battery while communicating its observations of eating behavior to a smartphone over Bluetooth [bi:ubicomp18]. We have a patent pending on this design [bi:auracle-patent].
More recently, we developed a computer-vision approach for eating detection in free-living scenarios. Using a miniature head-mounted camera, we collected data with 10 participants for about 55 hours. The camera was fixed under the brim of a cap, pointing at the mouth of the wearer and continuously recording video (but not audio) throughout their normal daily activities. We evaluated eating-detection performance using four different Convolutional Neural Network (CNN) models. The best model achieved 90.9% accuracy and 78.7% F1 score for eating detection with 1-minute resolution. Finally, we validated the feasibility of deploying the 3D CNN model on wearable or mobile platforms given their computation, memory, and power constraints. A paper about this approach is under review.
We later adapted the Auracle for measuring children's eating behavior. We also improved the accuracy and robustness of the eating-activity detection algorithms. We used this improved prototype in a lab study with 10 children and achieved an accuracy exceeding 85.0% and an F1 score exceeding 84.2% for eating detection with a 3-second resolution, and a 95.5% accuracy and a 95.7% F1 score for eating detection with a 1-minute resolution [bi:children].
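The metrics above are reported at a given temporal resolution (3-second or 1-minute windows). As a rough illustration of what that means, here is a minimal sketch (not the project's actual evaluation code) that aggregates per-second binary eating labels into windows by majority vote, then computes accuracy and F1 for the eating class; the function name and the majority-vote rule are assumptions for illustration only.

```python
def windowed_metrics(pred, truth, window):
    """Aggregate per-second binary labels (1 = eating) into windows of
    `window` seconds by majority vote, then return (accuracy, F1) for
    the eating class. Illustrative sketch; not Auracle's evaluation code."""
    def to_windows(labels):
        # Majority vote within each window: 1 if more than half are 1s.
        return [
            int(sum(labels[i:i + window]) * 2 > len(labels[i:i + window]))
            for i in range(0, len(labels), window)
        ]
    p, t = to_windows(pred), to_windows(truth)
    tp = sum(1 for a, b in zip(p, t) if a == 1 and b == 1)
    fp = sum(1 for a, b in zip(p, t) if a == 1 and b == 0)
    fn = sum(1 for a, b in zip(p, t) if a == 0 and b == 1)
    accuracy = sum(1 for a, b in zip(p, t) if a == b) / len(t)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, f1
```

Coarser windows tend to smooth over brief misclassifications, which is one reason 1-minute-resolution scores can run higher than 3-second-resolution scores on the same predictions.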
For more detail on all the above, see Shengjie Bi's dissertation [bi:thesis].
For more information about the Auracle project, and a complete description of its contributions and papers (not just those including David Kotz and his students), see the Auracle website.
Shengjie Bi, Kelly Caine, Diane Gilbert-Diamond, Jun Gong, Ryan Halter, George Halvorsen, David Kotz, Yifang Li, Jesse Lin, Byron Lowens, Yiyang Lu, Travis Masterson, Colin Minor, Josephine Nordrum, Maria Nyamukuru, Kofi Odame, Ronald Peterson, Temi Prioleau, Ella Ryan, Gianna Schneider, Sougata Sen, Emily Sidnam, Brodrick Stigal, Jacob Sorber, Kevin Storer, Nicole Tobias, Dhanashree Vaidya, Marcelino Velasquez, Shang Wang, Peter Wang, Te-yen Wu, Xing-Dong Yang. Some are included in the photo below, from September 2018:
This research program was supported by the National Science Foundation (NSF) under award numbers CNS-1565269 and CNS-1835983 (Dartmouth) and CNS-1565268 and CNS-1835974 (Clemson).
The views and conclusions contained on this site and in its documents are those of the authors and should not be interpreted as necessarily representing the official position or policies, either expressed or implied, of the sponsor. Any mention of specific companies or products does not imply any endorsement by the authors or by the sponsor.
This list includes only those papers with David Kotz as a co-author or thesis advisor. For a complete list of Auracle papers, see the Auracle website.
[The list below is also available in BibTeX]
Papers are listed in reverse-chronological order. Follow updates with RSS.