<?xml version="1.0"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
<channel>

<title>David Kotz papers for project 'auracle'</title>
<description>Papers from David Kotz and his research group about Auracle, a wearable eating-detection device.
</description>
<language>en-us</language>
<pubDate>Wed, 08 Apr 2026 07:50:03 +0000</pubDate>
<link>https://www.cs.dartmouth.edu/~kotz/research/project/auracle/index.html</link>
<docs>https://validator.w3.org/feed/docs/rss2.html</docs>
<atom:link href="https://www.cs.dartmouth.edu/~kotz/research/project/auracle/feed.xml" rel="self" type="application/rss+xml"/>

<item>
<title>Analog Gated Recurrent Unit Neural Network for Detecting Chewing Events</title>
<guid>odame:chewing</guid>
<pubDate>Thu, 01 Dec 2022 00:00:00 +0000</pubDate>
<description>
Kofi Odame, Maria Nyamukuru, Mohsen Shahghasemi, Shengjie Bi, and David Kotz.
 &lt;b&gt;Analog Gated Recurrent Unit Neural Network for Detecting Chewing Events.&lt;/b&gt;
 &lt;i&gt;IEEE Transactions on Biomedical Circuits and Systems&lt;/i&gt;, volume&#160;16, number&#160;6, pages&#160;1106&#8211;1115.
 IEEE, December 2022.
 doi:10.1109/TBCAS.2022.3218889.
 &lt;p&gt;&lt;b&gt;Abstract:&lt;/b&gt;
&lt;p&gt;We present a novel gated recurrent neural network to detect when a person is chewing on food. We implemented the neural network as a custom analog integrated circuit in a 0.18 &#956;m CMOS technology. The neural network was trained on 6.4 hours of data collected from a contact microphone that was mounted on volunteers&#8217; mastoid bones. When tested on 1.6 hours of previously unseen data, the analog neural network identified chewing events at a 24-second time resolution. It achieved a recall of 91% and an F1-score of 94% while consuming 1.1 &#956;W of power. A system for detecting whole eating episodes&#8212;like meals and snacks&#8212;that is based on the novel analog neural network consumes an estimated 18.8 &#956;W of power.&lt;/p&gt;&lt;/p&gt;
 
</description>
<link>https://www.cs.dartmouth.edu/~kotz/research/odame-chewing/index.html</link>
</item>

<item>
<title>Eating detection with a head-mounted video camera</title>
<guid>bi:vision</guid>
<pubDate>Wed, 01 Jun 2022 00:00:00 +0000</pubDate>
<description>
Shengjie Bi and David Kotz.
 &lt;b&gt;Eating detection with a head-mounted video camera.&lt;/b&gt;
 &lt;i&gt;Proceedings of the IEEE International Conference on Healthcare Informatics (ICHI)&lt;/i&gt;, pages&#160;60&#8211;66.
 IEEE, June 2022.
 doi:10.1109/ICHI54592.2022.00021.
 &lt;p&gt;&lt;b&gt;Abstract:&lt;/b&gt;
&lt;p&gt;In this paper, we present a computer-vision-based approach to detect eating. Specifically, our goal is to develop a wearable system that is effective and robust enough to automatically detect when people eat, and for how long. We collected video from a cap-mounted camera on 10 participants for about 55 hours in free-living conditions. We evaluated the performance of eating detection with four different Convolutional Neural Network (CNN) models. The best model achieved accuracy 90.9% and F1 score 78.7% for eating detection with a 1-minute resolution. We also discuss the resources needed to deploy a 3D CNN model in wearable or mobile platforms, in terms of computation, memory, and power. We believe this paper is the first work to experiment with video-based (rather than image-based) eating detection in free-living scenarios.&lt;/p&gt;&lt;/p&gt;
 
</description>
<link>https://www.cs.dartmouth.edu/~kotz/research/bi-vision/index.html</link>
</item>

<item>
<title>Eating detection with a head-mounted video camera</title>
<guid>bi:video-tr</guid>
<pubDate>Wed, 01 Dec 2021 00:00:00 +0000</pubDate>
<description>
Shengjie Bi and David Kotz.
 &lt;b&gt;Eating detection with a head-mounted video camera.&lt;/b&gt;
 Technical Report number&#160;TR2021-1002, Dartmouth Computer Science, December 2021.
 &lt;p&gt;&lt;b&gt;Abstract:&lt;/b&gt;
&lt;p&gt;In this paper, we present a computer-vision-based approach to detect eating. Specifically, our goal is to develop a wearable system that is effective and robust enough to automatically detect when people eat, and for how long. We collected video from a cap-mounted camera on 10 participants for about 55 hours in free-living conditions. We evaluated the performance of eating detection with four different Convolutional Neural Network (CNN) models. The best model achieved accuracy 90.9% and F1 score 78.7% for eating detection with a 1-minute resolution. We also discuss the resources needed to deploy a 3D CNN model in wearable or mobile platforms, in terms of computation, memory, and power. We believe this paper is the first work to experiment with video-based (rather than image-based) eating detection in free-living scenarios.&lt;/p&gt;&lt;/p&gt;
 
</description>
<link>https://www.cs.dartmouth.edu/~kotz/research/bi-video-tr/index.html</link>
</item>

<item>
<title>Detection of health-related behaviours using head-mounted devices</title>
<guid>bi:thesis</guid>
<pubDate>Sat, 01 May 2021 00:00:00 +0000</pubDate>
<description>
Shengjie Bi.
 &lt;b&gt;Detection of health-related behaviours using head-mounted devices.&lt;/b&gt;
 PhD thesis, Dartmouth Computer Science, Hanover, NH, May 2021.
 PhD Dissertation.
 &lt;p&gt;&lt;b&gt;Abstract:&lt;/b&gt;
&lt;p&gt;The detection of health-related behaviors is the basis of many mobile-sensing applications for healthcare and can trigger other inquiries or interventions. Wearable sensors have been widely used for mobile sensing due to their ever-decreasing cost, ease of deployment, and ability to provide continuous monitoring. In this dissertation, we develop a generalizable approach to sensing eating-related behavior. &lt;/p&gt;&lt;p&gt; First, we developed Auracle, a wearable earpiece that can automatically detect eating episodes. Using an off-the-shelf contact microphone placed behind the ear, Auracle captures the sound of a person chewing as it passes through the head. This audio data is then processed by a custom circuit board. We collected data with 14 participants for 32 hours in free-living conditions and achieved accuracy exceeding 92.8% and F1 score exceeding 77.5% for eating detection with 1-minute resolution. &lt;/p&gt;&lt;p&gt; Second, we adapted Auracle for measuring children&#8217;s eating behavior, and improved the accuracy and robustness of the eating-activity detection algorithms. We used this improved prototype in a laboratory study with a sample of 10 children for 60 total sessions and collected 22.3 hours of data in both meal and snack scenarios. Overall, we achieved 95.5% accuracy and 95.7% F1 score for eating detection with 1-minute resolution. &lt;/p&gt;&lt;p&gt; Third, we developed a computer-vision approach for eating detection in free-living scenarios. Using a miniature head-mounted camera, we collected data with 10 participants for about 55 hours. The camera was fixed under the brim of a cap, pointing to the mouth of the wearer and continuously recording video (but not audio) throughout their normal daily activity. We evaluated performance for eating detection using four different Convolutional Neural Network (CNN) models. The best model achieved 90.9% accuracy and 78.7% F1 score for eating detection with 1-minute resolution.
Finally, we validated the feasibility of deploying the 3D CNN model in wearable or mobile platforms when considering computation, memory, and power constraints.&lt;/p&gt;&lt;/p&gt;
 
</description>
<link>https://www.cs.dartmouth.edu/~kotz/research/bi-thesis/index.html</link>
</item>

<item>
<title>Measuring children&#8217;s eating behavior with a wearable device</title>
<guid>bi:children</guid>
<pubDate>Tue, 01 Dec 2020 00:00:00 +0000</pubDate>
<description>
Shengjie Bi, Yiyang Lu, Nicole Tobias, Ella Ryan, Travis Masterson, Sougata Sen, Ryan Halter, Jacob Sorber, Diane Gilbert-Diamond, and David Kotz.
 &lt;b&gt;Measuring children&#8217;s eating behavior with a wearable device.&lt;/b&gt;
 &lt;i&gt;Proceedings of the IEEE International Conference on Healthcare Informatics (ICHI)&lt;/i&gt;.
 IEEE, December 2020.
 doi:10.1109/ICHI48887.2020.9374304.
 &lt;p&gt;&lt;b&gt;Abstract:&lt;/b&gt;
&lt;p&gt;Poor eating habits in children and teenagers can lead to obesity, eating disorders, or life-threatening health problems. Although researchers have studied children&#8217;s eating behavior for decades, the research community has had limited technology to support the observation and measurement of fine-grained details of a child&#8217;s eating behavior. In this paper, we present the feasibility of adapting the Auracle, an existing research-grade earpiece designed to automatically and unobtrusively recognize eating behavior in adults, for measuring children&#8217;s eating behavior. We identified and addressed several challenges pertaining to monitoring eating behavior in children, paying particular attention to device fit and comfort. We also improved the accuracy and robustness of the eating-activity detection algorithms. We used this improved prototype in a lab study with a sample of 10 children for 60 total sessions and collected 22.3 hours of data in both meal and snack scenarios. Overall, we achieved an accuracy exceeding 85.0% and an F1 score exceeding 84.2% for eating detection with a 3-second resolution, and a 95.5% accuracy and a 95.7% F1 score for eating detection with a 1-minute resolution.&lt;/p&gt;&lt;/p&gt;
 
</description>
<link>https://www.cs.dartmouth.edu/~kotz/research/bi-children/index.html</link>
</item>

<item>
<title>Auracle: Detecting Eating Episodes with an Ear-Mounted Sensor</title>
<guid>bi:ubicomp18</guid>
<pubDate>Sat, 01 Sep 2018 00:00:00 +0000</pubDate>
<description>
Shengjie Bi, Tao Wang, Nicole Tobias, Josephine Nordrum, Shang Wang, George Halvorsen, Sougata Sen, Ronald Peterson, Kofi Odame, Kelly Caine, Ryan Halter, Jacob Sorber, and David Kotz.
 &lt;b&gt;Auracle: Detecting Eating Episodes with an Ear-Mounted Sensor.&lt;/b&gt;
 &lt;i&gt;Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT) (UbiComp)&lt;/i&gt;, volume&#160;2, number&#160;3, article&#160;92, 27&#160;pages.
 ACM, September 2018.
 doi:10.1145/3264902.
 &lt;p&gt;&lt;b&gt;Abstract:&lt;/b&gt;
&lt;p&gt;In this paper, we propose Auracle, a wearable earpiece that can automatically recognize eating behavior. More specifically, in free-living conditions, we can recognize when and for how long a person is eating. Using an off-the-shelf contact microphone placed behind the ear, Auracle captures the sound of a person chewing as it passes through the bone and tissue of the head. This audio data is then processed by a custom analog/digital circuit board. To ensure reliable (yet comfortable) contact between microphone and skin, all hardware components are incorporated into a 3D-printed behind-the-head framework. We collected field data with 14 participants for 32 hours in free-living conditions and additional eating data with 10 participants for 2 hours in a laboratory setting. We achieved accuracy exceeding 92.8% and F1 score exceeding 77.5% for eating detection. Moreover, Auracle successfully detected 20&#8211;24 eating episodes (depending on the metrics) out of 26 in free-living conditions. We demonstrate that our custom device could sense, process, and classify audio data in real time. Additionally, we estimate Auracle can last 28.1 hours with a 110 mAh battery while communicating its observations of eating behavior to a smartphone over Bluetooth.&lt;/p&gt;&lt;/p&gt;
 
</description>
<link>https://www.cs.dartmouth.edu/~kotz/research/bi-ubicomp18/index.html</link>
</item>

<item>
<title>Poster: Auracle &#8212; A Wearable Device for Detecting and Monitoring Eating Behavior</title>
<guid>bi:mobisys17</guid>
<pubDate>Thu, 01 Jun 2017 00:00:00 +0000</pubDate>
<description>
Shengjie Bi, Ellen Davenport, Jun Gong, Ronald Peterson, Joseph Skinner, Kevin Storer, Tao Wang, Kelly Caine, Ryan Halter, David Kotz, Kofi Odame, Jacob Sorber, and Xing-Dong Yang.
 &lt;b&gt;Poster: Auracle &#8212; A Wearable Device for Detecting and Monitoring Eating Behavior.&lt;/b&gt;
 &lt;i&gt;Proceedings of the ACM International Conference on Mobile Systems, Applications, and Services (MobiSys)&lt;/i&gt;, page&#160;176.
 ACM, June 2017.
 doi:10.1145/3081333.3089320.
 &lt;p&gt;&lt;b&gt;Abstract:&lt;/b&gt;
&lt;p&gt;The Auracle aims to be a wearable earpiece that detects eating behavior, to be fielded by health-science researchers in their efforts to study eating behavior and ultimately to develop interventions useful to individuals striving to address chronic disease related to eating.&lt;/p&gt;&lt;/p&gt;
 
</description>
<link>https://www.cs.dartmouth.edu/~kotz/research/bi-mobisys17/index.html</link>
</item>

<item>
<title>Toward a Wearable Sensor for Eating Detection</title>
<guid>bi:wearsys17</guid>
<pubDate>Thu, 01 Jun 2017 00:00:00 +0000</pubDate>
<description>
Shengjie Bi, Tao Wang, Ellen Davenport, Ronald Peterson, Ryan Halter, Jacob Sorber, and David Kotz.
 &lt;b&gt;Toward a Wearable Sensor for Eating Detection.&lt;/b&gt;
 &lt;i&gt;Proceedings of the ACM Workshop on Wearable Systems and Applications (WearSys)&lt;/i&gt;, pages&#160;17&#8211;22.
 ACM, June 2017.
 doi:10.1145/3089351.3089355.
 &lt;p&gt;&lt;b&gt;Abstract:&lt;/b&gt;
&lt;p&gt;Researchers strive to understand eating behavior as a means to develop diets and interventions that can help people achieve and maintain a healthy weight, recover from eating disorders, or manage their diet and nutrition for personal wellness. A major challenge for eating-behavior research is to understand when, where, what, and how people eat. In this paper, we evaluate sensors and algorithms designed to detect eating activities, more specifically, when people eat. We compare two popular methods for eating recognition (based on acoustic and electromyography (EMG) sensors) individually and combined. We built a data-acquisition system using two off-the-shelf sensors and conducted a study with 20 participants. Our preliminary results show that the system we implemented can detect eating with an accuracy exceeding 90.9% while the crunchiness level of food varies. We are developing a wearable system that can capture, process, and classify sensor data to detect eating in real time.&lt;/p&gt;&lt;/p&gt;
 
</description>
<link>https://www.cs.dartmouth.edu/~kotz/research/bi-wearsys17/index.html</link>
</item>

</channel>
</rss>
