Paper: Analog Neural Network for Detecting Chewing

Analog chip to detect eating!

Monitoring food intake and eating habits is important for managing and understanding obesity, diabetes, and eating disorders. Because self-reporting eating habits is cumbersome and tedious, wearable devices that automatically monitor and record dietary habits are an attractive alternative. The challenge is that these devices store or transmit raw data for offline processing, a power-consumptive approach that requires a bulky battery or frequent charging; both intrude on the user’s normal daily activities and make the devices prone to poor user adherence and acceptance.

In this paper, we present a novel analog integrated circuit long short-term memory (LSTM) neural network for embedded eating-event detection that eliminates the need for a power-consumptive analog-to-digital converter (ADC). Unlike previous analog LSTM implementations, our solution contains no internal DACs, ADCs, op-amps, or Hadamard multiplications. Our novel approach is based on a current-mode adaptive filter, and it eliminates over 90% of the power requirements of a more conventional solution. This opens up the possibility of unobtrusive, battery-less wearable devices for long-term monitoring of dietary habits.
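For readers unfamiliar with the model family: the computation that such an analog circuit approximates in current mode is the standard digital LSTM step, sketched below. The dimensions and random weights here are purely illustrative and are not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One step of a standard (digital) LSTM cell.

    x: input vector; h: previous hidden state; c: previous cell state.
    W, U, b hold the stacked parameters for the four gates
    (input, forget, candidate, output).
    """
    n = h.shape[0]
    z = W @ x + U @ h + b       # stacked pre-activations for all gates
    i = sigmoid(z[0:n])         # input gate
    f = sigmoid(z[n:2*n])       # forget gate
    g = np.tanh(z[2*n:3*n])     # candidate cell update
    o = sigmoid(z[3*n:4*n])     # output gate
    c_new = f * c + i * g       # elementwise (Hadamard) products
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Toy sizes and random weights -- illustrative only.
rng = np.random.default_rng(0)
d_in, d_h = 3, 4
W = rng.standard_normal((4 * d_h, d_in))
U = rng.standard_normal((4 * d_h, d_h))
b = np.zeros(4 * d_h)
h, c = np.zeros(d_h), np.zeros(d_h)
h, c = lstm_step(rng.standard_normal(d_in), h, c, W, U, b)
```

The Hadamard products and gate nonlinearities in this loop are exactly the operations that are expensive to implement with conventional mixed-signal building blocks, which is what motivates the paper's current-mode adaptive-filter formulation.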

You can find this paper along with other publications from the Auracle group on Zotero.

Odame, Kofi, Maria Nyamukuru, Mohsen Shahghasemi, Shengjie Bi, and David Kotz. “Analog Gated Recurrent Neural Network for Detecting Chewing Events.” IEEE Transactions on Biomedical Circuits and Systems 16, no. 6 (December 2022): 1106–15. https://doi.org/10.1109/TBCAS.2022.3218889.

Eating detection with a head-mounted video camera

In this paper, we present a computer-vision-based approach to detect eating. Specifically, our goal is to develop a wearable system that is effective and robust enough to automatically detect when people eat, and for how long. We collected video from a cap-mounted camera on 10 participants for about 55 hours in free-living conditions. We evaluated the performance of eating detection with four different Convolutional Neural Network (CNN) models. The best model achieved 90.9% accuracy and a 78.7% F1 score for eating detection at 1-minute resolution. We also discuss the resources needed to deploy a 3D CNN model on wearable or mobile platforms, in terms of computation, memory, and power. We believe this paper is the first work to experiment with video-based (rather than image-based) eating detection in free-living scenarios.
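For reference, the two metrics reported above are computed from per-minute confusion counts as follows. The counts in this sketch are hypothetical and are not the paper's data.

```python
def accuracy(tp, fp, fn, tn):
    """Fraction of minutes labeled correctly (eating or not eating)."""
    return (tp + tn) / (tp + fp + fn + tn)

def f1(tp, fp, fn):
    """Harmonic mean of precision and recall for the eating class."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical confusion counts over 1000 one-minute windows.
tp, fp, fn, tn = 70, 20, 18, 892
print(round(accuracy(tp, fp, fn, tn), 3))  # 0.962
print(round(f1(tp, fp, fn), 3))            # 0.787
```

Note that with a rare positive class such as eating, accuracy can be high even when F1 is modest, which is why the paper reports both.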

To see more from the Auracle research group, check out our publications on Zotero.

Bi, Shengjie and Kotz, David, “Eating detection with a head-mounted video camera” (2021). Computer Science Technical Report TR2021-1002. Dartmouth College. https://digitalcommons.dartmouth.edu/cs_tr/384

New Auracle Dissertation by Byron Lowens

We are proud to announce another Auracle team member’s successful dissertation defense, and to share his doctoral thesis. Dr. Byron Lowens’ dissertation focuses on understanding how to develop privacy control mechanisms that provide adopters (and potential adopters) of wearables with integrated, in-the-moment control over personal information collected by wearables. Lowens describes the four different studies he conducted, on individual preferences on data sharing, the impact of the location of privacy control and decision timing, device-independent interactions to control data privacy, and on noticeability of identified interactions. His findings offer privacy researchers and designers of wearable technologies insight into the future development of wearables.

To learn more, check out Lowens’ dissertation below.

Lowens, Byron M., “Interaction Techniques for In-the-Moment Privacy Control Over Data Generated by Wearable Technologies” (2021). Clemson University Dissertations: 2894. 
https://tigerprints.clemson.edu/all_dissertations/2894

New Auracle Dissertation by Shengjie Bi

PhD dissertation

We are proud to announce Dr. Shengjie Bi’s successful dissertation defense and to share his doctoral thesis. Bi’s dissertation focuses on a generalizable approach to sensing eating-related behavior. Bi describes the creation of Auracle, a wearable earpiece that can automatically detect eating episodes, its adaptation to measure children’s eating behavior, and improvements in eating-activity detection algorithms. Bi also describes the development of a computer-vision approach for eating detection in free-living scenarios.

To learn more, check out Bi’s dissertation below.

Bi, Shengjie, “Detection of Health-Related Behaviours Using Head-Mounted Devices” (2021). Dartmouth College Ph.D. Dissertations: 75.
https://digitalcommons.dartmouth.edu/dissertations/75

Measuring children’s eating behavior with Auracle

The Auracle device previously enabled us to automatically and unobtrusively recognize eating behavior in adults. The Auracle team recognized the need to adapt this technology to measure children’s eating behavior and to bolster research efforts focused on adolescents’ eating behaviors.

We identified and addressed several challenges pertaining to monitoring eating behavior in children, paying particular attention to device fit and comfort. We also improved the accuracy and robustness of the eating-activity detection algorithms.

Check out the 4-minute video below to see graduate student Shengjie Bi’s presentation of our research at IEEE’s International Conference on Healthcare Informatics (ICHI). To read the paper, check out the link at the bottom of this post.

#NSFStories

Shengjie Bi presents Measuring children’s eating behavior with a wearable device at ICHI.

Shengjie Bi, Yiyang Lu, Nicole Tobias, Ella Ryan, Travis Masterson, Sougata Sen, Ryan Halter, Jacob Sorber, Diane Gilbert-Diamond, and David Kotz. Measuring children’s eating behavior with a wearable device. Proceedings of the IEEE International Conference on Healthcare Informatics (ICHI). IEEE, December 2020. ©Copyright IEEE. DOI: https://doi.org/10.1109/ICHI48887.2020.9374304

Capacitivo: Contact-based Object Recognition on Interactive Fabrics Using Capacitive Sensing

Check out this video showcasing Capacitivo.

What if your tablecloth could recognize what is on the table and provide you with useful information? You’re running out of the house and your tablecloth, of all things, reminds you to take your sunglasses. When you come home, your tablecloth detects whether the plant on it needs to be watered and then later updates your diet tracking app when you pour yourself a glass of apple cider. This could be the future of Capacitivo.

Unlike prior work that has focused on metallic object recognition, our technique recognizes non-metallic objects such as food, different types of fruits, liquids, and other objects that are often found around a home or in a workplace.

Te-Yen Wu, Lu Tan, Yuji Zhang, Teddy Seyed, and Xing-Dong Yang. Capacitivo: Contact-based Object Recognition on Interactive Fabrics Using Capacitive Sensing. In Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology (UIST), October 2020, pp. 649–661. DOI: https://doi.org/10.1145/3379337.3415829

#NSFStories