Eating detection with a head-mounted video camera

In this paper, we present a computer-vision-based approach to detect eating. Specifically, our goal is to develop a wearable system that is effective and robust enough to automatically detect when people eat, and for how long. We collected about 55 hours of video from a cap-mounted camera worn by 10 participants in free-living conditions. We evaluated eating-detection performance with four different Convolutional Neural Network (CNN) models. The best model achieved 90.9% accuracy and a 78.7% F1 score for eating detection at 1-minute resolution. We also discuss the computation, memory, and power resources needed to deploy a 3D CNN model on wearable or mobile platforms. We believe this paper is the first to experiment with video-based (rather than image-based) eating detection in free-living scenarios.
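For readers curious what a video-based eating classifier of this kind might look like in code, here is a minimal sketch in PyTorch. The layer sizes, clip shape, and two-class output are illustrative assumptions, not the architecture evaluated in the paper.

```python
# Minimal sketch of a 3D CNN video classifier (illustrative only; not the
# model from the paper). Assumes PyTorch is installed.
import torch
import torch.nn as nn

class Tiny3DCNN(nn.Module):
    """Classifies a short video clip as eating vs. not-eating."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1),  # RGB -> 16 feature maps
            nn.ReLU(),
            nn.MaxPool3d(2),                             # halve time and space
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),                     # global average pool
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, clip: torch.Tensor) -> torch.Tensor:
        # clip: (batch, channels, frames, height, width)
        x = self.features(clip).flatten(1)
        return self.classifier(x)

# Example: one 16-frame, 112x112 clip sampled from a 1-minute window.
model = Tiny3DCNN()
clip = torch.randn(1, 3, 16, 112, 112)
logits = model(clip)                 # shape (1, 2): not-eating vs. eating
print(logits.softmax(dim=-1))
```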

To see more from the Auracle research group, check out our publications on Zotero.

Bi, Shengjie and Kotz, David, “Eating detection with a head-mounted video camera” (2021). Computer Science Technical Report TR2021-1002. Dartmouth College. https://digitalcommons.dartmouth.edu/cs_tr/384

Capacitivo: Contact-based Object Recognition on Interactive Fabrics Using Capacitive Sensing

Check out this video showcasing Capacitivo.

What if your tablecloth could recognize what is on the table and provide you with useful information? You’re running out of the house and your tablecloth, of all things, reminds you to take your sunglasses. When you come home, your tablecloth detects whether the plant on it needs to be watered and then later updates your diet tracking app when you pour yourself a glass of apple cider. This could be the future of Capacitivo.

Unlike prior work that has focused on metallic object recognition, our technique recognizes non-metallic objects such as food, different types of fruits, liquids, and other objects that are often found around a home or in a workplace.
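As a rough illustration of contact-based recognition on a capacitive fabric, one frame of capacitance readings from the grid can be treated as a small image and matched against stored object templates. The grid size, template set, and nearest-neighbor matching below are assumptions for illustration, not Capacitivo's actual pipeline.

```python
# Hypothetical sketch: recognize an object from one frame of capacitance
# readings on a fabric grid by nearest-neighbor matching against templates.
# The 12x12 grid and the template set are illustrative assumptions.
import numpy as np

def normalize(frame: np.ndarray) -> np.ndarray:
    """Zero-mean, unit-variance normalization of one capacitance frame."""
    return (frame - frame.mean()) / (frame.std() + 1e-9)

def recognize(frame: np.ndarray, templates: dict) -> str:
    """Return the label of the template closest (L2 distance) to this frame."""
    flat = normalize(frame).ravel()
    distances = {
        label: np.linalg.norm(flat - normalize(t).ravel())
        for label, t in templates.items()
    }
    return min(distances, key=distances.get)

# Example with made-up 12x12 capacitance maps (synthetic data).
rng = np.random.default_rng(0)
templates = {name: rng.random((12, 12)) for name in ("mug", "apple", "empty")}
reading = templates["apple"] + 0.05 * rng.standard_normal((12, 12))
print(recognize(reading, templates))   # -> "apple"
```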

Te-Yen Wu,  Lu Tan,  Yuji Zhang,  Teddy Seyed,  Xing-Dong Yang. Capacitivo: Contact-based Object Recognition on Interactive Fabrics Using Capacitive Sensing. In Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology (UIST). Oct. 2020,  pp. 649–661. DOI: 10.1145/3379337.3415829

#NSFStories

Tessutivo: Contextual Interactions on Interactive Fabrics with Inductive Sensing

The Auracle team recently presented its latest work at UIST.

The Auracle team recently presented its latest work at UIST, the ACM Symposium on User Interface Software and Technology.

Abstract: We present Tessutivo, a contact-based inductive sensing technique for contextual interactions on interactive fabrics. Our technique recognizes conductive objects (mainly metallic) that are commonly found in households and workplaces, such as keys, coins, and electronic devices. We built a prototype containing a six-by-six grid of spiral-shaped coils made of conductive thread, sewn onto a four-layer fabric structure. We carefully designed the coil shape parameters to maximize sensitivity based on a new inductance approximation formula. Through a ten-participant study, we evaluated the performance of our proposed sensing technique across 27 common objects, yielding 93.9% real-time accuracy for object recognition. We conclude by presenting several applications to demonstrate the unique interactions enabled by our technique.
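The coil design step above hinges on estimating inductance from coil geometry. The paper introduces its own approximation formula, which is not reproduced here; as a point of reference, the snippet below computes the widely used modified-Wheeler approximation for a square planar spiral, just to show how turn count and inner/outer dimensions feed into an inductance estimate.

```python
# Sketch: inductance of a square planar spiral coil via the standard
# modified-Wheeler approximation (Mohan et al., 1999). This is NOT the new
# approximation formula introduced in the Tessutivo paper.
from math import pi

MU0 = 4 * pi * 1e-7   # vacuum permeability, H/m

def spiral_inductance(n_turns: int, d_out: float, d_in: float) -> float:
    """Approximate inductance (henries) of a square planar spiral coil.

    n_turns: number of turns
    d_out:   outer side length in meters
    d_in:    inner side length in meters
    """
    d_avg = (d_out + d_in) / 2
    fill_ratio = (d_out - d_in) / (d_out + d_in)
    k1, k2 = 2.34, 2.75            # constants for a square layout
    return k1 * MU0 * n_turns**2 * d_avg / (1 + k2 * fill_ratio)

# Example: a 10-turn coil with 30 mm outer and 10 mm inner side length.
print(f"{spiral_inductance(10, 0.030, 0.010) * 1e6:.2f} uH")  # ~2.5 uH
```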

DOI: 10.1145/3332165.3347897

Auracle overview (video)

David Kotz recently presented an invited webinar lecture that provides an overview of Auracle and the results of some of our experiments validating it.

David Kotz recently presented an invited webinar lecture in the Mobile Data to Knowledge (MD2K) program. The second half of that lecture (starting at 23:29) provides an overview of Auracle and the results of some of our experiments validating it. (The first half describes the Amulet project – also worth checking out!)

Indutivo, at UIST’18

The Auracle team presented an innovative sensing technology for interacting with wearable devices at UIST 2018 in Berlin, Germany.

Abstract: We present Indutivo, a contact-based inductive sensing technique for contextual interactions. Our technique recognizes conductive objects (primarily metallic) that are commonly found in households and daily environments, as well as their individual movements when placed against the sensor. These movements include sliding, hinging, and rotation. We describe our sensing principle and how we designed the size, shape, and layout of our sensor coils to optimize sensitivity, sensing range, and recognition and tracking accuracy. Through several studies, we demonstrate the performance of our proposed sensing technique in environments with varying levels of noise and interference. We conclude by presenting demo applications on a smartwatch, as well as insights and lessons we learned from our experience.
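To make the movement-recognition idea concrete, here is a toy sketch of how a sliding movement might be inferred from a sequence of per-coil inductance shifts. The coil count, thresholds, and the assumption that the strongest response follows the object are all illustrative; this is not the Indutivo implementation.

```python
# Hypothetical sketch: infer a sliding movement from per-coil inductance
# shifts over time. Coil count and thresholds are illustrative assumptions.
import numpy as np

def peak_coil(shifts: np.ndarray) -> int:
    """Index of the coil with the largest inductance shift in one frame."""
    return int(np.argmax(np.abs(shifts)))

def is_sliding(frames: np.ndarray, min_travel: int = 2) -> bool:
    """True if the peak response moves steadily across coils over time.

    frames: (time, n_coils) array of inductance shifts from baseline.
    """
    path = np.array([peak_coil(f) for f in frames])
    steps = np.diff(path)
    monotonic = np.all(steps >= 0) or np.all(steps <= 0)
    return bool(monotonic and abs(path[-1] - path[0]) >= min_travel)

# Example: a synthetic object whose peak moves one coil per frame.
frames = np.zeros((6, 6))
for t in range(6):
    frames[t, t] = 1.0
print(is_sliding(frames))   # -> True
```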

Read the full paper in the ACM Digital Library:

Jun Gong, Xin Yang, Teddy Seyed, Josh Urban Davis, and Xing-Dong Yang. 2018. Indutivo: Contact-Based, Object-Driven Interactions with Inductive Sensing. In Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology (UIST ’18). ACM, pp. 321–333. DOI: https://doi.org/10.1145/3242587.3242662

Watch the presentation:

Pyro: Thumb-Tip Gesture Recognition Using Pyroelectric Infrared Sensing

The Auracle team presented an innovative technique for interacting with wearable devices at the UIST conference last week.

Abstract: We present Pyro, a micro thumb-tip gesture recognition technique based on thermal infrared signals radiating from the fingers. Pyro uses a compact, low-power passive sensor, making it suitable for wearable and mobile applications. To demonstrate the feasibility of Pyro, we developed a self-contained prototype consisting of the infrared pyroelectric sensor, a custom sensing circuit, and software for signal processing and machine learning. A ten-participant user study yielded a 93.9% cross-validation accuracy and 84.9% leave-one-session-out accuracy on six thumb-tip gestures. Subsequent lab studies demonstrated Pyro’s robustness to varying light conditions, hand temperatures, and background motion. We conclude by discussing the insights we gained from this work and future research questions.
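The two accuracy numbers above come from different evaluation protocols: standard cross-validation mixes data from all recording sessions, while leave-one-session-out holds out an entire session at a time, which is harder and closer to real-world use. Here is a small sketch of a leave-one-session-out evaluation; the random feature vectors and SVM classifier are placeholders, not Pyro's actual signal-processing pipeline.

```python
# Sketch of a leave-one-session-out evaluation. Features and classifier are
# illustrative placeholders; only the evaluation protocol mirrors the text.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: one feature vector per gesture sample (random placeholders here),
# y: gesture label (0..5 for six thumb-tip gestures),
# sessions: which recording session each sample came from.
rng = np.random.default_rng(0)
X = rng.standard_normal((120, 8))
y = rng.integers(0, 6, size=120)
sessions = np.repeat(np.arange(4), 30)        # 4 sessions x 30 samples each

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=sessions):
    clf.fit(X[train_idx], y[train_idx])                  # train on 3 sessions
    scores.append(clf.score(X[test_idx], y[test_idx]))   # test on the held-out one
print(f"leave-one-session-out accuracy: {np.mean(scores):.3f}")
```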

Read the full paper in the ACM Digital Library:

J. Gong, Y. Zhang, X. Zhou, and X.-D. Yang. Pyro: Thumb-Tip Gesture Recognition Using Pyroelectric Infrared Sensing. In Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology (UIST). ACM Press, Oct. 2017, pp. 553–563. DOI: https://doi.org/10.1145/3126594.3126615

Watch the video: