Tiny Eats

In April the Auracle team published a paper demonstrating that a real-time eating-detection algorithm can run on a low-power microcontroller.

Maria T. Nyamukuru and Kofi Odame. Tiny Eats: Eating Detection on a Microcontroller. In IEEE Workshop on Machine Learning on Edge in Sensor Systems (SenSys-ML), April 2020. IEEE. DOI: 10.1109/SenSysML50931.2020.00011

Abstract: There is a growing interest in low power highly efficient wearable devices for automatic dietary monitoring (ADM). The success of deep neural networks in audio event classification problems makes them ideal for this task. Deep neural networks are, however, not only computationally intensive and energy inefficient but also require a large amount of memory. To address these challenges, we propose a shallow gated recurrent unit (GRU) architecture suitable for resource-constrained applications. This paper describes the implementation of the Tiny Eats GRU, a shallow GRU neural network, on a low power microcontroller, Arm Cortex M0+, to classify eating episodes. Tiny Eats GRU is a hybrid of the traditional GRU and eGRU which makes it small and fast enough to fit on the Arm Cortex M0+ with comparable accuracy to the traditional GRU. The Tiny Eats GRU utilizes only 4% of the Arm Cortex M0+ memory and identifies eating or non-eating episodes with 6 ms latency and accuracy of 95.15%.
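
The exact Tiny Eats hybrid cell is described in the paper; as a rough illustration of what one GRU update involves, here is a minimal NumPy sketch of a single GRU step, with softsign standing in for the cheaper nonlinearities that embedded variants such as eGRU favor over tanh. The weight shapes and inputs below are random placeholders, not the paper's trained model.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def softsign(x):
        # Cheap tanh substitute used by some embedded GRU variants
        return x / (1.0 + np.abs(x))

    def gru_step(x, h, Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh):
        z = sigmoid(Wz @ x + Uz @ h + bz)                # update gate
        r = sigmoid(Wr @ x + Ur @ h + br)                # reset gate
        h_cand = softsign(Wh @ x + Uh @ (r * h) + bh)    # candidate state
        return (1.0 - z) * h + z * h_cand                # new hidden state

    # Toy usage with random weights; a real model learns these in training.
    rng = np.random.default_rng(0)
    n_in, n_hid = 8, 16
    shapes = [(n_hid, n_in), (n_hid, n_hid), (n_hid,)] * 3
    params = [0.1 * rng.standard_normal(s) for s in shapes]
    h = np.zeros(n_hid)
    for frame in rng.standard_normal((10, n_in)):  # e.g., audio feature frames
        h = gru_step(frame, h, *params)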

Fabriccio: Touchless Gestural Input on Interactive Fabrics

The Auracle team published a new paper at CHI’20 about interactive fabrics. Such interaction modes may be useful in future head-worn devices, including Auracle.

Te-Yen Wu, Shutong Qi, Junchi Chen, MuJie Shang, Jun Gong, Teddy Seyed, and Xing-Dong Yang. Fabriccio: Touchless Gestural Input on Interactive Fabrics. In Proceedings of the Conference on Human Factors in Computing Systems (CHI), April 2020. Association for Computing Machinery. DOI: 10.1145/3313831.3376681

Tessutivo: Contextual Interactions on Interactive Fabrics with Inductive Sensing

The Auracle team recently presented its latest work at UIST, the ACM Symposium on User Interface Software and Technology.

Abstract: We present Tessutivo, a contact-based inductive sensing technique for contextual interactions on interactive fabrics. Our technique recognizes conductive objects (mainly metallic) that are commonly found in households and workplaces, such as keys, coins, and electronic devices. We built a prototype containing six by six spiral-shaped coils made of conductive thread, sewn onto a four-layer fabric structure. We carefully designed the coil shape parameters to maximize the sensitivity based on a new inductance approximation formula. Through a ten-participant study, we evaluated the performance of our proposed sensing technique across 27 common objects. We yielded 93.9% real-time accuracy for object recognition. We conclude by presenting several applications to demonstrate the unique interactions enabled by our technique.

DOI: 10.1145/3332165.3347897
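
The paper derives its own inductance approximation to tune the coil geometry; that formula is in the paper itself, but as a general illustration of the kind of calculation involved, here is the standard modified Wheeler approximation for a planar square spiral (Mohan et al., 1999) in Python. The example dimensions are made up.

    import math

    MU0 = 4 * math.pi * 1e-7  # permeability of free space, H/m

    def square_spiral_inductance(n_turns, d_out, d_in):
        """Modified Wheeler approximation for a planar square spiral coil.
        d_out and d_in are the outer/inner diameters in meters.
        This is a generic textbook formula, not the paper's own."""
        k1, k2 = 2.34, 2.75                     # coefficients for a square layout
        d_avg = 0.5 * (d_out + d_in)
        rho = (d_out - d_in) / (d_out + d_in)   # fill ratio
        return k1 * MU0 * n_turns ** 2 * d_avg / (1 + k2 * rho)

    # A hypothetical 10-turn coil, 30 mm outer / 10 mm inner diameter:
    print(square_spiral_inductance(10, 0.030, 0.010))  # ~2.5e-06 H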

Auracle overview (video)

David Kotz recently presented an invited webinar lecture in the Mobile Data to Knowledge (MD2K) program. The second half of that lecture (starting at 23:29) provides an overview of Auracle and the results of some of our experiments validating it. (The first half describes the Amulet project – also worth checking out!)

Indutivo, at UIST’18

The Auracle team presented an innovative sensing technology for interacting with wearable devices at UIST 2018 in Berlin, Germany.

Abstract: We present Indutivo, a contact-based inductive sensing technique for contextual interactions. Our technique recognizes conductive objects (metallic primarily) that are commonly found in households and daily environments, as well as their individual movements when placed against the sensor. These movements include sliding, hinging, and rotation. We describe our sensing principle and how we designed the size, shape, and layout of our sensor coils to optimize sensitivity, sensing range, recognition and tracking accuracy. Through several studies, we also demonstrated the performance of our proposed sensing technique in environments with varying levels of noise and interference conditions. We conclude by presenting demo applications on a smartwatch, as well as insights and lessons we learned from our experience.

Read the full paper in ACM digital library:

Jun Gong, Xin Yang, Teddy Seyed, Josh Urban Davis, and Xing-Dong Yang. 2018. Indutivo: Contact-Based, Object-Driven Interactions with Inductive Sensing. In Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology (UIST ’18). ACM, pp. 321-333. DOI: https://doi.org/10.1145/3242587.3242662
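
The paper's recognition pipeline is more sophisticated, but the core idea of identifying an object from the inductance change it induces at each coil can be sketched as nearest-template matching. Every object name and value below is invented for illustration; a real system would calibrate templates per object and per coil.

    # Toy nearest-template recognizer over per-coil inductance shifts.
    TEMPLATES = {
        "key":         [0.8, 0.1, 0.0],
        "coin":        [0.3, 0.3, 0.2],
        "earbud case": [0.1, 0.6, 0.5],
    }

    def recognize(shift_vector):
        # Pick the template with the smallest squared distance to the reading.
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return min(TEMPLATES, key=lambda name: dist(TEMPLATES[name], shift_vector))

    print(recognize([0.75, 0.15, 0.05]))  # -> "key"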

Ubicomp’18: Detecting Eating Episodes with an Ear-Mounted Sensor

Our paper will appear in the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT) and will be presented at Ubicomp’18.

Abstract: In this paper, we propose Auracle, a wearable earpiece that can automatically recognize eating behavior. More specifically, in free-living conditions, we can recognize when and for how long a person is eating. Using an off-the-shelf contact microphone placed behind the ear, Auracle captures the sound of a person chewing as it passes through the bone and tissue of the head. This audio data is then processed by a custom analog/digital circuit board. To ensure reliable (yet comfortable) contact between microphone and skin, all hardware components are incorporated into a 3D-printed behind-the-head framework. We collected field data with 14 participants for 32 hours in free-living conditions and additional eating data with 10 participants for 2 hours in a laboratory setting. We achieved accuracy exceeding 92.8% and F1 score exceeding 77.5% for eating detection. Moreover, Auracle successfully detected 20-24 eating episodes (depending on the metrics) out of 26 in free-living conditions. We demonstrate that our custom device could sense, process, and classify audio data in real time. Additionally, we estimate Auracle can last 28.1 hours with a 110 mAh battery while communicating its observations of eating behavior to a smartphone over Bluetooth.

Shengjie Bi, Tao Wang, Nicole Tobias, Josephine Nordrum, Shang Wang, George Halvorsen, Sougata Sen, Ronald Peterson, Kofi Odame, Kelly Caine, Ryan Halter, Jacob Sorber, and David Kotz. Auracle: Detecting Eating Episodes with an Ear-Mounted Sensor. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT) (Ubicomp), 2(3), September 2018. DOI: 10.1145/3264902
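
The battery-life estimate follows from first-order arithmetic: runtime is battery capacity divided by average current draw. Back-calculating (our arithmetic, not a figure from the paper), 28.1 hours on a 110 mAh battery implies an average draw of roughly 3.9 mA. A minimal sketch:

    def battery_life_hours(capacity_mah, avg_current_ma):
        # First-order estimate: ignores voltage sag, aging, and duty-cycle detail.
        return capacity_mah / avg_current_ma

    implied_draw_ma = 110 / 28.1                      # ~3.91 mA (back-calculated)
    print(battery_life_hours(110, implied_draw_ma))   # ~28.1 hours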

Pyro: Thumb-Tip Gesture Recognition Using Pyroelectric Infrared Sensing

The Auracle team presented an innovative new technique for interacting with wearable devices at the UIST conference last week.

Abstract: We present Pyro, a micro thumb-tip gesture recognition technique based on thermal infrared signals radiating from the fingers. Pyro uses a compact, low-power passive sensor, making it suitable for wearable and mobile applications. To demonstrate the feasibility of Pyro, we developed a self-contained prototype consisting of the infrared pyroelectric sensor, a custom sensing circuit, and software for signal processing and machine learning. A ten-participant user study yielded a 93.9% cross-validation accuracy and 84.9% leave-one-session-out accuracy on six thumb-tip gestures. Subsequent lab studies demonstrated Pyro’s robustness to varying light conditions, hand temperatures, and background motion. We conclude by discussing the insights we gained from this work and future research questions.

Read the full paper in ACM digital library:

Jun Gong, Yang Zhang, Xia Zhou, and Xing-Dong Yang. Pyro: Thumb-Tip Gesture Recognition Using Pyroelectric Infrared Sensing. In Proceedings of the Annual ACM Symposium on User Interface Software and Technology (UIST), pp. 553-563, October 2017. ACM Press. DOI: 10.1145/3126594.3126615
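
The gap between the 93.9% cross-validation and 84.9% leave-one-session-out figures reflects how the data is split: leave-one-session-out holds out an entire recording session at a time, so the classifier cannot exploit session-specific quirks. Here is a minimal sketch of that evaluation protocol with scikit-learn; the random features, labels, and the random-forest classifier are placeholders, not the paper's actual pipeline.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import LeaveOneGroupOut

    rng = np.random.default_rng(0)
    X = rng.standard_normal((120, 16))       # placeholder feature vectors
    y = rng.integers(0, 6, size=120)         # six gesture labels
    sessions = np.repeat(np.arange(4), 30)   # session index for each sample

    scores = []
    for train, test in LeaveOneGroupOut().split(X, y, groups=sessions):
        clf = RandomForestClassifier(random_state=0).fit(X[train], y[train])
        scores.append(accuracy_score(y[test], clf.predict(X[test])))
    print(np.mean(scores))  # chance-level here, since the data is random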

Auracle team retreat

A large subset of the Auracle team gathered in Hanover for a very productive full-day retreat, mapping out the coming year’s research agenda and making plans for new prototypes and papers. Great to have (almost) everyone in one place!

Most of the Auracle team in October 2017; L to R: David Kotz, Blake Thrower, Byron Lowens, Josie Nordrum, Liam Feeney, Kofi Odame, Nicole Tobias, Ron Peterson, Shengjie Bi, Jacob Sorber, Peter Wang, Ryan Halter. Absent: Kelly Caine, Jun Gong, Robert Halvorsen, XD Yang.