We are proud to announce another Auracle team member’s successful dissertation defense and to share his doctoral thesis. Dr. Byron Lowens’ dissertation focuses on developing privacy-control mechanisms that give adopters (and potential adopters) of wearables integrated, in-the-moment control over the personal information those devices collect. Lowens describes the four studies he conducted: on individual data-sharing preferences, on the impact of the location of privacy controls and the timing of privacy decisions, on device-independent interactions for controlling data privacy, and on the noticeability of the identified interactions. His findings offer privacy researchers and designers of wearable technologies insight into the future development of wearables.
To learn more, check out Lowens’ dissertation below.
We are proud to announce Dr. Shengjie Bi’s successful dissertation defense and to share his doctoral thesis. Bi’s dissertation focuses on a generalizable approach to sensing eating-related behavior. Bi describes the creation of Auracle, a wearable earpiece that can automatically detect eating episodes, its adaptation to measure children’s eating behavior, and improvements in eating-activity detection algorithms. Bi also describes the development of a computer-vision approach for eating detection in free-living scenarios.
The Auracle device previously enabled us to automatically and unobtrusively recognize eating behavior in adults. The Auracle team recognized the need to adapt this technology to measure children’s eating behavior and to bolster research efforts focused on adolescents’ eating behaviors.
We identified and addressed several challenges pertaining to monitoring eating behavior in children, paying particular attention to device fit and comfort. We also improved the accuracy and robustness of the eating-activity detection algorithms.
Check out the 4-minute video below to see graduate student Shengjie Bi’s presentation of our research at IEEE’s International Conference on Healthcare Informatics (ICHI). To read the paper, check out the link at the bottom of this post.
What if your tablecloth could recognize what is on the table and provide you with useful information? You’re running out of the house and your tablecloth, of all things, reminds you to take your sunglasses. When you come home, your tablecloth detects whether the plant on it needs to be watered and then later updates your diet tracking app when you pour yourself a glass of apple cider. This could be the future of Capacitivo.
Check out this video showcasing Capacitivo. Unlike prior work, which has focused on metallic objects, our technique recognizes non-metallic objects such as food, fruit, liquids, and other objects often found around a home or workplace.
Te-Yen Wu, Lu Tan, Yuji Zhang, Teddy Seyed, Xing-Dong Yang. Capacitivo: Contact-based Object Recognition on Interactive Fabrics Using Capacitive Sensing. In Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology (UIST). Oct. 2020, pp. 649–661. DOI: 10.1145/3379337.3415829
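To give a feel for contact-based recognition on a sensing fabric, here is a minimal sketch of the general idea: match a grid of per-electrode capacitance readings against stored per-object templates. This is an illustrative nearest-template matcher, not the paper’s actual pipeline (Capacitivo trains a learned classifier on its electrode grid); the function names and grid shape are assumptions.

```python
import numpy as np

def recognize(reading, templates):
    """Match a grid of per-electrode capacitance readings against
    stored per-object templates; return the best-matching label.

    reading:   2-D array of capacitance values (one per electrode).
    templates: dict mapping object label -> template array of the
               same shape, e.g. averaged from training placements.
    """
    def norm(a):
        # Normalize each grid to reduce sensitivity to baseline drift.
        a = np.asarray(a, dtype=float)
        return (a - a.mean()) / (a.std() + 1e-9)

    r = norm(reading)
    # Pick the template with the smallest Euclidean distance.
    return min(templates, key=lambda k: np.linalg.norm(r - norm(templates[k])))
```

A reading that activates roughly the same electrodes as a stored template is assigned that template’s label; a real system would add a rejection threshold for unknown objects.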
In April the Auracle team published a paper demonstrating that it is possible to run a real-time eating-detection algorithm on a low-power microcontroller.
Maria T. Nyamukuru and Kofi Odame. Tiny Eats: Eating Detection on a Microcontroller. In Proceedings of the IEEE Workshop on Machine Learning on Edge in Sensor Systems (SenSys-ML), April 2020. DOI: 10.1109/SenSysML50931.2020.00011
Abstract: There is a growing interest in low power highly efficient wearable devices for automatic dietary monitoring (ADM). The success of deep neural networks in audio event classification problems makes them ideal for this task. Deep neural networks are, however, not only computationally intensive and energy inefficient but also require a large amount of memory. To address these challenges, we propose a shallow gated recurrent unit (GRU) architecture suitable for resource-constrained applications. This paper describes the implementation of the Tiny Eats GRU, a shallow GRU neural network, on a low power microcontroller, Arm Cortex M0+, to classify eating episodes. Tiny Eats GRU is a hybrid of the traditional GRU and eGRU which makes it small and fast enough to fit on the Arm Cortex M0+ with comparable accuracy to the traditional GRU. The Tiny Eats GRU utilizes only 4% of the Arm Cortex M0+ memory and identifies eating or non-eating episodes with 6 ms latency and accuracy of 95.15%.
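For readers unfamiliar with GRUs, the sketch below shows the kind of shallow, single-layer GRU classifier the abstract describes: a small recurrent cell processes a sequence of audio-feature frames and a linear readout scores the window as eating or non-eating. The hidden size, feature size, and weights here are illustrative placeholders, not the Tiny Eats architecture or its trained parameters (which also use a modified eGRU-style cell and fixed-point arithmetic to fit the Cortex M0+).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyGRUClassifier:
    """Shallow (single-layer) GRU followed by a linear readout.

    Sizes and random weights are illustrative only; a deployed model
    would load trained, quantized parameters.
    """
    def __init__(self, n_features=8, n_hidden=16, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1
        # Update gate (z), reset gate (r), and candidate-state parameters.
        self.Wz = rng.normal(0, s, (n_hidden, n_features))
        self.Uz = rng.normal(0, s, (n_hidden, n_hidden))
        self.Wr = rng.normal(0, s, (n_hidden, n_features))
        self.Ur = rng.normal(0, s, (n_hidden, n_hidden))
        self.Wh = rng.normal(0, s, (n_hidden, n_features))
        self.Uh = rng.normal(0, s, (n_hidden, n_hidden))
        self.w_out = rng.normal(0, s, n_hidden)
        self.n_hidden = n_hidden

    def forward(self, frames):
        """frames: (T, n_features) sequence of audio-feature vectors.
        Returns the estimated probability that the window is eating."""
        h = np.zeros(self.n_hidden)
        for x in frames:
            z = sigmoid(self.Wz @ x + self.Uz @ h)        # update gate
            r = sigmoid(self.Wr @ x + self.Ur @ h)        # reset gate
            h_cand = np.tanh(self.Wh @ x + self.Uh @ (r * h))
            h = (1 - z) * h + z * h_cand                  # blend old/new state
        return sigmoid(self.w_out @ h)
```

Because the cell keeps only one small hidden vector and a handful of weight matrices, both the memory footprint and per-frame compute stay tiny, which is what makes this class of model a fit for a microcontroller.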
The Auracle team published a new paper at CHI’20 on interactive fabrics. Such interaction modes may be useful in future head-worn devices, including Auracle.
Te-Yen Wu, Shutong Qi, Junchi Chen, MuJie Shang, Jun Gong, Teddy Seyed, and Xing-Dong Yang. Fabriccio: Touchless Gestural Input on Interactive Fabrics. In Proceedings of the Conference on Human Factors in Computing Systems (CHI), April 2020. Association for Computing Machinery. DOI: 10.1145/3313831.3376681
Abstract: We present Tessutivo, a contact-based inductive sensing technique for contextual interactions on interactive fabrics. Our technique recognizes conductive objects (mainly metallic) that are commonly found in households and workplaces, such as keys, coins, and electronic devices. We built a prototype containing six by six spiral-shaped coils made of conductive thread, sewn onto a four-layer fabric structure. We carefully designed the coil shape parameters to maximize the sensitivity based on a new inductance approximation formula. Through a ten-participant study, we evaluated the performance of our proposed sensing technique across 27 common objects. We yielded 93.9% real-time accuracy for object recognition. We conclude by presenting several applications to demonstrate the unique interactions enabled by our technique.
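The abstract mentions sizing the coils with an inductance approximation formula. The paper derives its own refinement, which is not reproduced here; as background, the snippet below implements the standard modified-Wheeler approximation for a planar square spiral coil (Mohan et al.), the usual starting point for this kind of design.

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability (H/m)

def spiral_inductance(n_turns, d_out, d_in, k1=2.34, k2=2.75):
    """Modified-Wheeler approximation for a planar *square* spiral coil.

    n_turns: number of turns; d_out, d_in: outer/inner diameter (meters).
    k1, k2 are the layout-dependent constants for square spirals.
    Returns inductance in henries.
    """
    d_avg = (d_out + d_in) / 2
    rho = (d_out - d_in) / (d_out + d_in)  # fill ratio of the winding
    return k1 * MU0 * n_turns ** 2 * d_avg / (1 + k2 * rho)
```

For example, a 10-turn square spiral with a 20 mm outer and 10 mm inner diameter comes out on the order of a few microhenries; higher inductance (more turns, larger area) generally improves sensitivity, which is the trade-off the paper tunes.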
David Kotz recently presented an invited webinar lecture in the Mobile Data to Knowledge (MD2K) program. The second half of that lecture (starting at 23:29) provides an overview of the Auracle and results of some of our experiments in validating Auracle. (The first half describes the Amulet project – also worth checking out!)
The Auracle team presented an innovative sensing technology for interacting with wearable devices at UIST 2018 in Berlin, Germany.
Abstract: We present Indutivo, a contact-based inductive sensing technique for contextual interactions. Our technique recognizes conductive objects (metallic primarily) that are commonly found in households and daily environments, as well as their individual movements when placed against the sensor. These movements include sliding, hinging, and rotation. We describe our sensing principle and how we designed the size, shape, and layout of our sensor coils to optimize sensitivity, sensing range, recognition and tracking accuracy. Through several studies, we also demonstrated the performance of our proposed sensing technique in environments with varying levels of noise and interference conditions. We conclude by presenting demo applications on a smartwatch, as well as insights and lessons we learned from our experience.
Read the full paper in ACM digital library:
Jun Gong, Xin Yang, Teddy Seyed, Josh Urban Davis, and Xing-Dong Yang. Indutivo: Contact-Based, Object-Driven Interactions with Inductive Sensing. In Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology (UIST). Oct. 2018, pp. 321–333. DOI: 10.1145/3242587.3242662