The Auracle team presented an innovative technique for interacting with wearable devices at the UIST conference last week.
Abstract: We present Pyro, a micro thumb-tip gesture recognition technique based on thermal infrared signals radiating from the fingers. Pyro uses a compact, low-power passive sensor, making it suitable for wearable and mobile applications. To demonstrate the feasibility of Pyro, we developed a self-contained prototype consisting of the infrared pyroelectric sensor, a custom sensing circuit, and software for signal processing and machine learning. A ten-participant user study yielded a 93.9% cross-validation accuracy and 84.9% leave-one-session-out accuracy on six thumb-tip gestures. Subsequent lab studies demonstrated Pyro’s robustness to varying light conditions, hand temperatures, and background motion. We conclude by discussing the insights we gained from this work and future research questions.
J. Gong, Y. Zhang, X. Zhou, and X.-D. Yang, “Pyro: Thumb-tip gesture recognition using pyroelectric infrared sensing,” in Proceedings of the Annual ACM Symposium on User Interface Software and Technology (UIST). ACM Press, Oct. 2017, pp. 553–563. Available: http://dx.doi.org/10.1145/3126594.3126615
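The abstract reports both cross-validation and leave-one-session-out accuracy; the latter is the stricter test, since the classifier is evaluated on a recording session it never saw during training. A minimal sketch of that evaluation protocol, using synthetic placeholder features and session labels (not the authors' data or model):

```python
# Hypothetical sketch of leave-one-session-out evaluation, as mentioned in
# the Pyro abstract. Features, labels, and session IDs are synthetic
# placeholders, not the authors' dataset or classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 16))          # e.g. features from IR signal windows
y = rng.integers(0, 6, size=120)        # six thumb-tip gesture classes
sessions = np.repeat(np.arange(4), 30)  # four recording sessions

# Each fold holds out one entire session, so test data comes from a
# session the model never trained on -- stricter than plain k-fold.
scores = cross_val_score(RandomForestClassifier(random_state=0),
                         X, y, groups=sessions, cv=LeaveOneGroupOut())
print(scores.mean())
```

With random features the score hovers near chance; on real, session-consistent gesture data the gap between k-fold and leave-one-session-out accuracy (93.9% vs. 84.9% in the paper) reflects how much a model overfits to per-session conditions.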
A large subset of the Auracle team gathered in Hanover for a very productive full-day retreat, mapping out the coming year’s research agenda and making plans for new prototypes and papers. Great to have everyone (almost everyone) in one place!
David Kotz was recently quoted in a brief story about the Auracle project on the Engineering Innovation Podcast and Radio Series from the National Academy of Engineering, hosted by WTOP News. Listen on their website.
Abstract: Researchers strive to understand eating behavior as a means to develop diets and interventions that can help people achieve and maintain a healthy weight, recover from eating disorders, or manage their diet and nutrition for personal wellness. A major challenge for eating-behavior research is to understand when, where, what, and how people eat. In this paper, we evaluate sensors and algorithms designed to detect eating activities, more specifically, when people eat. We compare two popular methods for eating recognition (based on acoustic and electromyography (EMG) sensors) individually and combined. We built a data-acquisition system using two off-the-shelf sensors and conducted a study with 20 participants. Our preliminary results show that the system we implemented can detect eating with an accuracy exceeding 90.9% across foods of varying crunchiness. We are developing a wearable system that can capture, process, and classify sensor data to detect eating in real time.