Pyro: Thumb-Tip Gesture Recognition Using Pyroelectric Infrared Sensing

At the ACM UIST conference last week, the Auracle team presented a new technique for interacting with wearable devices.

Abstract: We present Pyro, a micro thumb-tip gesture recognition technique based on thermal infrared signals radiating from the fingers. Pyro uses a compact, low-power passive sensor, making it suitable for wearable and mobile applications. To demonstrate the feasibility of Pyro, we developed a self-contained prototype consisting of the infrared pyroelectric sensor, a custom sensing circuit, and software for signal processing and machine learning. A ten-participant user study yielded a 93.9% cross-validation accuracy and 84.9% leave-one-session-out accuracy on six thumb-tip gestures. Subsequent lab studies demonstrated Pyro's robustness to varying light conditions, hand temperatures, and background motion. We conclude by discussing the insights we gained from this work and future research questions.

Read the full paper in the ACM Digital Library:

J. Gong, Y. Zhang, X. Zhou, and X.-D. Yang, "Pyro: Thumb-tip gesture recognition using pyroelectric infrared sensing," in Proceedings of the Annual ACM Symposium on User Interface Software and Technology (UIST). ACM Press, Oct. 2017, pp. 553–563. Available: http://dx.doi.org/10.1145/3126594.3126615
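
The abstract quotes two accuracy figures that measure different things: ordinary cross-validation can mix samples from the same recording session across training and test folds, while leave-one-session-out holds out an entire session, which is why the second number is lower. The short scikit-learn sketch below illustrates that distinction; the SVM classifier, feature dimensions, and array names are placeholder assumptions, not the paper's actual pipeline.

    # Illustrative sketch only: the paper's exact features and classifier are not given
    # in the abstract, so the SVM, the 32-dimensional features, and the array names here
    # are placeholder assumptions.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    # Assume each gesture sample has been reduced to a fixed-length feature vector
    # computed from the pyroelectric signal, with a gesture label and a session id.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 32))          # placeholder features: 600 samples x 32 dims
    y = rng.integers(0, 6, 600)             # six thumb-tip gesture classes
    sessions = rng.integers(0, 5, 600)      # recording session each sample came from

    clf = SVC(kernel="rbf")

    # Ordinary 10-fold cross-validation: folds can mix samples from the same session.
    cv_acc = cross_val_score(clf, X, y, cv=10).mean()

    # Leave-one-session-out: each fold holds out one entire session, a stricter test
    # of how well the classifier generalizes to data recorded at another time.
    loso_acc = cross_val_score(clf, X, y, groups=sessions, cv=LeaveOneGroupOut()).mean()

    print(f"10-fold CV accuracy:            {cv_acc:.3f}")
    print(f"Leave-one-session-out accuracy: {loso_acc:.3f}")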

WearSys paper and presentation

Today, Dartmouth graduate student Shengjie Bi presented an Auracle paper, Toward a Wearable Sensor for Eating Detection, at the ACM Workshop on Wearable Systems and Applications (WearSys) in Niagara Falls, NY.

Abstract: Researchers strive to understand eating behavior as a means to develop diets and interventions that can help people achieve and maintain a healthy weight, recover from eating disorders, or manage their diet and nutrition for personal wellness. A major challenge for eating-behavior research is to understand when, where, what, and how people eat. In this paper, we evaluate sensors and algorithms designed to detect eating activities, more specifically, when people eat. We compare two popular methods for eating recognition (based on acoustic and electromyography (EMG) sensors) individually and combined. We built a data-acquisition system using two off-the-shelf sensors and conducted a study with 20 participants. Our preliminary results show that the system we implemented can detect eating with an accuracy exceeding 90.9% while the crunchiness level of food varies. We are developing a wearable system that can capture, process, and classify sensor data to detect eating in real-time.
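
The abstract compares acoustic and EMG sensing individually and combined. As a rough sketch of that comparison (the random-forest classifier and the placeholder feature arrays below are assumptions, not the authors' published pipeline), the same evaluation can be run on each feature set and on their concatenation:

    # Illustrative sketch only: the feature extraction and classifier are placeholder
    # assumptions, not the authors' published pipeline. It only shows the shape of the
    # "individual vs. combined" comparison described in the abstract.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_windows = 1000
    acoustic = rng.normal(size=(n_windows, 20))   # placeholder per-window acoustic features
    emg = rng.normal(size=(n_windows, 8))         # placeholder per-window EMG features
    labels = rng.integers(0, 2, n_windows)        # 1 = eating, 0 = not eating

    clf = RandomForestClassifier(n_estimators=100, random_state=0)

    feature_sets = {
        "acoustic only": acoustic,
        "EMG only": emg,
        "combined": np.hstack([acoustic, emg]),
    }
    for name, feats in feature_sets.items():
        acc = cross_val_score(clf, feats, labels, cv=5).mean()
        print(f"{name}: mean 5-fold accuracy {acc:.3f}")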