In April the Auracle team published a paper demonstrating that a real-time eating-detection algorithm can run on a low-power microcontroller.
Maria T. Nyamukuru and Kofi Odame. Tiny Eats: Eating Detection on a Microcontroller. In IEEE Workshop on Machine Learning on Edge in Sensor Systems (SenSys-ML), April 2020. DOI: 10.1109/SenSysML50931.2020.00011
Abstract: There is a growing interest in low power highly efficient wearable devices for automatic dietary monitoring (ADM). The success of deep neural networks in audio event classification problems makes them ideal for this task. Deep neural networks are, however, not only computationally intensive and energy inefficient but also require a large amount of memory. To address these challenges, we propose a shallow gated recurrent unit (GRU) architecture suitable for resource-constrained applications. This paper describes the implementation of the Tiny Eats GRU, a shallow GRU neural network, on a low power microcontroller, Arm Cortex M0+, to classify eating episodes. Tiny Eats GRU is a hybrid of the traditional GRU and eGRU which makes it small and fast enough to fit on the Arm Cortex M0+ with comparable accuracy to the traditional GRU. The Tiny Eats GRU utilizes only 4% of the Arm Cortex M0+ memory and identifies eating or non-eating episodes with 6 ms latency and accuracy of 95.15%.
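As a rough illustration of what a single GRU time step involves on a device like this, here is a minimal, generic sketch in C with float arithmetic, placeholder dimensions (IN_DIM, HID_DIM), and zero-filled weights. This is not the Tiny Eats hybrid cell itself; the paper describes its particular gate structure and the changes borrowed from eGRU that make it fit on the Cortex M0+.

```c
/*
 * Minimal sketch of one GRU time step in plain C.
 * Generic and illustrative only: placeholder dimensions, zero weights,
 * float arithmetic. Not the Tiny Eats hybrid cell from the paper.
 */
#include <math.h>
#include <stdio.h>

#define IN_DIM  8   /* hypothetical input feature size per frame */
#define HID_DIM 16  /* hypothetical hidden state size            */

/* Weights would come from offline training; zeros here as placeholders. */
static float Wz[HID_DIM][IN_DIM], Uz[HID_DIM][HID_DIM], bz[HID_DIM];
static float Wr[HID_DIM][IN_DIM], Ur[HID_DIM][HID_DIM], br[HID_DIM];
static float Wh[HID_DIM][IN_DIM], Uh[HID_DIM][HID_DIM], bh[HID_DIM];

static float sigmoidf(float x) { return 1.0f / (1.0f + expf(-x)); }

/* One GRU time step: updates the hidden state h in place from input x. */
static void gru_step(const float x[IN_DIM], float h[HID_DIM])
{
    float z[HID_DIM], r[HID_DIM], h_new[HID_DIM];

    /* Update gate z = sigma(Wz x + Uz h + bz), reset gate r likewise. */
    for (int i = 0; i < HID_DIM; i++) {
        float az = bz[i], ar = br[i];
        for (int j = 0; j < IN_DIM; j++)  { az += Wz[i][j] * x[j]; ar += Wr[i][j] * x[j]; }
        for (int j = 0; j < HID_DIM; j++) { az += Uz[i][j] * h[j]; ar += Ur[i][j] * h[j]; }
        z[i] = sigmoidf(az);
        r[i] = sigmoidf(ar);
    }

    /* Candidate state, with the reset gate applied to the recurrent term
     * after the matrix multiply (the variant used in common DL libraries),
     * then the usual interpolation between old state and candidate. */
    for (int i = 0; i < HID_DIM; i++) {
        float ax = bh[i], ah = 0.0f;
        for (int j = 0; j < IN_DIM; j++)  ax += Wh[i][j] * x[j];
        for (int j = 0; j < HID_DIM; j++) ah += Uh[i][j] * h[j];
        float n = tanhf(ax + r[i] * ah);
        h_new[i] = (1.0f - z[i]) * h[i] + z[i] * n;
    }

    for (int i = 0; i < HID_DIM; i++) h[i] = h_new[i];
}

int main(void)
{
    float x[IN_DIM]  = {0};   /* one frame of audio features (placeholder) */
    float h[HID_DIM] = {0};   /* hidden state, carried across frames       */

    gru_step(x, h);           /* in a real pipeline: call once per frame   */
    printf("h[0] after one step: %f\n", h[0]);
    return 0;
}
```

One practical note: the Cortex M0+ has no floating-point unit, so an actual on-device implementation would presumably use fixed-point or otherwise quantized arithmetic rather than the float math shown here; this sketch only shows the data flow of a GRU update.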