Fu, Biying and Jarms, Lennart and Kirchbuchner, Florian and Kuijper, Arjan (2020) ExerTrack—Towards Smart Surfaces to Track Exercises. Technologies, 8 (1). p. 17. ISSN 2227-7080
Abstract
The concept of the quantified self has gained popularity in recent years with the rise of miniaturized gadgets for monitoring fitness levels. Smartwatches, smartphone apps, and other fitness trackers are flooding the market. Most aerobic exercises such as walking, running, or cycling can be accurately recognized using wearable devices. However, whole-body exercises such as push-ups, bridges, and sit-ups are performed on the ground and thus cannot be precisely recognized with a single worn accelerometer. A floor-based approach is therefore preferable for recognizing whole-body activities. Computer vision techniques on image data also report high recognition accuracy; however, the presence of a camera tends to raise privacy issues in public areas. We therefore focus on combining the advantages of ubiquitous proximity sensing with non-optical sensors, preserving privacy in public areas while keeping computation cost low through a sparse sensor implementation. Our solution is ExerTrack, an off-the-shelf sports mat equipped with eight sparsely distributed capacitive proximity sensors that recognizes eight whole-body fitness exercises with a user-independent recognition accuracy of 93.5% and a user-dependent recognition accuracy of 95.1%, based on a test study with 9 participants each performing 2 full sessions. We adopt a template-based approach to count repetitions and reach a user-independent counting accuracy of 93.6%. The final model runs on a Raspberry Pi 3 in real time. This work covers the data processing of the proposed system, model selection to improve recognition accuracy, and a data augmentation technique to regularize the network.
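The abstract describes the repetition counter only as "template-based". Purely as an illustration of that idea, the sketch below counts peaks in the normalized cross-correlation between a recorded single-repetition template and a one-channel sensor signal; the function name, threshold, and minimum peak spacing are assumptions for this example, not details taken from the paper.

```python
import numpy as np
from scipy.signal import find_peaks

def count_repetitions(signal, template, threshold=0.7):
    """Illustrative template-based repetition counter (not the authors' code).

    signal   : 1-D array, one capacitive sensor channel over a session
    template : 1-D array, one recorded repetition of the same exercise
    Returns the number of correlation peaks above `threshold`.
    """
    # Normalize the template once (zero mean, unit variance).
    t = (template - template.mean()) / (template.std() + 1e-8)
    n, m = len(signal), len(template)

    # Slide the template over the signal and compute a normalized
    # correlation score in [-1, 1] for every window position.
    scores = np.zeros(n - m + 1)
    for i in range(n - m + 1):
        w = signal[i:i + m]
        w = (w - w.mean()) / (w.std() + 1e-8)
        scores[i] = np.dot(w, t) / m

    # Each repetition should yield one dominant peak, roughly one
    # template length apart; the distance constraint suppresses
    # secondary peaks within the same repetition.
    peaks, _ = find_peaks(scores, height=threshold, distance=m // 2)
    return len(peaks)
```

In practice such a counter would be run per recognized exercise class, with a template recorded for that exercise; the threshold and peak-distance values here are placeholders.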
| Item Type: | Article |
|---|---|
| Subjects: | Institute Archives > Multidisciplinary |
| Depositing User: | Managing Editor |
| Date Deposited: | 31 Mar 2023 04:31 |
| Last Modified: | 03 Feb 2024 04:02 |
| URI: | http://eprint.subtopublish.com/id/eprint/1967 |