06.05.2022 | Brief Communication
Validation of human activity recognition using a convolutional neural network on accelerometer and gyroscope data
By:
Eni Hysenllari, Jörg Ottenbacher, Darren McLennan
Published in: German Journal of Exercise and Sport Research | Issue 2/2022
Abstract
Background
Human activity recognition (HAR) means classifying sequences of data recorded by specialized wearable sensors into known, well-defined classes of physical activity. In principle, activity recognition provides great societal benefits, especially in real-life, human-centric applications such as healthcare and care of the elderly. Using raw acceleration and angular velocity to train a convolutional neural network has shown great success in recognition accuracy. This article presents the quality of activity recognition obtained using a convolutional neural network on acceleration and angular velocity data recorded from different sensor locations.
Methods
Thirty-five volunteers from two studies (16 women and 19 men) with an average age of 28.54 years wore Move4/EcgMove4 sensors at six different body positions (ankle, thigh, hip, wrist, upper arm, chest) while completing typical activities (sitting, standing, lying, walking, jogging, cycling). We then used those databases to evaluate a two-dimensional convolutional neural network (2D-CNN) that takes 3D acceleration and 3D angular velocity signals as inputs to recognize human activity. We measured the network's performance using accuracy and Cohen's κ.
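As a rough illustration of how such a network's input could be prepared, the sketch below stacks the 3-axis acceleration and 3-axis angular velocity streams into fixed-length 2D windows of shape (window length × 6 channels), the kind of array a 2D-CNN would consume. The window length and step size here are illustrative assumptions, not parameters reported in the article.

```python
import numpy as np

def make_windows(acc, gyr, window_len, step):
    """Stack 3-axis acceleration and 3-axis angular velocity into
    overlapping 2D windows of shape (window_len, 6).

    acc, gyr: arrays of shape (n_samples, 3), sampled in lockstep.
    """
    signal = np.concatenate([acc, gyr], axis=1)  # (n_samples, 6)
    starts = range(0, len(signal) - window_len + 1, step)
    return np.stack([signal[s:s + window_len] for s in starts])

# Hypothetical example: 256 synchronized samples, 128-sample windows,
# 50% overlap -> 3 windows of shape (128, 6).
acc = np.zeros((256, 3))
gyr = np.ones((256, 3))
windows = make_windows(acc, gyr, window_len=128, step=64)
```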
Results
Depending on the location of the sensor, the accuracy of the network varies from 96.57% (ankle) to 99.28% (thigh), and Cohen's κ varies from 0.96 (ankle) to 0.99 (thigh).
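For reference, Cohen's κ as reported here is the standard chance-corrected agreement measure, κ = (p_o − p_e)/(1 − p_e), where p_o is observed agreement and p_e is the agreement expected by chance. A minimal NumPy sketch (not the authors' implementation):

```python
import numpy as np

def cohens_kappa(y_true, y_pred, n_classes):
    """Chance-corrected agreement between true and predicted labels."""
    # Confusion matrix: rows = true class, columns = predicted class
    cm = np.zeros((n_classes, n_classes), dtype=float)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    n = cm.sum()
    p_o = np.trace(cm) / n  # observed agreement
    p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2  # chance agreement
    return (p_o - p_e) / (1 - p_e)
```

Perfect agreement gives κ = 1; κ near 0 means performance no better than chance, so values of 0.96 to 0.99 indicate almost perfect agreement.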
Conclusions
The performance of the 2D-CNN concerning human activity recognition showed excellent results. Using raw signals may enable real-time, on-device (also known as "at the edge") activity recognition even on small devices with low computational power and limited storage.