Abstract:
Human activity recognition (HAR) based on wearable sensors has emerged as an active research topic in artificial intelligence and pattern recognition. HAR has a wide range of applications, including sports activity detection, smart homes, and health assistance, to name a few. Mobile device sensors such as accelerometers, gyroscopes, and magnetometers can generate time-series data for HAR. Computer Vision (CV) methods were previously utilised for HAR, but they suffer from a number of drawbacks, including limited mobility, sensitivity to ambient conditions, occlusion, higher cost, and, most importantly, privacy concerns. Using sensor data instead of conventional computer vision techniques offers various advantages and is widely believed to overcome virtually all of these limitations. The use of Machine Learning (ML) and Deep Neural Networks (DNN) to recognise human activity from inertial sensor data is well established in the literature. In this paper, we introduce HARResNeXT, a novel convolutional neural network inspired by ResNeXt, which classifies human activities from smartphone inertial sensor data. The presented model has been evaluated on a dataset by the WISDM (Wireless Sensor Data Mining) Lab, achieving 97% precision, recall, and F1-score, with an average accuracy of 96.62%. Comparison with previous studies shows that the presented model outperforms the state of the art.
Index Terms: Activity Classification, Inertial Sensor-based Classification, Human Activity Recognition (HAR), ResNeXt, Ambient Assisted Living.
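The ResNeXt idea the abstract refers to is "split-transform-merge": a residual block aggregates several parallel convolutional branches (the cardinality) and adds an identity shortcut. The following is a minimal numpy sketch of that idea applied to a 1-D inertial signal (e.g. 3-axis accelerometer channels); the function names and shapes are illustrative assumptions, not the paper's actual HARResNeXT architecture.

```python
import numpy as np

def conv1d(x, w):
    # Valid 1-D convolution: x is (in_channels, length), w is (out_channels, in_channels, k).
    out_ch, in_ch, k = w.shape
    length = x.shape[1] - k + 1
    out = np.zeros((out_ch, length))
    for o in range(out_ch):
        for t in range(length):
            out[o, t] = np.sum(w[o] * x[:, t:t + k])
    return out

def resnext_block(x, branch_weights):
    # Split-transform-merge: sum the outputs of parallel branches
    # (len(branch_weights) is the cardinality), then add the identity
    # shortcut and apply ReLU. 'Same' padding keeps shapes compatible.
    k = branch_weights[0].shape[2]
    pad = k // 2
    xp = np.pad(x, ((0, 0), (pad, pad)))
    y = sum(conv1d(xp, w) for w in branch_weights)
    return np.maximum(y + x, 0.0)  # residual add + ReLU

# Example: a window of 3-axis accelerometer data, cardinality 4.
rng = np.random.default_rng(0)
window = rng.standard_normal((3, 10))                      # (channels, time steps)
branches = [rng.standard_normal((3, 3, 3)) * 0.1 for _ in range(4)]
out = resnext_block(window, branches)                      # same shape as input
```

A real implementation would use a deep-learning framework's grouped convolutions rather than explicit Python loops; this sketch only makes the aggregation-plus-shortcut structure concrete.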
Page(s):
1-1
Published in:
IEEE International Conference on Digital Futures and Transformative Technologies (ICoDT2), May 24-26, 2022 (Book of Abstracts), Volume: 1, Issue: 1, Year: 2022