Hybrid Optimized GRU-ECNN Models for Gait Recognition with Wearable IOT Devices

Comput Intell Neurosci. 2022 May 13;2022:5422428. doi: 10.1155/2022/5422428. eCollection 2022.

ABSTRACT

With the advent of the Internet of Things (IoT), human-assistive technologies in healthcare have reached widespread application in diagnosis and treatment. These devices must be aware of human movements to provide better aid in clinical applications as well as in the user's daily activities. In this context, real-time gait analysis remains a key catalyst for developing intelligent assistive devices. Aided by machine and deep learning algorithms, gait recognition systems have improved significantly in recognition accuracy. However, most existing models focus on improving gait recognition while ignoring the computational overhead, which degrades detection accuracy and makes them unsuitable for real-time implementation. In this paper, we propose a hybrid gated recurrent unit (GRU) based on BAT-inspired extreme convolutional networks (BAT-ECN) for the effective recognition of human activities from gait data. The gait data are collected using wearable Internet of Things (WIoT) devices. A novel combination of GRU and ECN networks is then employed to extract spatio-temporal features, which are used for classification to realize gait recognition. Extensive and comprehensive experiments have been carried out to evaluate the proposed model on real-time datasets as well as on benchmarks such as the whuGait and OU-ISIR datasets. To demonstrate the merit of the proposed learning model, we compared its performance with that of other existing hybrid models. Results show that the proposed model outperforms the other learning models in terms of gait classification accuracy and computational overhead.
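The pipeline described in the abstract — convolutional extraction of spatial features from wearable sensor sequences, a GRU for temporal modeling, and a classifier on top — can be sketched as below. This is a minimal illustrative sketch only: all dimensions, the single-layer structure, the random weights, and names such as `conv1d_relu` and `gru_final_state` are assumptions for exposition, not the authors' BAT-optimized architecture or trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(x, kernels):
    """Valid 1-D convolution over time: x is (T, C_in), kernels is (K, C_in, C_out)."""
    K = kernels.shape[0]
    T = x.shape[0]
    out = np.stack([np.einsum('kc,kco->o', x[t:t + K], kernels)
                    for t in range(T - K + 1)])
    return np.maximum(out, 0.0)  # ReLU activation

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_final_state(x, Wz, Uz, Wr, Ur, Wh, Uh):
    """Run a standard GRU over sequence x (T, D); return the final hidden state (H,)."""
    h = np.zeros(Uz.shape[0])
    for x_t in x:
        z = sigmoid(Wz @ x_t + Uz @ h)              # update gate
        r = sigmoid(Wr @ x_t + Ur @ h)              # reset gate
        h_cand = np.tanh(Wh @ x_t + Uh @ (r * h))   # candidate state
        h = (1.0 - z) * h + z * h_cand
    return h

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical sizes: 128 time steps of 6-channel wearable IMU data,
# 8 conv filters of width 5, 16 GRU units, 5 activity classes.
T, C_in, K, C_out, H, n_classes = 128, 6, 5, 8, 16, 5
w = lambda *shape: 0.1 * rng.standard_normal(shape)  # toy weight initializer

x = rng.standard_normal((T, C_in))                   # one gait sequence
feat = conv1d_relu(x, w(K, C_in, C_out))             # spatial features
h = gru_final_state(feat,
                    w(H, C_out), w(H, H),            # update-gate weights
                    w(H, C_out), w(H, H),            # reset-gate weights
                    w(H, C_out), w(H, H))            # candidate weights
probs = softmax(w(n_classes, H) @ h)                 # class probabilities
```

The conv-then-GRU ordering mirrors the abstract's division of labor: the convolution summarizes local sensor patterns at each time step, and the GRU aggregates them across the stride cycle before classification.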

PMID:35602639 | PMC:PMC9122681 | DOI:10.1155/2022/5422428
