ST-DeepHAR: Deep Learning Model for Human Activity Recognition in IoHT Applications

Mohamed Abdel-Basset, Hossam Hawash, Ripon K. Chakrabortty, Michael Ryan, Mohamed Elhoseny, Houbing Song

Research output: Contribution to journal › Article › peer-review

85 Citations (Scopus)

Abstract

Human activity recognition (HAR) has been regarded as an indispensable part of many smart home systems and smart healthcare applications. Specifically, HAR is of great importance in the Internet of Healthcare Things (IoHT), owing to the rapid proliferation of Internet of Things (IoT) technologies embedded in various smart appliances and wearable devices (such as smartphones and smartwatches) that have a pervasive impact on an individual's life. The inertial sensors of smartphones generate massive amounts of multidimensional time-series data, which can be exploited effectively for HAR purposes. Unlike traditional approaches, deep learning techniques are the most suitable choice for such multivariate streams. In this study, we introduce a supervised dual-channel model that comprises a long short-term memory (LSTM) network, followed by an attention mechanism for the temporal fusion of inertial sensor data, concurrent with a convolutional residual network for the spatial fusion of sensor data. We also introduce an adaptive channel-squeezing operation to fine-tune the convolutional neural network's feature-extraction capability by exploiting multichannel dependency. Finally, two widely available public HAR data sets are used in experiments to evaluate the performance of our model. The results demonstrate that our proposed approach can outperform state-of-the-art methods.
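The abstract describes a dual-channel design: an LSTM-plus-attention branch for temporal fusion and a convolutional residual branch with an adaptive channel-squeezing operation for spatial fusion. The following is a minimal, hypothetical PyTorch sketch of such a dual-branch model; all layer sizes, module names, and the squeeze-and-excitation-style gating used for channel squeezing are illustrative assumptions, not the authors' published configuration.

```python
# Hypothetical sketch of a dual-channel HAR model as described in the abstract.
# All hyperparameters and module names are assumptions for illustration only.
import torch
import torch.nn as nn


class ChannelSqueeze(nn.Module):
    """Squeeze-and-excitation-style gating over convolutional channels (assumed form)."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):              # x: (batch, channels, time)
        w = self.fc(x.mean(dim=-1))    # global average pool over time
        return x * w.unsqueeze(-1)     # re-weight each channel


class ResidualConvBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm1d(channels),
            nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm1d(channels),
        )
        self.squeeze = ChannelSqueeze(channels)

    def forward(self, x):
        return torch.relu(x + self.squeeze(self.body(x)))


class DualChannelHAR(nn.Module):
    def __init__(self, n_sensors: int = 9, n_classes: int = 6, hidden: int = 64):
        super().__init__()
        # Temporal branch: LSTM followed by additive attention over time steps.
        self.lstm = nn.LSTM(n_sensors, hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)
        # Spatial branch: convolutional residual blocks with channel squeezing.
        self.proj = nn.Conv1d(n_sensors, hidden, kernel_size=1)
        self.res = nn.Sequential(ResidualConvBlock(hidden), ResidualConvBlock(hidden))
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                       # x: (batch, time, sensors)
        h, _ = self.lstm(x)                     # (batch, time, hidden)
        a = torch.softmax(self.attn(h), dim=1)  # attention weights over time
        temporal = (a * h).sum(dim=1)           # weighted temporal summary
        spatial = self.res(self.proj(x.transpose(1, 2))).mean(dim=-1)
        return self.head(torch.cat([temporal, spatial], dim=-1))


if __name__ == "__main__":
    model = DualChannelHAR()
    windows = torch.randn(8, 128, 9)  # 8 windows, 128 time steps, 9 inertial channels
    print(model(windows).shape)       # torch.Size([8, 6])
```

In this sketch the two branches are fused by simple concatenation before the classifier; the paper's actual fusion strategy, attention formulation, and channel-squeezing design should be taken from the article itself.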

Original language: English
Article number: 9238036
Pages (from-to): 4969-4979
Number of pages: 11
Journal: IEEE Internet of Things Journal
Volume: 8
Issue number: 6
DOIs
Publication status: Published - 15 Mar 2021
Externally published: Yes
