Abstract
This paper compares the three levels of data fusion with the goal of determining the optimal level for multi-sensor human activity data. Using the same data-processing pipeline, gyroscope and accelerometer data were fused at the sensor level, feature level, and decision level. For each level of fusion, four different techniques were applied with varying degrees of success. The analysis was performed on four publicly available human-activity datasets, with four well-known machine learning classifiers used to validate the results. Decision-level fusion (Acc = 0.7443 ± 0.0850) outperformed sensor-level (Acc = 0.5934 ± 0.1110) and feature-level (Acc = 0.6742 ± 0.0053) fusion in terms of accuracy, but the processing time and computational power required for training and classification were far greater than is practical for a human activity recognition (HAR) system. The Kalman filter, however, appears to be the most efficient method, exhibiting both good accuracy (Acc = 0.7536 ± 0.1566) and a short processing time (61.71 ± 63.85 ms), properties that play a large role in real-time applications on wearable devices. The results of this study also serve as a baseline in the HAR literature against which future data fusion methods can be compared.
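As a point of reference for the sensor-level fusion discussed above, the Kalman filter's role can be sketched with a minimal 1-D example that fuses a gyroscope angular rate with an accelerometer-derived tilt angle. This is an illustrative sketch only; the function name, noise parameters (`q`, `r`), and sampling interval are hypothetical and not taken from the paper.

```python
def kalman_pitch(gyro_rates, accel_angles, dt=0.01, q=0.001, r=0.03):
    """1-D Kalman filter: fuse gyro rate (prediction) with accel angle (measurement).

    gyro_rates   -- angular rates in rad/s, one per sample
    accel_angles -- tilt angles in rad derived from the accelerometer
    dt           -- sampling interval in seconds (hypothetical value)
    q, r         -- process and measurement noise variances (hypothetical values)
    """
    angle = accel_angles[0]  # state estimate, initialised from the accelerometer
    p = 1.0                  # estimate variance
    estimates = []
    for rate, z in zip(gyro_rates, accel_angles):
        # Predict: integrate the gyro rate; uncertainty grows by q
        angle += rate * dt
        p += q
        # Update: correct with the accelerometer angle measurement
        k = p / (p + r)            # Kalman gain
        angle += k * (z - angle)
        p *= (1.0 - k)
        estimates.append(angle)
    return estimates
```

With a stationary sensor (zero gyro rate, constant accelerometer angle) the estimate simply holds the measured angle; in motion, the gyro term tracks fast changes while the accelerometer correction bounds long-term drift.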
| Original language | English |
|---|---|
| Pages (from-to) | 16979-16989 |
| Number of pages | 11 |
| Journal | IEEE Sensors Journal |
| Volume | 21 |
| Issue number | 15 |
| DOIs | |
| Publication status | Published - 1 Aug 2021 |
Title: Human Activity Recognition with Accelerometer and Gyroscope: a Data Fusion Approach