A Novel Method for Cross-Subject Human Activity Recognition with Wearable Sensors

DOI: 10.4236/jst.2024.142002, PP. 17-34

Keywords: Human Activity Recognition, Cross-Subject Adaptation, Semi-Supervised Learning, Wearable Sensors


Abstract:

Human Activity Recognition (HAR) is an important means for lower limb exoskeleton robots to achieve human-computer collaboration with their users. Most existing methods in this field focus on the simple scenario of recognizing activities for specific users; they do not account for individual differences among users and cannot adapt to new users. To improve the generalization ability of HAR models, this paper proposes a novel method that combines transfer learning and active learning to mitigate the cross-subject issue, enabling lower limb exoskeleton robots to be used in more complex scenarios. First, a neural network based on convolutional neural networks (CNNs) is designed to extract temporal and spatial features from sensor signals collected from different parts of the human body; once trained on labeled data, it recognizes human activities with high accuracy. Second, to improve the cross-subject adaptation ability of the pre-trained model, we design a cross-subject HAR algorithm based on sparse querying and label propagation. In leave-one-subject-out validation on two widely used public datasets, compared with existing methods, our method achieves average accuracies of 91.77% on DSAD and 80.97% on PAMAP2. The experimental results demonstrate the potential of implementing cross-subject HAR for lower limb exoskeleton robots.
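The abstract does not include code. As a rough illustration of the kind of network it describes, the sketch below (in PyTorch) applies 1-D convolutions over windowed multi-channel inertial signals. The class name `HARNet` and all layer sizes are assumptions for illustration, not the authors' architecture; the defaults loosely follow DSAD-style data (multiple body-worn units, 25 Hz, 5-second windows), but any (batch, channels, time) input works.

```python
import torch
import torch.nn as nn

class HARNet(nn.Module):
    """Hypothetical 1-D CNN for windowed wearable-sensor signals.

    Input shape: (batch, channels, time). The defaults (45 channels,
    19 classes) are illustrative and not taken from the paper.
    """
    def __init__(self, in_channels=45, num_classes=19):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolutions extract features along each window's time axis.
            nn.Conv1d(in_channels, 64, kernel_size=5, padding=2),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.BatchNorm1d(128),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x):
        z = self.features(x).squeeze(-1)  # (batch, 128) embedding
        return self.classifier(z)

# Smoke test: 8 windows, 45 channels, 125 samples (5 s at 25 Hz).
logits = HARNet()(torch.randn(8, 45, 125))  # -> shape (8, 19)
```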
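For the adaptation stage, the abstract names sparse querying plus label propagation. The sketch below is one generic reading of that combination, assuming the queries are uncertainty-based active-learning requests; the function name, the least-confidence criterion, and the k-NN graph are assumptions, and the authors' actual query criterion and graph construction may differ. scikit-learn's `LabelPropagation` is used for the propagation step.

```python
import numpy as np
from sklearn.semi_supervised import LabelPropagation

def query_and_propagate(Z, proba, oracle, budget=20, k=7):
    """Hypothetical query-then-propagate adaptation for a new subject.

    Z      : (n, d) features of the new subject's windows, e.g. from
             the pre-trained CNN's penultimate layer
    proba  : (n, c) class probabilities from the pre-trained classifier
    oracle : callable(indices) -> true labels for the queried windows
    """
    # 1) Sparse querying: request labels only for the windows the
    #    pre-trained model is least confident about.
    uncertainty = 1.0 - proba.max(axis=1)
    queried = np.argsort(uncertainty)[-budget:]
    y = np.full(len(Z), -1)            # -1 marks unlabeled (sklearn convention)
    y[queried] = oracle(queried)

    # 2) Label propagation: spread the few queried labels to all
    #    remaining windows over a k-NN similarity graph.
    lp = LabelPropagation(kernel="knn", n_neighbors=k)
    lp.fit(Z, y)
    return lp.transduction_            # pseudo-labels for every window
```

The pseudo-labels returned this way could then be used to fine-tune the pre-trained model on the new subject's data.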
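Finally, the reported averages come from leave-one-subject-out (LOSO) validation: each subject in turn is held out entirely, the model is trained (and adapted) on the remaining subjects, and the per-subject accuracies are averaged. A minimal protocol sketch, assuming a user-supplied `fit_and_score` callback that wraps whatever pipeline is under test:

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut

def loso_accuracies(X, y, subjects, fit_and_score):
    """Leave-one-subject-out evaluation (hypothetical helper).

    subjects      : (n,) subject ID per window, used as the group key
    fit_and_score : callable(train_idx, test_idx) -> accuracy
    """
    logo = LeaveOneGroupOut()
    scores = [fit_and_score(train_idx, test_idx)
              for train_idx, test_idx in logo.split(X, y, groups=subjects)]
    return float(np.mean(scores)), scores
```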

