Optimizing Sensor Position with Virtual Sensors in Human Activity Recognition System Design
Abstract
1. Introduction
- (1) The proposal of an optimizer to obtain the best sensor position during the HAR design process.
- (2) The introduction of a generic virtual humanoid avatar-enabled simulated sensor design.
- (3) An evaluation of the proposed optimization scheme with a real wearable case.
- (4) A three-case study of the virtual avatar combined with optimization of the activity recognition system design.
2. Related Work
2.1. HAR with Related Sensor Positioning
2.2. Synthetic Sensor Data Based on 3D Virtual Motion
2.3. Optimal Sensor Placement
3. Sensor Position Optimization in HAR Systems
3.1. Problem Description
3.2. Improved Discrete Cuckoo Searching (ID-CS) Algorithm
4. Virtual Sensor Design Based on Humanoid Avatar
4.1. Virtual Activity and Environment
4.2. Virtual Sensor Data Generation
4.3. Virtual Data Augmentation
- Permutation: Perturb the temporal location of the input data. Within a single data window, the data are divided into several segments, which are then randomly permuted.
- Time warping: Distort temporal locations. Similar to the above, the segments divided from the initial window are re-arranged at different time locations.
- Magnitude warping: Warp the signal's magnitude. The magnitude of the data is scaled by a smooth curve varying around one.
- Change the size of the model: Multiply by factors of 1.2 and 0.8 to alter the body type and simulate different body shapes.
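The first three augmentations above operate directly on the sensor time series and can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation; function names, segment counts, and noise parameters are assumptions chosen for clarity (the fourth augmentation, rescaling the avatar model, happens before data generation and is not shown).

```python
import numpy as np

rng = np.random.default_rng(0)

def permute(window, n_segments=4):
    # Permutation: split the window into segments and shuffle their order.
    segs = np.array_split(window, n_segments)
    order = rng.permutation(n_segments)
    return np.concatenate([segs[i] for i in order])

def time_warp(window, n_segments=4, low=0.7, high=1.3):
    # Time warping: stretch or compress each segment along the time
    # axis by a random factor, then resample to the original length.
    segs = np.array_split(window, n_segments)
    warped = []
    for s in segs:
        new_len = max(1, int(len(s) * rng.uniform(low, high)))
        idx = np.linspace(0, len(s) - 1, new_len)
        warped.append(np.interp(idx, np.arange(len(s)), s))
    out = np.concatenate(warped)
    idx = np.linspace(0, len(out) - 1, len(window))
    return np.interp(idx, np.arange(len(out)), out)

def magnitude_warp(window, n_knots=4, sigma=0.2):
    # Magnitude warping: multiply the signal by a smooth random
    # curve that varies around 1.
    knots = rng.normal(1.0, sigma, n_knots + 2)
    x = np.linspace(0, len(window) - 1, n_knots + 2)
    curve = np.interp(np.arange(len(window)), x, knots)
    return window * curve
```

Each function maps a 1-D window to an augmented window of the same length, so augmented samples can be mixed directly into the training set.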
5. Experiment and Results
5.1. Introduction to Evaluation Process
- (a) Evaluation of the proposed optimization method.
- (b) A case study showing how the sensor optimization method helps develop several types of HAR systems.
5.2. Study 1: Implementing the ID-CS on Real Wearable Acceleration Sensor Data
- (1) Objective: find the optimal sensor combinations on the body;
- (2) Sensor number: 3;
- (3) Potential positions: 17 parts of the human body: head, chest, waist, right upper arm, right forearm, right hand, left upper arm, left forearm, left hand, right upper leg, right lower leg, right foot, left upper leg, left lower leg, left foot, left shoulder, and right shoulder;
- (4) Recognized activities: standing/walking/running/sit-to-stand/stand-to-sit/squat-to-stand/stand-to-squat/upstairs/downstairs;
- (5) Data: acceleration data from 10 people (five females and five males; average age: 24). Subjects were asked to perform the above activities for 90 s while wearing the Xsens device (60 Hz).
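The search space in Study 1 is every 3-sensor subset of the 17 candidate positions, i.e. C(17, 3) = 680 combinations. The exhaustive baseline that ID-CS is compared against can be sketched as below; the scoring function is a placeholder for the cross-validated recognition accuracy the paper computes, and the function name is illustrative, not from the paper.

```python
from itertools import combinations

# The 17 candidate on-body positions from Study 1.
POSITIONS = [
    "head", "chest", "waist",
    "right upper arm", "right forearm", "right hand",
    "left upper arm", "left forearm", "left hand",
    "right upper leg", "right lower leg", "right foot",
    "left upper leg", "left lower leg", "left foot",
    "left shoulder", "right shoulder",
]

def best_combination(score, k=3):
    """Exhaustive baseline: evaluate every k-subset of positions with a
    caller-supplied scoring function (e.g. cross-validated recognition
    accuracy of a classifier trained on those sensors) and keep the best."""
    best, best_score = None, float("-inf")
    for combo in combinations(POSITIONS, k):
        s = score(combo)
        if s > best_score:
            best, best_score = combo, s
    return best, best_score
```

Scoring all 680 subsets is feasible here, but the cost grows combinatorially with more positions or sensors, which is why a metaheuristic such as ID-CS that needs far fewer fitness evaluations is attractive.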
5.3. Study 2: Implementing the ID-CS on Virtual Sensor Data in HAR Design
- (a) Case 1: Wearable accelerometer HAR system
- (b) Case 2: Wearable distance sensor HAR system
- (c) Case 3: Non-wearable distance sensor HAR system
6. Discussion
6.1. Optimization of the Position Space
6.2. Sensor Numbers for HAR
6.3. Algorithm Performance and Comparison
6.4. Virtual Sensor Based HAR Design
7. Limitations and Future Work
7.1. HAR System Trained by Virtual Sensor Data
7.2. Expanding on Virtual Sensor Modality
7.3. Multi-Objective Optimization within Wider Space
8. Conclusions
Author Contributions
Funding
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
HAR | Human activity recognition |
DCS | Discrete cuckoo searching |
DPSO | Discrete particle swarm optimization |
DFA | Discrete firefly algorithm |
ID-CS | Improved discrete cuckoo searching |
References
- Laput, G.; Harrison, C. Sensing fine-grained hand activity with smartwatches. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–13. [Google Scholar]
- Wu, J.; Harrison, C.; Bigham, J.P.; Laput, G. Automated Class Discovery and One-Shot Interactions for Acoustic Activity Recognition. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–14. [Google Scholar]
- Zhao, Y.; Wu, S.; Reynolds, L.; Azenkot, S. A face recognition application for people with visual impairments: Understanding use beyond the lab. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–14. [Google Scholar]
- Wu, C.J.; Houben, S.; Marquardt, N. Eaglesense: Tracking people and devices in interactive spaces using real-time top-view depth-sensing. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 3929–3942. [Google Scholar]
- Yuan, Y.; Kitani, K. 3d ego-pose estimation via imitation learning. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 735–750. [Google Scholar]
- Guan, S.; Wen, S.; Yang, D.; Ni, B.; Zhang, W.; Tang, J.; Yang, X. Human Action Transfer Based on 3D Model Reconstruction. In Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; 33, pp. 8352–8359. [Google Scholar]
- Gemperle, F.; Kasabach, C.; Stivoric, J.; Bauer, M.; Martin, R. Design for wearability. In Proceedings of the Digest of Papers, Second International Symposium on Wearable Computers (Cat. No. 98EX215), Pittsburgh, PA, USA, 19–20 October 1998; pp. 116–122. [Google Scholar]
- Kwon, H.; Tong, C.; Haresamudram, H.; Gao, Y.; Abowd, G.D.; Lane, N.D.; Ploetz, T. IMUTube: Automatic extraction of virtual on-body accelerometry from video for human activity recognition. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2020, 4, 1–29. [Google Scholar] [CrossRef]
- Young, A.D.; Ling, M.J.; Arvind, D.K. IMUSim: A simulation environment for inertial sensing algorithm design and evaluation. In Proceedings of the 10th ACM/IEEE International Conference on Information Processing in Sensor Networks, Chicago, IL, USA, 12–14 April 2011; pp. 199–210. [Google Scholar]
- Kang, C.; Jung, H.; Lee, Y. Towards Machine Learning with Zero Real-World Data. In Proceedings of the 5th ACM Workshop on Wearable Systems and Applications, Seoul, Korea, 21 June 2019; pp. 41–46. [Google Scholar]
- Takeda, S.; Okita, T.; Lago, P.; Inoue, S. A multi-sensor setting activity recognition simulation tool. In Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers, Singapore, 8–12 October 2018; pp. 1444–1448. [Google Scholar]
- Zhang, S.; Alshurafa, N. Deep generative cross-modal on-body accelerometer data synthesis from videos. In Proceedings of the Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2020 ACM International Symposium on Wearable Computers, Virtual Event Mexico, 12–16 September 2020; pp. 223–227. [Google Scholar]
- Xia, C.; Sugiura, Y. From Virtual to Real World: Applying Animation to Design the Activity Recognition System. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems; ACM: New York, NY, USA, 2021; pp. 1–6. [Google Scholar]
- Xia, C.; Sugiura, Y. Wearable Accelerometer Optimal Positions for Human Motion Recognition. In Proceedings of the 2020 IEEE 2nd Global Conference on Life Sciences and Technologies (LifeTech), Kyoto, Japan, 10–12 March 2020; pp. 19–20. [Google Scholar]
- Kunze, K.; Lukowicz, P. Sensor placement variations in wearable activity recognition. IEEE Pervasive Comput. 2014, 13, 32–41. [Google Scholar] [CrossRef]
- Cleland, I.; Kikhia, B.; Nugent, C.; Boytsov, A.; Hallberg, J.; Synnes, K.; McClean, S.; Finlay, D. Optimal placement of accelerometers for the detection of everyday activities. Sensors 2013, 13, 9183–9200. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Jalal, A.; Kim, Y.H.; Kim, Y.J.; Kamal, S.; Kim, D. Robust human activity recognition from depth video using spatiotemporal multi-fused features. Pattern Recognit. 2017, 61, 295–308. [Google Scholar] [CrossRef]
- Subetha, T.; Chitrakala, S. A survey on human activity recognition from videos. In Proceedings of the 2016 International Conference on Information Communication and Embedded Systems (ICICES), Chennai, India, 25–26 February 2016; pp. 1–7. [Google Scholar]
- Cao, Z.; Simon, T.; Wei, S.E.; Sheikh, Y. Realtime multi-person 2d pose estimation using part affinity fields. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 7291–7299. [Google Scholar]
- Cho, S.G.; Yoshikawa, M.; Baba, K.; Ogawa, K.; Takamatsu, J.; Ogasawara, T. Hand motion recognition based on forearm deformation measured with a distance sensor array. In Proceedings of the 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; pp. 4955–4958. [Google Scholar]
- Ohnishi, A.; Terada, T.; Tsukamoto, M. A Motion Recognition Method Using Foot Pressure Sensors. In Proceedings of the 9th Augmented Human International Conference, Seoul, Korea, 7–9 February 2018; pp. 1–8. [Google Scholar]
- Luo, X.; Guan, Q.; Tan, H.; Gao, L.; Wang, Z.; Luo, X. Simultaneous indoor tracking and activity recognition using pyroelectric infrared sensors. Sensors 2017, 17, 1738. [Google Scholar] [CrossRef] [PubMed]
- Woodstock, T.K.; Radke, R.J.; Sanderson, A.C. Sensor fusion for occupancy detection and activity recognition using time-of-flight sensors. In Proceedings of the 2016 19th International Conference on Information Fusion (FUSION), Heidelberg, Germany, 5–8 July 2016; pp. 1695–1701. [Google Scholar]
- Zhang, X.; Yao, L.; Zhang, D.; Wang, X.; Sheng, Q.Z.; Gu, T. Multi-person brain activity recognition via comprehensive EEG signal analysis. In Proceedings of the 14th EAI International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services, Melbourne, Australia, 7–10 November 2017; pp. 28–37. [Google Scholar]
- Venkatnarayan, R.H.; Shahzad, M. Gesture recognition using ambient light. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2018, 2, 1–28. [Google Scholar] [CrossRef]
- Tan, S.; Zhang, L.; Wang, Z.; Yang, J. MultiTrack: Multi-user tracking and activity recognition using commodity WiFi. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–12. [Google Scholar]
- Smith, M.; Moore, T.; Hill, C.; Noakes, C.; Hide, C. Simulation of GNSS/IMU measurements. In Proceedings of the ISPRS International Workshop, Working Group I/5: Theory, Technology and Realities of Inertial/GPS Sensor Orientation, Castelldefels, Spain, 22–23 September 2003; pp. 22–23. [Google Scholar]
- Pares, M.; Rosales, J.; Colomina, I. Yet another IMU simulator: Validation and applications. Proc. Eurocow Castelldefels Spain 2008, 30, 1–9. [Google Scholar]
- Derungs, A.; Amft, O. Estimating wearable motion sensor performance from personal biomechanical models and sensor data synthesis. Sci. Rep. 2020, 10, 11450. [Google Scholar] [CrossRef] [PubMed]
- Derungs, A.; Amft, O. Synthesising motion sensor data from biomechanical simulations to investigate motion sensor placement and orientation variations. In Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 6391–6394. [Google Scholar]
- Zampella, F.J.; Jiménez, A.R.; Seco, F.; Prieto, J.C.; Guevara, J.I. Simulation of foot-mounted IMU signals for the evaluation of PDR algorithms. In Proceedings of the 2011 International Conference on Indoor Positioning and Indoor Navigation, Guimaraes, Portugal, 21–23 September 2011; pp. 1–7. [Google Scholar]
- Ascher, C.; Kessler, C.; Maier, A.; Crocoll, P.; Trommer, G. New pedestrian trajectory simulator to study innovative yaw angle constraints. In Proceedings of the 23rd International Technical Meeting of The Satellite Division of the Institute of Navigation (ION GNSS 2010), Portland, OR, USA, 21 September 2010; pp. 504–510. [Google Scholar]
- Kwon, B.; Kim, J.; Lee, K.; Lee, Y.K.; Park, S.; Lee, S. Implementation of a virtual training simulator based on 360° multi-view human action recognition. IEEE Access 2017, 5, 12496–12511. [Google Scholar] [CrossRef]
- Liu, Y.; Zhang, S.; Gowda, M. When Video meets Inertial Sensors: Zero-shot Domain Adaptation for Finger Motion Analytics with Inertial Sensors. In Proceedings of the International Conference on Internet-of-Things Design and Implementation, Charlottesvle, VA, USA, 18–21 May 2021; pp. 182–194. [Google Scholar]
- Fortes Rey, V.; Garewal, K.K.; Lukowicz, P. Translating Videos into Synthetic Training Data for Wearable Sensor-Based Activity Recognition Systems Using Residual Deep Convolutional Networks. Appl. Sci. 2021, 11, 3094. [Google Scholar] [CrossRef]
- Alharbi, F.; Ouarbya, L.; Ward, J.A. Synthetic Sensor Data for Human Activity Recognition. In Proceedings of the 2020 International Joint Conference on Neural Networks (IJCNN), Glasgow, UK, 19–24 July 2020; pp. 1–9. [Google Scholar]
- Rey, V.F.; Hevesi, P.; Kovalenko, O.; Lukowicz, P. Let there be IMU data: Generating training data for wearable, motion sensor based activity recognition from monocular rgb videos. In Proceedings of the Adjunct Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers, London, UK, 9–13 September 2019; pp. 699–708. [Google Scholar]
- Kim, S.; Lee, B.; Van Gemert, T.; Oulasvirta, A. Optimal Sensor Position for a Computer Mouse. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–13. [Google Scholar]
- Kunze, K.; Lukowicz, P. Dealing with sensor displacement in motion-based onbody activity recognition systems. In Proceedings of the 10th International Conference on Ubiquitous Computing, Seoul, Korea, 21–24 September 2008; pp. 20–29. [Google Scholar]
- Mallardo, V.; Aliabadi, M.; Khodaei, Z.S. Optimal sensor positioning for impact localization in smart composite panels. J. Intell. Mater. Syst. Struct. 2013, 24, 559–573. [Google Scholar] [CrossRef]
- Olguın, D.O.; Pentland, A.S. Human activity recognition: Accuracy across common locations for wearable sensors. In Proceedings of the 2006 10th IEEE international symposium on wearable computers, Montreux, Switzerland, 11–14 October 2006; pp. 11–14. [Google Scholar]
- Gjoreski, H.; Gams, M. Activity/Posture recognition using wearable sensors placed on different body locations. Proc. (738) Signal Image Process. Appl. Crete Greece 2011, 2224, 716724. [Google Scholar]
- Flynn, E.B.; Todd, M.D. A Bayesian approach to optimal sensor placement for structural health monitoring with application to active sensing. Mech. Syst. Signal Process. 2010, 24, 891–903. [Google Scholar] [CrossRef]
- Zhang, X.; Li, J.; Xing, J.; Wang, P.; Yang, Q.; Wang, R.; He, C. Optimal sensor placement for latticed shell structure based on an improved particle swarm optimization algorithm. Math. Probl. Eng. 2014, 2014, 743904. [Google Scholar] [CrossRef] [Green Version]
- Yang, X.S. Nature-Inspired Metaheuristic Algorithms; Luniver Press: Bristol, UK, 2010. [Google Scholar]
- Rodrigues, D.; Pereira, L.A.; Almeida, T.; Papa, J.P.; Souza, A.; Ramos, C.C.; Yang, X.S. BCS: A binary cuckoo search algorithm for feature selection. In Proceedings of the 2013 IEEE International Symposium on Circuits and Systems (ISCAS), Beijing, China, 19–23 May 2013; pp. 465–468. [Google Scholar]
- Zhou, X.; Liu, Y.; Li, B.; Li, H. A multiobjective discrete cuckoo search algorithm for community detection in dynamic networks. Soft Comput. 2017, 21, 6641–6652. [Google Scholar] [CrossRef]
- Ouaarab, A.; Ahiod, B.; Yang, X.S. Discrete cuckoo search algorithm for the travelling salesman problem. Neural Comput. Appl. 2014, 24, 1659–1669. [Google Scholar] [CrossRef]
- Um, T.T.; Pfister, F.M.; Pichler, D.; Endo, S.; Lang, M.; Hirche, S.; Fietzek, U.; Kulić, D. Data augmentation of wearable sensor data for parkinson’s disease monitoring using convolutional neural networks. In Proceedings of the 19th ACM International Conference on Multimodal Interaction, Glasgow, UK, 13–17 November 2017; pp. 216–220. [Google Scholar]
- Ronao, C.A.; Cho, S.B. Human activity recognition with smartphone sensors using deep learning neural networks. Expert Syst. Appl. 2016, 59, 235–244. [Google Scholar] [CrossRef]
- Yang, X.S. Firefly Algorithm, Levy Flights and Global Optimization; Springer: London, UK, 2010. [Google Scholar]
- Shi, Y. Particle swarm optimization: Developments, applications and resources. In Proceedings of the 2001 Congress on Evolutionary Computation (IEEE Cat. No. 01TH8546), Seoul, Korea, 27–30 May 2001; 1, pp. 81–86. [Google Scholar]
- Rucco, R.; Sorriso, A.; Liparoti, M.; Ferraioli, G.; Sorrentino, P.; Ambrosanio, M.; Baselice, F. Type and location of wearable sensors for monitoring falls during static and dynamic tasks in healthy elderly: A review. Sensors 2018, 18, 1613. [Google Scholar] [CrossRef] [PubMed] [Green Version]
Method | Best Solution (Dataset 1, 60 Hz) | Result | Best Solution (Dataset 2, 20 Hz) | Result
---|---|---|---|---
DPSO | Left shoulder, Left forearm, Right lower leg | 90.65% | Left shoulder, Left forearm, Right upper leg | 96.26%
DFA | Right upper arm, Left shoulder, Right upper leg | 90.81% | Chest, Left forearm, Right upper leg | 96.51%
DCS | Chest, Left upper arm, Left lower leg | 90.81% | Waist, Left shoulder, Left forearm | 96.25%
ID-CS | Chest, Right upper arm, Right upper leg | 91.67% | Chest, Right shoulder, Left forearm | 97.07%
Xia et al. [14] | Waist, Right shoulder, Right upper arm | 90.89% | Chest, Left forearm, Right upper leg | 96.51%
Random selection | Chest, Left upper arm, Left lower leg | 90.81% | Right shoulder, Left shoulder, Left hand | 96.56%
Empirical | Waist, Right upper leg, Right foot | 86.09% | Waist, Right upper leg, Right foot | 87.32%
Dynamic programming | Chest, Right upper arm, Right upper leg | 91.67% | Waist, Right hand, Left hand | 97.26%
Case | Recognized Activity | Method | Position | Computation Cost | Result
---|---|---|---|---|---
Case 1 | Standing/Walking/Running/Upstairs/Downstairs | ID-CS | Chest, Right shoulder, Head | 175 | 97.38%
 | | Dynamic programming | Chest, Right lower leg, Left lower leg | 455 | 97.62%
 | | Random selection | Left shoulder, Right upper arm, Right lower leg | 175 | 96.09%
 | | Empirical | Waist, Right upper leg, Head | 1 | 95.12%
Case 2 | Heel up and down/Squat/Hip stretch | ID-CS | Left lower leg-left and Left upper leg-back | 40 | 100%
 | | Dynamic programming | Left lower leg-left and Left upper leg-back | 60 | 100%
 | | Random selection | Right upper leg-back and Right upper leg-front | 40 | 100%
 | | Empirical | Left lower leg-left and Left upper leg-right | 1 | 98.39%
Number of Sub-Boards | Method | Position (Location No.) | Computational Cost | Result
---|---|---|---|---
1 | Dynamic programming | [0, 1, 2, 3, 4] | 24 | 90.86%
2 | ID-CS | [0, 1, 2, 3, 4] & [5, 6, 7, 8, 9] | 130 | 92.07%
 | Random selection | [10, 11, 12, 13, 14] & [6, 16, 26, 36, 46, 56, 66] | 130 | 85.32%
 | Dynamic programming | [0, 1, 2, 3, 4] & [15, 16, 17, 18, 19] | 276 | 92.49%
3 | ID-CS | [0, 1, 2, 3, 4] & [5, 6, 7, 8, 9] & [3, 13, 23, 33, 43, 53, 63] | 400 | 95.23%
 | Random selection | [6, 16, 26, 36, 46, 56, 66] & [0, 1, 2, 3, 4] & [3, 13, 23, 33, 43, 53, 63] | 400 | 86.29%
 | Dynamic programming | [0, 1, 2, 3, 4] & [15, 16, 17, 18, 19] & [3, 13, 23, 33, 43, 53, 63] | 2024 | 95.33%
Xia, C.; Sugiura, Y. Optimizing Sensor Position with Virtual Sensors in Human Activity Recognition System Design. Sensors 2021, 21, 6893. https://doi.org/10.3390/s21206893