Abstract
In recent years, various methods for the automated recognition of human activities based on the analysis of sensor data with machine learning approaches have been researched. The continuously increasing hardware performance of mobile devices such as smartphones and wearables, together with the ongoing development of existing algorithms, has resulted in steadily higher recognition rates. These advances have led to a growing demand for intelligent and context-sensitive mobile applications. However, the generation of valid ground truth information of suitable quality remains a major challenge. In addition, there is currently no standardized procedure for generating an activity classifier for use in custom application areas. The holistic workflow introduced in this paper focuses on the recognition of activities using mobile devices. For this purpose, the generation of ground truth information by recording and annotating sensor data is described. The generated data set is used for transfer learning in a machine learning framework, and the resulting model is the basis for a mobile real-time classification application. The data source is a current study of the University of Applied Sciences Mittweida.
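The ground-truth step described above — turning an annotated sensor stream into labelled training examples — is commonly implemented by splitting the recording into fixed-size, overlapping windows and assigning each window the majority label of its timesteps. The following is a minimal illustrative sketch of that idea only; it does not reproduce the paper's actual recording, annotation, or transfer-learning pipeline, and all names, window sizes, and sample values are hypothetical.

```python
"""Sliding-window segmentation with majority-vote labels (illustrative sketch)."""
from collections import Counter


def segment_windows(samples, labels, window_size, step):
    """Split a labelled sensor stream into fixed-size windows.

    samples: per-timestep sensor readings (e.g. accelerometer triples)
    labels:  per-timestep activity annotations (the ground truth)
    Each window is labelled by majority vote over its timesteps.
    """
    windows = []
    for start in range(0, len(samples) - window_size + 1, step):
        end = start + window_size
        majority = Counter(labels[start:end]).most_common(1)[0][0]
        windows.append((samples[start:end], majority))
    return windows


# Hypothetical recording: 6 timesteps, two activities,
# windows of 4 timesteps with 50% overlap (step=2).
readings = [(0.1, 0.0, 9.8), (0.2, 0.1, 9.7), (0.1, 0.1, 9.8),
            (1.7, 1.0, 8.0), (1.6, 1.1, 7.9), (1.8, 0.8, 8.2)]
annotations = ["standing", "standing", "standing",
               "walking", "walking", "walking"]

wins = segment_windows(readings, annotations, window_size=4, step=2)
print([label for _, label in wins])  # -> ['standing', 'walking']
```

The resulting (window, label) pairs can then serve as input to feature extraction or directly to a classifier, depending on the framework used.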
Acknowledgements
This work was written in the junior research group “Agile Publika” funded by the European Social Fund (ESF) and the Free State of Saxony.
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Rolletschke, T., Roschke, C., Thomanek, R., Platte, B., Manthey, R., Zimmer, F. (2019). Generation of Individual Activity Classifiers for the Use in Mobile Context-Aware Applications. In: Stephanidis, C. (eds) HCI International 2019 - Posters. HCII 2019. Communications in Computer and Information Science, vol 1033. Springer, Cham. https://doi.org/10.1007/978-3-030-23528-4_42
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-23527-7
Online ISBN: 978-3-030-23528-4
eBook Packages: Computer Science, Computer Science (R0)