Exploring Human Activities Using eSense Earable Device

Activity and Behavior Computing

Part of the book series: Smart Innovation, Systems and Technologies ((SIST,volume 204))

Abstract

Detecting head- and mouth-related activities of elderly people is very important for nursing care centers, which need to track activities such as swallowing and eating to assess the health status of elderly residents. In this regard, earable devices open up interesting possibilities for monitoring personal-scale behavioral activities. Here, we introduce activity recognition based on an earable device called 'eSense', which carries multiple sensors usable for human activity recognition: a 6-axis inertial measurement unit, a microphone, and Bluetooth. In this paper, we propose an activity recognition framework using the eSense device. We collect accelerometer and gyroscope data from eSense to detect head- and mouth-related activities along with other everyday activities, and for this work we develop a smartphone application that collects data from the device. Several statistical features are exploited to recognize head- and mouth-related activities (e.g., head nodding, head shaking, eating, and speaking) and regular activities (e.g., staying still, walking, and speaking while walking). We explore different machine learning approaches, including Convolutional Neural Network (CNN), Random Forest (RnF), K-Nearest Neighbor (KNN), Linear Discriminant Analysis (LDA), and Support Vector Machine (SVM), to classify the activities, and we evaluate classification performance using the accelerometer and gyroscope data both separately and combined. Our results show that using the accelerometer and gyroscope together improves performance: we achieve accuracies of 80.45% with LDA, 93.34% with SVM, 91.92% with RnF, 91.64% with KNN, and 93.76% with CNN when both sensor streams are exploited together. These results demonstrate the promise of the eSense device for detecting human activities in various healthcare monitoring systems.
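The pipeline the abstract describes, windowed accelerometer and gyroscope data reduced to statistical features and fed to a conventional classifier, can be sketched as follows. This is a minimal illustration on synthetic 6-axis IMU windows: the window length, feature set, class labels, and Random Forest settings here are assumptions for demonstration, not the chapter's exact configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_windows(n_windows, n_samples=100, offset=0.0):
    # Synthetic 6-axis IMU windows: 3-axis accelerometer + 3-axis gyroscope.
    # `offset` shifts the signal mean to mimic two distinguishable activities.
    return rng.normal(loc=offset, scale=1.0, size=(n_windows, n_samples, 6))

def extract_features(window):
    # Per-axis statistical features, as commonly used in IMU-based
    # activity recognition (mean, standard deviation, min, max).
    return np.concatenate([
        window.mean(axis=0),
        window.std(axis=0),
        window.min(axis=0),
        window.max(axis=0),
    ])

# Two hypothetical activity classes (e.g., "head nodding" vs. "walking").
windows = np.concatenate([make_windows(50, offset=0.0),
                          make_windows(50, offset=1.5)])
X = np.array([extract_features(w) for w in windows])
y = np.array([0] * 50 + [1] * 50)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"test accuracy: {accuracy_score(y_te, clf.predict(X_te)):.2f}")
```

The same feature matrix could be passed to any of the classifiers the chapter compares (SVM, KNN, LDA); a CNN would instead consume the raw windows directly.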



Acknowledgements

We acknowledge the Pervasive Research Unit, Nokia Bell Labs, Cambridge, and Prof. Fahim Kawsar for providing the devices.

Author information

Corresponding author

Correspondence to Md Shafiqul Islam.



Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this chapter


Cite this chapter

Islam, M.S., Hossain, T., Ahad, M.A.R., Inoue, S. (2021). Exploring Human Activities Using eSense Earable Device. In: Ahad, M.A.R., Inoue, S., Roggen, D., Fujinami, K. (eds) Activity and Behavior Computing. Smart Innovation, Systems and Technologies, vol 204. Springer, Singapore. https://doi.org/10.1007/978-981-15-8944-7_11
