
Kinect sensor-based interaction monitoring system using the BLSTM neural network in healthcare

  • Rajkumar Saini
  • Pradeep Kumar
  • Barjinder Kaur
  • Partha Pratim Roy
  • Debi Prosad Dogra
  • K. C. Santosh
Original Article

Abstract

Remote monitoring of patients is considered one of the reliable alternatives to conventional healthcare for elderly and/or chronically ill patients. Further, monitoring interaction with people plays an important role in diagnosing and managing patients suffering from mental illnesses such as depression and autism spectrum disorders (ASD). In this paper, we propose a Kinect sensor-based system for monitoring the interaction between two persons using a bidirectional long short-term memory neural network (BLSTM-NN). Such a model can be adopted for the rehabilitation of people who may be suffering from ASD and other psychological disorders by analyzing their activities. Medical professionals and caregivers can use the system to diagnose and remotely monitor patients suffering from such disorders. In our study, ten volunteers formed five interactive groups and performed continuous activities, which were recorded with the Kinect sensor. The continuous activities were created from random combinations of 24 isolated activities. The 3D skeleton of each user was detected and tracked by the Kinect and modeled using the BLSTM-NN. To improve the performance of the system, we used a lexicon built by analyzing the constraints under which continuous activities are performed. The system achieved a maximum accuracy of 70.72%. Our results outperform previously reported results, and the proposed system can therefore be used in developing Internet of Things (IoT) Kinect sensor-based healthcare applications.
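
To make the pipeline concrete, the following is a minimal Python (PyTorch) sketch of classifying two-person Kinect skeleton sequences with a bidirectional LSTM, in the spirit of the BLSTM-NN described above. It is not the authors' implementation: the joint count (25 joints per skeleton, as in Kinect v2), hidden size, layer count, and the mapping of the 24 isolated activities to output classes are all assumptions for illustration; the lexicon-constrained decoding step is only indicated in a comment.

# Hypothetical sketch: per-frame activity scores for a pair of Kinect skeletons
# using a bidirectional LSTM. Joint count, hidden size, and class count are
# assumptions drawn from the abstract, not the paper's exact architecture.
import torch
import torch.nn as nn

NUM_JOINTS = 25                              # assumed Kinect v2 skeleton
NUM_PERSONS = 2                              # two interacting users
INPUT_DIM = NUM_JOINTS * 3 * NUM_PERSONS     # (x, y, z) per joint, both skeletons
NUM_CLASSES = 24                             # isolated activities in the vocabulary

class BLSTMActivityClassifier(nn.Module):
    def __init__(self, hidden_size=128, num_layers=1):
        super().__init__()
        self.blstm = nn.LSTM(INPUT_DIM, hidden_size, num_layers,
                             batch_first=True, bidirectional=True)
        # 2 * hidden_size: forward and backward hidden states are concatenated
        self.fc = nn.Linear(2 * hidden_size, NUM_CLASSES)

    def forward(self, x):
        # x: (batch, time, INPUT_DIM) flattened joint coordinates per frame
        out, _ = self.blstm(x)
        # Per-frame class scores; a lexicon-constrained decoder would then
        # restrict admissible activity sequences, as the paper proposes.
        return self.fc(out)

if __name__ == "__main__":
    model = BLSTMActivityClassifier()
    dummy = torch.randn(4, 90, INPUT_DIM)    # 4 clips of 90 frames each
    print(model(dummy).shape)                # torch.Size([4, 90, 24])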

Keywords

Activity recognition · Depth sensors · Bidirectional long short-term memory neural network · Healthcare · Autism spectrum disorders · Internet of things

Notes

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institution.

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2018

Authors and Affiliations

  1. Department of Computer Science and Engineering, IIT Roorkee, Roorkee, India
  2. Department of Computer Science and Engineering, DCRUST, Sonepat, India
  3. School of Electrical Sciences, IIT Bhubaneshwar, Bhubaneshwar, India
  4. Department of Computer Science, University of South Dakota, Vermillion, USA
