Transition activity recognition using fuzzy logic and overlapped sliding window-based convolutional neural networks

  • Jaewoong Kang
  • Jongmo Kim
  • Seongil Lee
  • Mye Sohn


In this paper, we propose a novel approach to recognizing transition activities (e.g., turning left or right, standing up, and walking down stairs). Unlike simple activities, transition activities have unique characteristics: they change continuously and occur within a short time. To recognize activities with these characteristics, we apply a convolutional neural network (CNN), which is widely used to recognize images, voices, and human activities. In addition, to generate input instances for the CNN model, we develop an overlapped sliding window method that can accurately capture transition activities occurring over a short period. To further increase recognition accuracy, we train separate CNN models for simple activities and transition activities. Finally, we adopt fuzzy logic to handle ambiguous activities. All procedures for recognizing the elderly's activities are performed on data collected by the six sensors embedded in a smartphone. Experiments demonstrate that the proposed approach improves the recognition accuracy of transition activities.
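The overlapped sliding window idea described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window size of 128 samples and 50% overlap are assumed parameters chosen for the example, and the six channels stand in for the six smartphone sensors mentioned in the abstract.

```python
import numpy as np

def overlapped_windows(signal, window_size=128, overlap=0.5):
    """Segment a multi-channel sensor stream into overlapping windows.

    signal: array of shape (n_samples, n_channels).
    Returns an array of shape (n_windows, window_size, n_channels),
    suitable as a batch of CNN input instances.
    """
    # Overlapping windows advance by a fraction of the window length,
    # so short-lived transition activities are less likely to be split
    # across window boundaries than with non-overlapping segmentation.
    step = int(window_size * (1 - overlap))
    windows = [
        signal[start:start + window_size]
        for start in range(0, len(signal) - window_size + 1, step)
    ]
    return np.stack(windows)

# Example: 10 s of data sampled at 50 Hz from 6 sensor channels
stream = np.zeros((500, 6))
batch = overlapped_windows(stream, window_size=128, overlap=0.5)
print(batch.shape)  # (6, 128, 6)
```

With 50% overlap each sample (away from the stream edges) appears in two consecutive windows, which roughly doubles the number of training instances compared with disjoint windows of the same length.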


Human activity recognition · Transition activity · Convolutional neural network (CNN) · Overlapped sliding window · Fuzzy logic



This research was partially supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2016 R1D1A1B03932110) and partially supported by the IT R&D program of KEIT (No. 1005-0810, Development of Disability Independent Accessibility Enhancement Technology for Input and Abnormality of Home Appliances).



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Department of Industrial Engineering, Sungkyunkwan University, Suwon, Korea
