Multiple Scales Pilot Action Pattern Recognition During Flight Task Using Video Surveillance

  • Lu Ding
  • Jia Bo
  • Qi Wu
  • HaiYan Liu
  • Shan Fu
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 529)


Pilot action analysis is one of the most important aspects of objectively measuring flight crew workload. Pilot action patterns include the distribution of action areas, action durations, the time and space intervals between actions, body posture, cross actions, action paths, and operation procedures. Pilot action pattern recognition faces two main problems: the multiple scales involved in both time and space, and accurate localization of actions. This paper presents a method for analyzing pilot action patterns during flight tasks using video surveillance. Video data were obtained from a single camera installed during a real flight mission. Treating the cockpit as an intelligent environment, we develop a new method to analyze pilot action patterns. First, a vision-based pattern recognition method locates moving targets in the video. Then, a logic-based method recognizes the actions. The experimental results show that our approach is effective for workload assessment.
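The two-stage pipeline described in the abstract (vision-based localization of moving targets, followed by logic-based action recognition) can be sketched in simplified form. This is an illustrative assumption, not the authors' implementation: frames are modeled as small grayscale grids, motion is located by frame differencing, and the logic layer is a hypothetical rule set mapping cockpit zones and durations to actions.

```python
# Hypothetical sketch of the abstract's two-stage pipeline:
# (1) locate the moving target per frame pair via frame differencing,
# (2) recognize actions with logic rules over cockpit zones and duration.
# Zone names, thresholds, and frame data are illustrative assumptions.

def motion_centroid(prev, curr, thresh=30):
    """Return the (row, col) centroid of changed pixels, or None."""
    pts = [(r, c)
           for r, row in enumerate(curr)
           for c, v in enumerate(row)
           if abs(v - prev[r][c]) > thresh]
    if not pts:
        return None
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

# Logic layer: cockpit zones as boxes (row_min, row_max, col_min, col_max).
ZONES = {
    "throttle": (0, 2, 0, 2),
    "display":  (0, 2, 3, 5),
}

def zone_of(point):
    """Map a centroid to the cockpit zone containing it, if any."""
    r, c = point
    for name, (r0, r1, c0, c1) in ZONES.items():
        if r0 <= r <= r1 and c0 <= c <= c1:
            return name
    return None

def recognise(frames, min_frames=2):
    """Rule: an action = motion in the same zone for >= min_frames pairs."""
    actions, run_zone, run_len = [], None, 0
    for prev, curr in zip(frames, frames[1:]):
        centroid = motion_centroid(prev, curr)
        z = zone_of(centroid) if centroid else None
        if z == run_zone and z is not None:
            run_len += 1
        else:
            if run_zone is not None and run_len >= min_frames:
                actions.append(run_zone)
            run_zone, run_len = z, 1
    if run_zone is not None and run_len >= min_frames:
        actions.append(run_zone)
    return actions
```

With a synthetic sequence in which a bright "hand" pixel oscillates inside the throttle box, `recognise` reports a single `"throttle"` action; the duration rule (`min_frames`) stands in for the temporal side of the multi-scale problem, while the zone boxes stand in for the spatial side.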


Keywords: Multiple scale · Activity recognition · Ontology · Pilot action patterns



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Shanghai Jiao Tong University, Shanghai, China
  2. Commercial Aircraft Corporation of China Ltd, Shanghai, China
