Building a Recognition Process of Cooking Actions for Smart Kitchen System

  • Fong-Gong Wu
  • Tsung-Han Tsai
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8515)

Abstract

A smart kitchen should focus its development on actual interaction with users and environmental objects rather than on complicated instructions and feedback. Unfortunately, current techniques can identify only motions and basic actions. The main purpose of this paper is to analyze and study the user motions and actions involved in the cooking process, including ingredient preparation, and to discover multiple action-identification characteristics for the user and the cooking utensils. Using video analysis, the project ultimately applies these characteristics to establish a reliable cooking-action database, which allows our study to distinguish between similar actions. The resulting model is primarily used to identify, understand, and differentiate the degree of intelligence in user motions, and it may be applied in the future to cooking support systems or other smart kitchen developments.

Keywords

Smart Kitchen · Human Behavior Taxonomies · Motion analysis · Video analysis · Decision Tree Learning
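
The abstract and keywords point to decision tree learning over video-derived motion characteristics as the means of telling similar cooking actions apart. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation: it trains a small scikit-learn decision tree on invented features (mean wrist speed, motion periodicity, utensil code) to separate chopping, stirring, and whisking. All feature names, values, and labels here are assumptions made for the example, not entries from the paper's cooking-action database.

```python
# Hypothetical sketch: classifying cooking actions from video-derived motion
# features with a decision tree. Values and labels are invented for illustration.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Each row: [mean wrist speed (cm/s), motion periodicity (Hz), utensil code]
# Utensil codes (hypothetical): 0 = knife, 1 = spoon, 2 = whisk
X = np.array([
    [12.0, 2.5, 0],   # chopping: short, fast, periodic knife strokes
    [14.5, 3.0, 0],
    [ 6.0, 0.8, 1],   # stirring: slow circular spoon motion
    [ 5.5, 1.0, 1],
    [20.0, 4.5, 2],   # whisking: fast circular whisk motion
    [22.0, 5.0, 2],
])
y = ["chop", "chop", "stir", "stir", "whisk", "whisk"]

# A shallow tree keeps the learned thresholds easy to read off, which suits
# an interpretable taxonomy of cooking actions.
clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X, y)

# Classify a new observation: fast circular motion performed with a whisk.
print(clf.predict([[18.0, 4.0, 2]]))  # expected: ['whisk']
```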

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Fong-Gong Wu¹
  • Tsung-Han Tsai¹

  1. Department of Industrial Design, National Cheng Kung University, Tainan, Taiwan
