
Intelligent Classroom with Motion Sensor and 3D Vision for Virtual Reality e-Learning

  • Chian-Hsueng Chao (corresponding author)
  • Ying-Chen Chen
  • Tsung-Jung Yang
  • Pei-Lun Yu
Conference paper
Part of the Springer Proceedings in Complexity book series (SPCOM)

Abstract

e-Learning is becoming more and more popular. In certain situations, classes need to show 3D stereographs to illustrate subjects such as geography or mathematics. Our study aims to help teachers create the 3D stereographs they need in an easy and intuitive way. “3T in 3D” is a 3D motion-sensor teaching system: with a motion sensor and 3D technologies, 3D objects can be manipulated by teachers and students, so that students can understand complex 3D objects more easily. This is a preliminary study on the application of a motion sensor with 3D technologies in e-learning. It is hoped that the data gathered in this study will help develop a more complete 3D e-learning system.
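For illustration only, the sketch below shows one way such gesture-driven manipulation could be wired up: tracked hand positions (which, in a system like the one described, would come from the Kinect/OpenNI skeleton tracker each frame) are mapped to rotation and zoom of the displayed 3D object. The function names, sensitivity constant, and simulated samples are assumptions made for this example and are not taken from the paper.

```python
import math
from dataclasses import dataclass


@dataclass
class Vec3:
    """A tracked joint position in metres (as a skeleton tracker would report)."""
    x: float
    y: float
    z: float


def hand_delta_to_rotation(prev: Vec3, curr: Vec3, sensitivity: float = 90.0):
    """Map the displacement of one tracked hand to yaw/pitch angles (degrees).

    Horizontal hand motion turns the object about its vertical axis (yaw);
    vertical motion tilts it about the horizontal axis (pitch).
    """
    yaw = (curr.x - prev.x) * sensitivity
    pitch = (curr.y - prev.y) * sensitivity
    return yaw, pitch


def hands_to_scale(left: Vec3, right: Vec3, base_distance: float = 0.5) -> float:
    """Map the distance between both hands to a zoom factor, so pulling the
    hands apart enlarges the displayed stereograph."""
    d = math.dist((left.x, left.y, left.z), (right.x, right.y, right.z))
    return d / base_distance


# Simulated tracker samples; in the real system these would be the hand joints
# delivered by the Kinect/OpenNI pipeline on consecutive frames.
prev_hand, curr_hand = Vec3(0.10, 0.20, 1.5), Vec3(0.18, 0.25, 1.5)
print(hand_delta_to_rotation(prev_hand, curr_hand))            # (7.2, 4.5) degrees
print(hands_to_scale(Vec3(-0.3, 0, 1.5), Vec3(0.3, 0, 1.5)))   # 1.2x zoom
```

The angles and scale factor would then be fed to whatever 3D renderer displays the stereograph; the mapping itself is deliberately simple so that teachers and students get an immediate, intuitive response to their gestures.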

Keywords

e-Learning · v-Learning · Motion sensor · Virtual reality · 3D stereograph · Kinect


Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

  • Chian-Hsueng Chao (corresponding author) 1
  • Ying-Chen Chen 1
  • Tsung-Jung Yang 1
  • Pei-Lun Yu 1
  1. Department of Information Management, National University of Kaohsiung, Kaohsiung, Taiwan, R.O.C.
