Gestures as Interface for a Home TV: Digital Divide Solutions through Inertial Sensors

  • Stefano Pinardi
  • Matteo Dominoni
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8482)

Abstract

Seniors are the fastest-growing segment of the population not only in many parts of Europe but also in Japan and the United States. ICT technologies are not widely adopted among the elderly, and they are rarely designed around their cultural and ergonomic needs. The risk is that, in the near future, this growing segment will be digitally isolated in a society increasingly built on ICT as the infrastructure for services and communication.

The Easy Reach project proposes an ergonomic application that fosters social interaction to help the elderly overcome the barrier of the digital divide. This paper focuses on the technology and algorithms behind the human-computer interface of the Easy Reach project, which exploits inertial sensors to detect gestures.

Many experimental gesture-recognition algorithms have been developed using inertial sensors, either alone or in conjunction with other sensors or devices, but they have not been thoroughly tested in real situations and are not designed to adapt to the elderly and their way of executing gestures. The elderly are not used to modern interfaces and devices, and, due to aging, they can have difficulty executing even very simple gestures.

Our gesture-recognition algorithm, based on the Pearson index and the Hamming distance, has been tested with both young and elderly users and proved resilient to changes in velocity and to individual differences while maintaining high recognition accuracy (97.4% in user-independent mode; 98.79% in user-dependent mode). The algorithm has been adopted by the Easy Reach consortium (2009-2013) to pilot the gesture-based human-machine interface.
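
To make the approach concrete, the sketch below shows one way a Pearson correlation score and a Hamming distance over quantized samples could be combined to match a candidate accelerometer trace against stored gesture templates. The abstract does not specify these details, so the single-axis traces, the template dictionary, the quantization scheme, and all function names are illustrative assumptions rather than the authors' implementation.

```python
# Illustrative sketch only: the template format, quantization, and thresholds
# below are assumptions, not the Easy Reach implementation.
import numpy as np


def znorm(x):
    """Zero-mean, unit-variance normalization of a 1-D trace."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / (x.std() + 1e-9)


def pearson_score(signal, template):
    """Pearson correlation between a candidate accelerometer trace and a
    stored gesture template (both assumed resampled to the same length)."""
    return float(np.mean(znorm(signal) * znorm(template)))


def hamming_distance(signal, template, n_levels=8):
    """Hamming distance between coarse symbolic encodings of the two traces:
    each normalized sample is quantized into one of n_levels amplitude bins."""
    edges = np.linspace(-2.0, 2.0, n_levels - 1)
    return int(np.sum(np.digitize(znorm(signal), edges) !=
                      np.digitize(znorm(template), edges)))


def classify(signal, templates, min_corr=0.8):
    """Pick the template with the highest Pearson score, using the Hamming
    distance to break near-ties, and reject matches below min_corr."""
    best_label, best_corr, best_ham = None, -np.inf, np.inf
    for label, template in templates.items():
        corr = pearson_score(signal, template)
        ham = hamming_distance(signal, template)
        if corr > best_corr or (abs(corr - best_corr) < 1e-3 and ham < best_ham):
            best_label, best_corr, best_ham = label, corr, ham
    return best_label if best_corr >= min_corr else None
```

In such a scheme, `templates` could map gesture labels to averaged training traces; once traces are resampled to a common length, the Pearson score tolerates differences in execution speed and amplitude, while the Hamming term penalizes shape mismatches.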

Keywords

Sensors · Inertial Sensors · Gestures · Human Machine Interfaces · Ambient Intelligence · Assisted Living · Elderly · Social Digital Divide · Home TV

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Stefano Pinardi (1)
  • Matteo Dominoni (1)
  1. D.I.S.Co., University of Milano-Bicocca, Italy