Abstract
We have designed and implemented a vision-based system that interacts with users' natural arm and finger gestures. Using depth-based vision reduces the effect of ambient disturbances such as noise and lighting conditions. A set of arm and finger gestures is designed, and a system capable of detecting and classifying these gestures is developed and implemented. Finally, the gesture recognition routine is linked to a simplified desktop environment for usability and human-factors studies. Several factors, including precision, efficiency, ease of use, pleasure, fatigue, naturalness, and overall satisfaction, are investigated in detail. Across a range of simple and complex tasks, we conclude that finger-based inputs are superior to arm-based ones in the long run. Furthermore, arm gestures are shown to cause more fatigue and to appear less natural than finger gestures. However, factors such as task completion time, overall satisfaction, and ease of use were not affected by the choice of one over the other.
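The paper does not include code; the sketch below is only a rough illustration of the kind of depth-based pipeline the abstract describes, in which the hand is segmented by depth thresholding and a simple finger count is derived from contour geometry. The depth range, kernel size, and defect-depth threshold are assumed values, and the finger-counting rule is a generic heuristic, not the authors' classifier.

```python
# Minimal sketch (assumptions labeled): segment a hand from a depth frame and
# estimate how many fingers are extended. Illustrative only; thresholds and
# the classification rule are hypothetical, not taken from the paper.
import cv2
import numpy as np

def segment_hand(depth_mm: np.ndarray, near: int = 500, far: int = 900) -> np.ndarray:
    """Keep pixels inside an assumed interaction volume (depth in millimetres)."""
    mask = cv2.inRange(depth_mm.astype(np.uint16), near, far)
    # Remove small speckle noise typical of depth sensors.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return mask

def count_fingers(mask: np.ndarray) -> int:
    """Rough finger count from convexity defects of the largest contour."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(hand, returnPoints=False)
    if hull is None or len(hull) < 3:
        return 0
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    # Deep defects roughly correspond to the gaps between extended fingers.
    deep = sum(1 for i in range(defects.shape[0]) if defects[i, 0, 3] / 256.0 > 20)
    return min(deep + 1, 5)
```

In a full system of the kind studied here, the per-frame finger count (or arm pose) would be smoothed over time and mapped to desktop actions such as pointing, selecting, or scrolling.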
Keywords
- Usability study
- Human factors
- Arm/finger gestures
- WIMP
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Farhadi-Niaki, F., Etemad, S.A., Arya, A. (2013). Design and Usability Analysis of Gesture-Based Control for Common Desktop Tasks. In: Kurosu, M. (eds) Human-Computer Interaction. Interaction Modalities and Techniques. HCI 2013. Lecture Notes in Computer Science, vol 8007. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-39330-3_23
DOI: https://doi.org/10.1007/978-3-642-39330-3_23
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-39329-7
Online ISBN: 978-3-642-39330-3
eBook Packages: Computer Science, Computer Science (R0)