Abstract
The natural user interface/experience (NUI/NUX) enables natural motion-based interaction without a device or tool such as a mouse, keyboard, pen, or marker. To date, typical motion recognition methods have used markers to obtain coordinate input values as relative data and store them in a database. However, recognizing motion accurately requires more markers, and considerable time is needed to attach the markers and process the data. In addition, because NUI/NUX frameworks are developed using only basic intuition, usability problems arise that force users to learn many different framework conventions. To address these problems, in this paper we design a multi-modal NUI/NUX framework controlled simultaneously by voice, gesture motion, and facial expression, and propose a new algorithm for mouse operations that analyzes intuitive hand gestures and maps them onto the monitor. We also implement a “dynamic mouse area,” which enables people of all ages to operate the “hand mouse” easily and intuitively.
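The core of the proposed hand-mouse operation is mapping the hand's position inside a sub-region of the camera frame (the "dynamic mouse area") onto full-screen coordinates. The sketch below illustrates one plausible form of this mapping; the area tuple, function name, and clamping behavior are assumptions for illustration, not the paper's exact algorithm.

```python
def hand_to_screen(hand_x, hand_y, area, screen_w, screen_h):
    """Map a hand position inside a 'dynamic mouse area' to screen pixels.

    area = (x0, y0, w, h): a sub-region of the camera frame, assumed to be
    positioned around the user's hand. The paper's exact rule for updating
    this area is not reproduced here; this is an illustrative sketch.
    """
    x0, y0, w, h = area
    # Clamp the hand position to the area, then normalize to [0, 1].
    nx = min(max((hand_x - x0) / w, 0.0), 1.0)
    ny = min(max((hand_y - y0) / h, 0.0), 1.0)
    # Scale the normalized position to the monitor resolution.
    return int(nx * (screen_w - 1)), int(ny * (screen_h - 1))
```

Because the area is a small region near the hand rather than the whole camera frame, small, comfortable hand movements can cover the entire monitor, which is what makes the mapping usable for people of all ages.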
Acknowledgment
This research was supported by the MSIP (Ministry of Science, ICT and Future Planning), Korea, under the ITRC (Information Technology Research Center) support program (NIPA-2013-H0301-13-4007) supervised by the NIPA (National IT Industry Promotion Agency).
Cite this article
Lee, G., Shin, D. & Shin, D. Mouse operation on monitor by interactive analysis of intuitive hand motions. Multimed Tools Appl 75, 15261–15274 (2016). https://doi.org/10.1007/s11042-014-2357-8