Abstract
A significant proportion of the world's population lives with some form of disability. The most isolated section of the disability spectrum is persons with severe speech and motor impairment (SSMI), who face challenges not only in everyday activities but also in accessing education and, later in life, suitable employment. Persons with SSMI often rely on manual, gaze-based interaction for everyday communication. This research aims to enable and support the learning and education of persons with SSMI through engaging, eye-gaze-controlled playful activities. It proposes and evaluates gaze-based interaction for driving a toy car through augmented- and mixed-reality interfaces. The results show improvements in otherwise dull, generic gaze-based pointing and selection tasks after participants performed an unrelated yet more engaging and playful car-driving activity.
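To make the "pointing and selection" interaction concrete: the standard technique behind gaze-based selection is dwell time, where a target counts as "clicked" once the gaze point stays within a fixed radius of it for a minimum duration. The sketch below is purely illustrative and is not the authors' implementation; the class name, dwell threshold, and target radius are all assumptions.

```python
import math

# Assumed, illustrative parameters (not from the paper).
DWELL_TIME_S = 1.0      # how long gaze must rest on a target to select it
TARGET_RADIUS_PX = 60   # how close (in pixels) gaze must be to the target

class DwellSelector:
    """Minimal dwell-time selector: feed it timestamped gaze samples and it
    reports the frame on which a selection completes."""

    def __init__(self, target_xy, dwell_s=DWELL_TIME_S, radius=TARGET_RADIUS_PX):
        self.target_xy = target_xy
        self.dwell_s = dwell_s
        self.radius = radius
        self._inside_since = None  # time gaze first entered the target, if any

    def update(self, gaze_xy, t):
        """One gaze sample (pixels) at time t (seconds).
        Returns True on the sample where the dwell threshold is crossed."""
        dx = gaze_xy[0] - self.target_xy[0]
        dy = gaze_xy[1] - self.target_xy[1]
        if math.hypot(dx, dy) <= self.radius:
            if self._inside_since is None:
                self._inside_since = t
            if t - self._inside_since >= self.dwell_s:
                self._inside_since = None  # reset so the target can fire again
                return True
        else:
            self._inside_since = None      # gaze left the target: restart dwell
        return False

# Usage: ~1.3 s of 30 Hz gaze samples hovering over a "forward" button.
selector = DwellSelector(target_xy=(400, 300))
selected = any(selector.update((402, 298), i / 30.0) for i in range(40))
print(selected)  # True: gaze stayed on target past the dwell threshold
```

In a gaze-driven toy-car interface of the kind the paper describes, each on-screen command (forward, left, right, stop) would own one such selector, and a completed dwell would be translated into the corresponding joystick command.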
Acknowledgement
We thank Mr. Sunil Nahar for the Nahar Center of Robotics and Prototyping Lab, and the Director and staff of Vidyasagar School, Chennai, for their inputs and cooperation.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Sharma, V.K., Murthy, L.R.D., Biswas, P. (2022). Enabling Learning Through Play: Inclusive Gaze-Controlled Human-Robot Interface for Joystick-Based Toys. In: Cavallo, F., et al. (eds.) Social Robotics. ICSR 2022. Lecture Notes in Computer Science, vol. 13818. Springer, Cham. https://doi.org/10.1007/978-3-031-24670-8_40
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-24669-2
Online ISBN: 978-3-031-24670-8
eBook Packages: Computer Science, Computer Science (R0)