Abstract
Physical disability is a significant barrier that hampers individuals' easy access to the web. Many physically disabled people cannot use technology because accessibility tools and techniques remain limited. Websites should comply with the requirements of every citizen in a country, and must therefore cater to the needs of differently-abled citizens as well. Features have to be introduced into websites so that they are easy to use, readily accessible, understandable, and convenient for everyone, drawing on best practices, standards, and global innovation techniques. Accessibility is sometimes mistaken for a set of solutions meant only for disabled people, but in fact accessibility is for everyone: every person needs accessibility and uses it when in need.
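One concrete example of such a feature is ensuring that images carry text alternatives for screen readers, a basic WCAG requirement. As an illustrative sketch (not taken from the paper), the following Python snippet uses the standard-library HTML parser to flag `<img>` tags that lack an `alt` attribute; the sample page markup is a hypothetical example:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collect <img> tags that lack an alt attribute (a basic WCAG check)."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the opening tag
        attr_map = dict(attrs)
        if tag == "img" and "alt" not in attr_map:
            # Record the src of any image a screen reader cannot describe
            self.missing_alt.append(attr_map.get("src", "<unknown>"))

# Hypothetical page fragment: one accessible image, one missing its alt text
page = '<img src="logo.png" alt="Company logo"><img src="chart.png">'
checker = AltTextChecker()
checker.feed(page)
print(checker.missing_alt)  # → ['chart.png']
```

A real audit would cover further criteria (labels on form controls, document language, contrast), but the same parse-and-check pattern applies.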
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Madaan, H., Gupta, S. (2021). AI Improving the Lives of Physically Disabled. In: Abraham, A., et al. Proceedings of the 12th International Conference on Soft Computing and Pattern Recognition (SoCPaR 2020). SoCPaR 2020. Advances in Intelligent Systems and Computing, vol 1383. Springer, Cham. https://doi.org/10.1007/978-3-030-73689-7_11
DOI: https://doi.org/10.1007/978-3-030-73689-7_11
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-73688-0
Online ISBN: 978-3-030-73689-7
eBook Packages: Intelligent Technologies and Robotics (R0)