Abstract
Gesture-controlled interfaces are becoming increasingly popular with the growing use of Internet of Things (IoT) systems. In automobiles, smart homes, computer games, and Augmented Reality (AR)/Virtual Reality (VR) applications in particular, gestures have become prevalent because they are accessible to everyone. The number of designers, producers, and vendors integrating gesture interfaces into their products has also grown, giving rise to a wide variety of standards for using them. This variety can confuse a user who is accustomed to a set of conventional controls and has their own preferences. The user's only option is to adjust to the system, even when the provided gestures are unintuitive and contrary to their expectations.
This paper addresses the absence of a systematic analysis and description of gestures by developing an ontology that formally describes gestures used in Human Device Interactions (HDI). The presented ontology is based on Semantic Web standards (RDF, RDFS, and OWL2). It can describe a human gesture semantically, along with relevant mappings to affordances and to user/device contexts, in an extensible way.
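To illustrate what such a semantic gesture description might look like, the following Turtle sketch models a swipe gesture and its mapping to a device affordance in a driving context. The `hdgi:` namespace, class names, and property names below are illustrative assumptions for the purpose of this sketch, not the ontology's actual vocabulary.

```turtle
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix hdgi: <https://example.org/hdgi#> .   # hypothetical namespace

# A swipe-right gesture performed with the right hand (illustrative terms).
hdgi:SwipeRight a hdgi:Gesture ;
    rdfs:label "Swipe right" ;
    hdgi:performedBy hdgi:RightHand ;
    hdgi:mapsToAffordance hdgi:NextTrackAffordance .

# The affordance the gesture is mapped to, tied to a device and a context.
hdgi:NextTrackAffordance a hdgi:Affordance ;
    hdgi:affordedBy hdgi:CarMediaPlayer ;
    hdgi:inContext hdgi:DrivingContext .
```

Separating the gesture from the affordance it maps to, as sketched here, is what allows the same gesture to be re-mapped across devices and user contexts.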
Notes
- 6. The prefixes denoted in the figure are: sosa: <http://www.w3.org/ns/sosa/>, time: <http://www.w3.org/2006/time#>, prov: <http://www.w3.org/ns/prov#>, fma: <http://purl.org/sig/ont/fma/>.
- 9. See https://unity.com/.
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
Perera, M., Haller, A., Rodríguez Méndez, S.J., Adcock, M. (2020). HDGI: A Human Device Gesture Interaction Ontology for the Internet of Things. In: Pan, J.Z., et al. The Semantic Web – ISWC 2020. ISWC 2020. Lecture Notes in Computer Science(), vol 12507. Springer, Cham. https://doi.org/10.1007/978-3-030-62466-8_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-62465-1
Online ISBN: 978-3-030-62466-8
eBook Packages: Computer Science, Computer Science (R0)