Structure of Sensory Signals: Icons and Messages

Chapter in: Cognitive Infocommunications (CogInfoCom)

Abstract

In this chapter, the motivations behind CogInfoCom channels are discussed. As a first step towards their formal development, a unified view is provided of the structural elements that have in the past been used in interfaces designed for various (human) sensory systems. It is demonstrated not only that these structural elements are analogous to each other, and therefore amenable to conceptual unification, but also that their interpretation can be extended to the artificial modalities of any kind of cognitive entity.


Notes

  1. Perhaps this explains why some researchers have not allowed the etymological structure of their terminologies to influence their interpretation.



Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Baranyi, P., Csapo, A., Sallai, G. (2015). Structure of Sensory Signals: Icons and Messages. In: Cognitive Infocommunications (CogInfoCom). Springer, Cham. https://doi.org/10.1007/978-3-319-19608-4_7

  • DOI: https://doi.org/10.1007/978-3-319-19608-4_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-19607-7

  • Online ISBN: 978-3-319-19608-4

  • eBook Packages: Engineering (R0)
