
An exploratory study of multimodal interaction modeling based on neural computation



Abstract

Multimodal interaction plays an important role in human-computer interaction. In this paper we propose a multimodal interaction model based on recent findings in cognitive research. The proposed model combines two proven neural computations and helps to reveal how multimodal presentation enhances or depresses performance on the corresponding interaction task. A set of experiments is designed and conducted within the constraints of the model, and the results demonstrate both the performance enhancement effect and the performance depression effect. Our exploration and the experimental results help to further address the question of how tactile feedback signals contribute to multimodal interaction efficiency, and thus provide guidelines for designing tactile feedback in multimodal interaction.
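The abstract does not specify which two neural computations the model combines, and the paper includes no code. As a rough, illustrative sketch only, the snippet below implements divisive normalization, one canonical neural computation commonly used in the multisensory-integration literature to account for both cross-modal enhancement and depression. The function names, parameter values, and toy population are assumptions made for illustration, not the authors' model.

```python
import numpy as np

# Illustrative sketch (not the authors' model): divisive normalization over a
# small population of model neurons that each pool two sensory modalities.
def responses(i1, i2, weights, n=2.0, alpha=1.0):
    """Population responses under divisive normalization.

    i1, i2  : scalar drive from modality 1 and modality 2 (e.g. visual, tactile)
    weights : (N, 2) per-neuron modality dominance weights (hypothetical values)
    n, alpha: exponent of the expansive nonlinearity and semi-saturation constant
    """
    drive = weights @ np.array([i1, i2])          # linear combination per neuron
    energy = np.maximum(drive, 0.0) ** n          # expansive nonlinearity
    return energy / (alpha ** n + energy.mean())  # divide by pooled population activity

rng = np.random.default_rng(0)
weights = rng.uniform(0.0, 1.0, size=(100, 2))    # toy population with mixed modality preferences

def additivity_index(i):
    """Mean bimodal response divided by the sum of the mean unimodal responses."""
    bimodal = responses(i, i, weights).mean()
    unimodal = responses(i, 0.0, weights).mean() + responses(0.0, i, weights).mean()
    return bimodal / unimodal

# Weak inputs combine super-additively (enhancement, index > 1); strong inputs
# combine sub-additively (depression, index < 1), because the normalization
# pool in the denominator grows with the total stimulation.
print(f"weak inputs   (i = 0.2): {additivity_index(0.2):.2f}")
print(f"strong inputs (i = 5.0): {additivity_index(5.0):.2f}")
```

Under these assumptions a single circuit produces both effects, which is the kind of enhancement/depression trade-off the proposed model aims to expose for different multimodal presentations.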




Author information

Correspondence to Feng Tian.


About this article


Cite this article

Lu, L., Lyu, F., Tian, F. et al. An exploratory study of multimodal interaction modeling based on neural computation. Sci. China Inf. Sci. 59, 92106 (2016). https://doi.org/10.1007/s11432-016-5520-1


Keywords

  • human-computer interaction
  • multimodal integration
  • interaction model
  • touch-included interaction
  • cognition
  • multisensory integration
  • neural computation
  • brain coding
