Hierarchical Processing Model Based on Multi-modality Interaction Design

  • Aiguo Lu
  • Bo Dong
  • Feiran Hu
Conference paper
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 576)


Abstract

Information warfare involves diverse operations and large volumes of information. Traditional human-factors engineering design methods are prone to high human-computer interaction load and lack quantitative evaluation capability. This paper focuses on five modalities: voice, eye control, touch, brain control, and gesture. Based on the man-machine-environment cognitive decision model and a hierarchical processing model for each modality in its applicable scenarios, it builds a multi-modality interaction layered processing model to enhance the naturalness and friendliness of command interaction.
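The paper describes the layered model only at a high level. As an illustrative sketch (all class and function names here are hypothetical, not taken from the paper), a minimal layered pipeline might normalize events from the five modalities, filter them in a perception layer, fuse concurrent events in a decision layer, and emit commands:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Modality(Enum):
    VOICE = auto()
    EYE = auto()
    TOUCH = auto()
    BRAIN = auto()
    GESTURE = auto()

@dataclass
class InputEvent:
    modality: Modality
    payload: str       # recognized intent, e.g. "select_target"
    confidence: float  # recognizer confidence in [0, 1]

def perception_layer(event):
    # Layer 1: drop low-confidence recognitions from any modality
    # (the 0.5 threshold is an arbitrary illustrative value)
    return event if event.confidence >= 0.5 else None

def decision_layer(events):
    # Layer 2: fuse concurrent events; keep the most confident
    # event per intent across modalities
    best = {}
    for e in events:
        if e and (e.payload not in best
                  or e.confidence > best[e.payload].confidence):
            best[e.payload] = e
    return list(best.values())

def command_layer(events):
    # Layer 3: map fused events to command strings
    return [f"{e.modality.name}:{e.payload}" for e in events]

def process(raw_events):
    perceived = [perception_layer(e) for e in raw_events]
    fused = decision_layer(perceived)
    return command_layer(fused)

events = [
    InputEvent(Modality.VOICE, "select_target", 0.9),
    InputEvent(Modality.GESTURE, "select_target", 0.6),
    InputEvent(Modality.EYE, "zoom", 0.3),
]
print(process(events))  # → ['VOICE:select_target']
```

Here the low-confidence eye-control event is filtered in the perception layer, and the redundant voice and gesture intents are fused into a single command, which is the kind of per-modality, per-scenario routing the layered model is meant to support.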


Keywords: Shipborne command and control system · Multi-modality · Human-computer interaction design



Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  1. Wuhan Digital Engineering Institute, Wuhan, China
