User-Driven Intelligent Interface on the Basis of Multimodal Augmented Reality and Brain-Computer Interaction for People with Functional Disabilities

  • Peng Gang
  • Jiang Hui
  • S. Stirenko
  • Yu. Gordienko
  • T. Shemsedinov
  • O. Alienin
  • Yu. Kochura
  • N. Gordienko
  • A. Rojbi
  • J. R. López Benito
  • E. Artetxe González
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 886)

Abstract

This paper analyzes current attempts to integrate several modes and use cases of human-to-machine interaction. A new concept of a user-driven intelligent interface for accessibility is proposed, based on multimodal augmented reality (AR) and brain-computer interaction, for applications in disability studies, education, home care, health care, eHealth, and related fields. Several use cases of multimodal augmentation are presented, and the prospects for better understanding of users through immediate feedback over neurophysical channels, by means of brain-computer interaction, are outlined. It is shown that brain-computer interface (BCI) technology offers new strategies for overcoming the limits of currently available user interfaces, especially for people with functional disabilities. The results of previous studies of low-end consumer and open-source BCI devices suggest that combining machine learning (ML) and multimodal interactions (visual, sound, tactile) with BCI will benefit from immediate feedback based on actual neurophysical reactions classified by ML methods. In general, BCI combined with other modes of AR interaction can deliver much more information than either type of interaction alone. Even in its current state, a combined AR-BCI interface could provide highly adaptable and personalized services, especially for people with functional disabilities.
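The feedback loop described above, in which ML methods classify neurophysical reactions and the interface adapts accordingly, can be sketched minimally. The example below is purely illustrative and is not taken from the paper: the feature vectors are synthetic stand-ins for EEG band powers, the user states and UI adaptations are hypothetical, and a simple nearest-centroid classifier stands in for the ML methods the abstract refers to.

```python
# Illustrative sketch (assumptions, not the paper's method): a nearest-centroid
# classifier maps synthetic neurophysical features to a user state, which a
# hypothetical AR interface then uses to adapt its presentation.

def centroid(samples):
    """Mean feature vector of a list of equal-length feature vectors."""
    n = len(samples)
    return [sum(x[i] for x in samples) / n for i in range(len(samples[0]))]

def classify(sample, centroids):
    """Return the label whose centroid is nearest (squared Euclidean)."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))

# Synthetic [alpha, beta] band-power features per hypothetical user state.
training = {
    "relaxed":  [[0.90, 0.20], [0.80, 0.30], [1.00, 0.25]],
    "stressed": [[0.30, 0.90], [0.20, 0.80], [0.35, 1.00]],
}
centroids = {label: centroid(xs) for label, xs in training.items()}

# Immediate feedback: a fresh reading selects an interface adaptation.
state = classify([0.25, 0.85], centroids)
adaptation = {"relaxed": "full AR overlay", "stressed": "simplified AR overlay"}[state]
```

In a real AR-BCI system the classifier would be trained on calibrated per-user recordings and run continuously, but the control flow — features in, state out, interface adapted — is the same.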

Keywords

Augmented reality · Interfaces for accessibility · Multimodal user interface · Brain-computer interface · eHealth · Machine learning · Human-to-machine interactions

Notes

Acknowledgment

The work was partially supported by the Ukraine-France Collaboration Project (Programme PHC DNIPRO, http://www.campusfrance.org/fr/dnipro), by a Twinning Grant of the EU IncoNet EaP project (http://www.inco-eap.net/), and by the Huizhou Science and Technology Bureau and Huizhou University (Huizhou, P. R. China) within the Platform Construction for China-Ukraine Hi-Tech Park Project #2014C050012001.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Peng Gang (1)
  • Jiang Hui (1)
  • S. Stirenko (2)
  • Yu. Gordienko (2)
  • T. Shemsedinov (2)
  • O. Alienin (2)
  • Yu. Kochura (2)
  • N. Gordienko (2)
  • A. Rojbi (3)
  • J. R. López Benito (4)
  • E. Artetxe González (4)

  1. Huizhou University, Huizhou City, China
  2. National Technical University of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”, Kiev, Ukraine
  3. CHArt Laboratory (Human and Artificial Cognitions), University of Paris 8, Paris, France
  4. CreativiTIC Innova SL, Logroño, Spain
