Live Demonstrator of EEG and Eye-Tracking Input for Disambiguation of Image Search Results

  • Jan-Eike Golenia
  • Markus Wenzel
  • Benjamin Blankertz
Conference paper

DOI: 10.1007/978-3-319-24917-9_8

Part of the Lecture Notes in Computer Science book series (LNCS, volume 9359)
Cite this paper as:
Golenia JE., Wenzel M., Blankertz B. (2015) Live Demonstrator of EEG and Eye-Tracking Input for Disambiguation of Image Search Results. In: Blankertz B., Jacucci G., Gamberini L., Spagnolli A., Freeman J. (eds) Symbiotic Interaction. Symbiotic 2015. Lecture Notes in Computer Science, vol 9359. Springer, Cham

Abstract

When searching for images on the web, users are often confronted with irrelevant results due to ambiguous queries. Consider a search term like 'Bill': the results will likely contain a mix of images depicting Bill Clinton, Bill Cosby, and money bills. If the user is only interested in pictures of money bills, most of the results are irrelevant. We built a demo application that exploits EEG and eye-tracking data to disambiguate between two possible interpretations of an ambiguous search term. The demo exhibits the integration of sensor input into a modern web application.
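The disambiguation step described above can be illustrated with a minimal sketch: assume the EEG/eye-tracking decoder yields a relevance score per viewed image, and each image is labelled with one of the two candidate interpretations. The function name `disambiguate`, the score values, and the mean-score decision rule below are illustrative assumptions, not the paper's actual pipeline.

```python
# Hypothetical sketch of interpretation selection: aggregate per-image
# relevance scores (assumed to come from an EEG/eye-tracking decoder)
# and pick the interpretation with the highest mean score.
from statistics import mean


def disambiguate(scores, interpretation_of):
    """Return the interpretation whose images received the highest
    mean relevance score. Both arguments map image IDs to values:
    scores -> float relevance, interpretation_of -> label string."""
    grouped = {}
    for image_id, score in scores.items():
        grouped.setdefault(interpretation_of[image_id], []).append(score)
    return max(grouped, key=lambda interp: mean(grouped[interp]))


# Example: two interpretations of the ambiguous query "Bill".
interpretation_of = {"img1": "money bill", "img2": "money bill",
                     "img3": "Bill Clinton", "img4": "Bill Clinton"}
scores = {"img1": 0.8, "img2": 0.7, "img3": 0.2, "img4": 0.3}
print(disambiguate(scores, interpretation_of))  # → money bill
```

In a live system, the scores would be produced by a multivariate decoder applied to ERP segments time-locked to fixations on each image; the simple averaging rule here only stands in for that step.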

Keywords

Human-computer interaction · Brain-computer interface · EEG · Eye-tracking · Information retrieval · Free viewing · ERP · Multivariate decoding


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Jan-Eike Golenia¹
  • Markus Wenzel¹
  • Benjamin Blankertz¹

  1. Neurotechnology Group, Technische Universität Berlin, Berlin, Germany
