Live Demonstrator of EEG and Eye-Tracking Input for Disambiguation of Image Search Results
When searching images on the web, users are often confronted with irrelevant results due to ambiguous queries. Consider a search term like "Bill": the results will probably consist of multiple images depicting Bill Clinton, Bill Cosby, and money bills. If the user is only interested in pictures of money bills, most of the results are irrelevant. We built a demo application that exploits EEG and eye-tracking data to disambiguate between two possible interpretations of an ambiguous search term. The demo exhibits the integration of sensor input into a modern web application.
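To illustrate the idea, here is a hedged sketch (not the authors' implementation) of how sensor-derived relevance feedback could drive the disambiguation: each image carries a relevance score that a real system might derive from EEG (e.g., ERP decoding) and eye tracking (e.g., fixation duration), and the interpretation with the higher mean score is promoted. The function name `disambiguate` and the score values are illustrative assumptions.

```python
# Hedged sketch: re-rank ambiguous image search results using per-image
# relevance scores that a real system might derive from EEG and eye tracking.
# All names and values here are illustrative, not the paper's implementation.

from statistics import mean

def disambiguate(results, scores):
    """Pick the interpretation whose images got the higher mean relevance
    score, then move its images to the top of the ranking.

    results: list of (image_id, interpretation) tuples in original rank order
    scores:  dict image_id -> relevance score in [0, 1] (assumed sensor-derived)
    """
    by_interp = {}
    for image_id, interp in results:
        by_interp.setdefault(interp, []).append(scores[image_id])
    preferred = max(by_interp, key=lambda k: mean(by_interp[k]))
    # Stable re-rank: preferred interpretation first, original order otherwise.
    reranked = sorted(results, key=lambda r: r[1] != preferred)
    return preferred, reranked

# Toy example for the ambiguous query "Bill": persons vs. money bills.
results = [("img1", "person"), ("img2", "money"),
           ("img3", "person"), ("img4", "money")]
scores = {"img1": 0.2, "img2": 0.9, "img3": 0.1, "img4": 0.8}
preferred, reranked = disambiguate(results, scores)
print(preferred)                    # "money"
print([r[0] for r in reranked])    # ["img2", "img4", "img1", "img3"]
```

In practice the scores would come from a trained multivariate decoder over EEG epochs time-locked to fixations (free viewing), but any per-image relevance estimate plugs into the same re-ranking step.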
Keywords: Human-computer interaction · Brain-computer interface · EEG · Eye-tracking · Information retrieval · Free viewing · ERP · Multivariate decoding