
An implicit relevance feedback method for CBIR with real-time eye tracking

Published in: Multimedia Tools and Applications

Abstract

Relevance feedback is an effective approach to improving the performance of content-based image retrieval (CBIR) systems, and implicit relevance feedback approaches, which gather users’ feedback through biometric devices (e.g. eye trackers), have been extensively investigated in recent years. This paper proposes a novel image retrieval system with implicit relevance feedback, named the eye tracking based relevance feedback system (ETRFs). ETRFs is composed of three main modules: an image retrieval subsystem based on a bag-of-words architecture; a user relevance assessment module that implicitly identifies relevant images with the help of a modern eye tracker; and a relevance feedback module that applies a weighted query expansion method to fuse users’ relevance feedback. ETRFs operates online and in real time, which clearly distinguishes it from other, offline systems. Ten subjects participated in our experiments on the Oxford Buildings and UKBench datasets. The experimental results demonstrate that ETRFs achieves a notable improvement in image retrieval performance.
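The feedback loop the abstract describes — fusing the original bag-of-words query with gaze-selected relevant images via weighted query expansion, then re-ranking — can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function names, the 50/50 blend between query and feedback, and the use of gaze-derived weights as a simple normalised average are all assumptions for exposition.

```python
import numpy as np

def expand_query(query_hist, relevant_hists, gaze_weights):
    """Weighted query expansion (illustrative sketch).

    query_hist     -- L2-normalised bag-of-words histogram of the query image
    relevant_hists -- histograms of images the eye tracker marked as relevant
    gaze_weights   -- per-image relevance weights derived from gaze data
                      (hypothetical; e.g. fixation duration)
    """
    w = np.asarray(gaze_weights, dtype=float)
    w = w / w.sum()                                  # normalise relevance weights
    fused = sum(wi * h for wi, h in zip(w, relevant_hists))
    expanded = 0.5 * query_hist + 0.5 * fused        # assumed equal blend
    return expanded / np.linalg.norm(expanded)       # L2-normalise for cosine scoring

def rank(database, expanded_query):
    # cosine similarity against rows of L2-normalised database histograms,
    # best match first
    scores = database @ expanded_query
    return np.argsort(-scores)
```

With tf-idf–weighted, L2-normalised histograms (as is standard in bag-of-words retrieval), the dot product in `rank` is exactly cosine similarity, so the expanded query simply pulls the ranking toward the visual words of the gazed-at images.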




Acknowledgments

This work is supported by the National Natural Science Foundation of China (61105119), the Fundamental Research Funds for the Central Universities (2014JBZ003), the Beijing Natural Science Foundation (No. 4142043), and the Beijing Higher Education Young Elite Teacher Project.

Author information

Corresponding author

Correspondence to Qingyong Li.

About this article

Cite this article

Li, Q., Tian, M., Liu, J. et al. An implicit relevance feedback method for CBIR with real-time eye tracking. Multimed Tools Appl 75, 2595–2611 (2016). https://doi.org/10.1007/s11042-015-2873-1

