Gazing the Text for Fun and Profit



Reading digital books is becoming increasingly common, and modern interface technologies offer a wide range of ways for readers to interact with them. However, few of these technologies have been studied with respect to their impact on the reading experience. Focusing on eye tracking devices, we investigate how novel text interaction concepts can be created. We present an analysis of the eyeBook, which produces real-time effects synchronized with the reading process. We then turn to multimodal interaction and the use of EEG devices for recording evoked emotions.
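The gaze-triggered effects described above can be sketched, in greatly simplified form, as a mapping from fixation coordinates to text regions with attached one-shot effects. This is an illustrative sketch only; the class and function names below are hypothetical and do not reflect the eyeBook's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class TriggerRegion:
    """A rectangular screen region tied to a passage of text.

    When the reader's gaze lands inside it, an associated effect
    (sound, animation, ...) should fire exactly once.
    """
    name: str
    x: float
    y: float
    width: float
    height: float
    fired: bool = False

    def contains(self, gx: float, gy: float) -> bool:
        return (self.x <= gx <= self.x + self.width
                and self.y <= gy <= self.y + self.height)


def process_fixation(regions, gx, gy):
    """Return the name of the first unfired region hit by the fixation.

    Marks that region as fired so the effect plays only once;
    returns None if the fixation hits nothing new.
    """
    for region in regions:
        if not region.fired and region.contains(gx, gy):
            region.fired = True
            return region.name
    return None


# Example: a sound effect bound to a passage describing a thunderstorm.
regions = [TriggerRegion("thunder_sfx", x=100, y=400, width=300, height=20)]
assert process_fixation(regions, 150, 410) == "thunder_sfx"  # gaze reaches passage
assert process_fixation(regions, 150, 410) is None           # fires only once
```

A real system would additionally smooth raw gaze samples into fixations and distinguish reading from skimming before triggering anything, but the region-lookup core stays this simple.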


Keywords: Speech Recognition, Reading Experience, Handwriting Recognition, Text Node, Emotion, Interest



The authors would like to thank Farida Ismail, who implemented many parts of the Emotional Text Tagging prototype and supervised the experiment; she also contributed to parts of Sect. 8.6.4. Our gratitude also goes to Mostfa El Hosseiny, who put great effort into programming the eyePad demo; he also devised the book's multimodal story and invested much energy in designing the demo and bringing the modern Tom Riddle to life.



Copyright information

© Springer-Verlag London 2013

Authors and Affiliations

  1. German Research Center for Artificial Intelligence, Kaiserslautern, Germany
  2. Microsoft Bing, Redmond, USA
