SliceType: fast gaze typing with a merging keyboard

  • Burak Benligiray
  • Cihan Topal
  • Cuneyt Akinlar
Original Paper


Jitter is an inevitable by-product of gaze detection, and because of it, gaze typing tends to be a slow and frustrating process. In this paper, we propose SliceType, a soft keyboard optimized for gaze input. Our main design objective is to use the screen area more efficiently by allocating a larger area to the target keys. We achieve this by determining the keys that will not be used for the next input and allocating their space to the adjacent keys with a merging animation. Larger keys are faster to navigate towards and easier to dwell on in the presence of eye tracking jitter. As a result, the user types faster and more comfortably. In addition, we employ a word completion scheme that complements gaze typing mechanics: a character and a related prediction are displayed at each key. Dwelling on a key enters the character, and double-dwelling enters the prediction. While dwelling on a key to enter a character, the user reads the related prediction effortlessly. The improvements provided by these features are quantified using Fitts’ law. The performance of the proposed keyboard is compared with two other soft keyboards designed for gaze typing, Dasher and GazeTalk. Thirty-seven novice users gaze-typed a piece of text using all three keyboards. The results of the experiment show that the proposed keyboard allows faster typing and is preferred by the users.
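The Fitts’ law analysis mentioned above can be illustrated with a minimal sketch. It uses the Shannon formulation of the index of difficulty; the distances and key widths below are hypothetical values chosen for illustration, not figures taken from the paper:

```python
import math

def fitts_id(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits.

    distance: centre-to-centre distance from the gaze position to the target key
    width: width of the target key along the axis of movement
    """
    return math.log2(distance / width + 1)

# Doubling a key's width lowers its index of difficulty, which predicts
# faster target acquisition -- the rationale behind merging unused keys
# into their neighbours to enlarge the likely targets.
id_small = fitts_id(distance=200, width=40)  # ordinary key
id_large = fitts_id(distance=200, width=80)  # merged (enlarged) key
print(id_small, id_large)
```

Under Fitts’ law, predicted movement time grows linearly with this index, so the enlarged key should be both quicker to reach and, with its larger dwell area, more robust to eye tracking jitter.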


Soft keyboard · On-screen keyboard · Gaze typing · Continuous input · Text entry · Assistive technologies · Eye tracking


Supplementary material

Supplementary material 1 (MP4, 35,023 KB)



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Department of Electrical and Electronics Engineering, Eskişehir Technical University, Eskisehir, Turkey
  2. Eskisehir, Turkey
