Behavior Research Methods, Volume 40, Issue 4, pp 1150–1162

OGAMA (Open Gaze and Mouse Analyzer): Open-source software designed to analyze eye and mouse movements in slideshow study designs

  • Adrian Voßkühler
  • Volkhard Nordmeier
  • Lars Kuchinke
  • Arthur M. Jacobs

Abstract

In the present article, new software is introduced that allows the recording and analysis of eye- and mouse-tracking data from slideshow-based experiments in parallel. The Open Gaze and Mouse Analyzer (OGAMA) is written in C#.NET and has been released as an open-source project. Its main features include slideshow design, the recording of gaze and mouse data, database-driven preprocessing and filtering of gaze and mouse data, the creation of attention maps, areas-of-interest definition, and replay. Recordings from eye-tracking and/or presentation software and hardware can be imported in ASCII format. Data output is provided in formats that can be used directly with different statistical software packages. Because OGAMA is open source, it can easily be adapted to suit one's needs.
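The ASCII import is easiest to picture with a concrete example. The following C# sketch is not OGAMA's actual importer; it assumes a hypothetical tab-separated gaze log with columns time (ms), x, and y, and shows how such a recording could be read into memory. The type and method names (GazeSample, AsciiGazeImporter.Load) are invented for illustration:

    using System.Collections.Generic;
    using System.Globalization;
    using System.IO;

    // A single gaze (or mouse) sample: time stamp in ms plus screen coordinates.
    // Hypothetical type, not part of OGAMA's API.
    public record GazeSample(long TimeMs, float X, float Y);

    public static class AsciiGazeImporter
    {
        // Parses a tab-separated ASCII log of the form "time<TAB>x<TAB>y".
        // Lines that cannot be parsed (e.g., headers or comments) are skipped.
        public static List<GazeSample> Load(string path)
        {
            var samples = new List<GazeSample>();
            foreach (var line in File.ReadLines(path))
            {
                var fields = line.Split('\t');
                if (fields.Length < 3) continue;
                if (long.TryParse(fields[0], out var t)
                    && float.TryParse(fields[1], NumberStyles.Float, CultureInfo.InvariantCulture, out var x)
                    && float.TryParse(fields[2], NumberStyles.Float, CultureInfo.InvariantCulture, out var y))
                {
                    samples.Add(new GazeSample(t, x, y));
                }
            }
            return samples;
        }
    }

Under these assumptions, a log line such as "1024	512.3	384.7" would yield one sample at 1,024 ms with screen coordinates (512.3, 384.7); the actual column layout accepted by OGAMA depends on the recording hardware and software used.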

Keywords

Fixation Duration · Mouse Movement · Representational Format · Mouse Data · Database Interface

Copyright information

© Psychonomic Society, Inc. 2008

Authors and Affiliations

  • Adrian Voßkühler (1)
  • Volkhard Nordmeier (1)
  • Lars Kuchinke (1)
  • Arthur M. Jacobs (1)

  1. Institute of Physics Education, Freie Universität Berlin, Berlin, Germany