Capturing You Watching You: Characterizing Visual-Motor Dynamics in Touchscreen Interactions
The relationship between where people look and where people reach has been studied since the dawn of experimental psychology. This relationship has implications for the design of interactive visualizations, particularly for applications involving touchscreens. We present a new visual-motor analytics dashboard for the joint study of eye movement and hand/finger movement dynamics. Our modular approach combines real-time playback of gaze and finger-dragging behavior with statistical models quantifying the dynamics of both modalities. To aid in visualization and inference with these data, we apply Gaussian process regression models that capture the similarities and differences between eye and finger movements while providing a statistical model of the observed functional data. Smooth estimates of the dynamics are included in the dashboard to enable visual-analytic exploration of visual-motor behaviors on touchscreen interfaces.
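The Gaussian process regression described above can be sketched as follows. This is a minimal illustration of GP smoothing applied to a one-dimensional trajectory over time; the squared-exponential (RBF) kernel, its hyperparameters, and the synthetic finger-position data are assumptions for illustration, not the authors' actual pipeline or parameter choices.

```python
import numpy as np

def rbf_kernel(t1, t2, length_scale=0.1, variance=1.0):
    # Squared-exponential (RBF) covariance between two sets of time points
    d = t1[:, None] - t2[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior_mean(t_train, y_train, t_test, noise_sd=0.05):
    # GP regression posterior mean: K(test,train) @ (K + sigma^2 I)^-1 @ y
    K = rbf_kernel(t_train, t_train) + noise_sd**2 * np.eye(len(t_train))
    K_s = rbf_kernel(t_test, t_train)
    return K_s @ np.linalg.solve(K, y_train)

# Hypothetical noisy finger x-position samples along a dragging trajectory
t = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * t) + 0.05 * np.random.randn(50)

# Smooth estimate on a dense time grid, as used for dashboard playback
t_grid = np.linspace(0.0, 1.0, 200)
smooth = gp_posterior_mean(t, y, t_grid)
```

The same posterior-mean computation applies unchanged to gaze coordinates, so eye and finger trajectories can be smoothed onto a common time grid before comparing the two modalities.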
Keywords: Radial Basis Function · Gaussian Process · Finger Movement · Gaussian Process Regression · Finger Position
The authors thank three anonymous reviewers for their feedback on this chapter. The views expressed in this chapter are those of the authors and do not reflect the official policy or position of the Department of Defense or the U.S. Government. This work was supported in part by AFOSR LRIR to L.M.B. and AFOSR grant FA9550-13-1-0087 to J.W.H. Distribution A: Approved for public release; distribution unlimited. 88ABW Cleared 08/26/2015; 88ABW-2015-4098.