Predicting Stimulus-Driven Attentional Selection Within Mobile Interfaces
Masciocchi and Still [1] suggested that biologically inspired computational saliency models could predict attentional deployment within webpages. Their stimuli, however, were presented on a large desktop monitor. We explored whether a saliency model's predictive performance extends to small mobile interface displays. Participants free-viewed screenshots of NASA's mobile application Playbook, and the Itti et al. saliency model was employed to produce the predictive stimulus-driven maps. Saliency map values sampled at each participant's first six fixations formed the observed distribution. Using a within-subjects bootstrapping technique, this was compared with a shuffled distribution, which offers a conservative chance baseline because it retains predictable spatial biases. The observed distribution values were higher than the shuffled distribution values, suggesting that a saliency model can predict the deployment of attention within small mobile application interfaces.
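The observed-versus-shuffled comparison described above can be sketched as follows. This is an illustrative Python sketch, not the authors' analysis code; the map sizes, image names, and fixation data are hypothetical placeholders.

```python
# Illustrative sketch (not the authors' code): compare saliency values at
# observed fixations against a shuffled baseline that preserves spatial biases.
import numpy as np

rng = np.random.default_rng(0)

def saliency_at(fixations, saliency_map):
    """Sample the saliency map at a list of (row, col) fixation coordinates."""
    rows, cols = zip(*fixations)
    return saliency_map[list(rows), list(cols)]

# Hypothetical data: one saliency map per screenshot, and the first six
# fixations recorded for each screenshot.
maps = {img: rng.random((48, 64)) for img in ("screen_a", "screen_b")}
fixations = {img: [tuple(rng.integers((48, 64))) for _ in range(6)]
             for img in maps}

# Observed distribution: each screenshot's fixations scored on its own map.
observed = np.concatenate(
    [saliency_at(fixations[img], maps[img]) for img in maps])

# Shuffled baseline: fixations from one screenshot scored on the *other*
# screenshot's map, so predictable spatial biases (e.g., a tendency to fixate
# near the center) remain in the chance distribution.
shuffled = np.concatenate(
    [saliency_at(fixations["screen_a"], maps["screen_b"]),
     saliency_at(fixations["screen_b"], maps["screen_a"])])

print(observed.mean(), shuffled.mean())
```

In the actual study the comparison was bootstrapped within subjects; a saliency model is supported to the extent that the observed values reliably exceed the shuffled ones.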
Keywords: Human-computer interaction · Mobile interface · Cognitive engineering · Saliency model · Visual search
The authors thank Steven Hillenius for providing access to the Playbook software through a demo site. In addition, we thank the Virginia Space Grant Consortium for financially supporting this research.
- 1. Masciocchi, C.M., Still, J.D.: Alternative to eye tracking for predicting stimulus-driven attentional selection within interfaces. J. Hum.-Comput. Interact. 34, 285–301 (2013)
- 4. Wolfe, J.M.: Guided Search 4.0: current progress with a model of visual search. In: Gray, W. (ed.) Integrated Models of Cognitive Systems, pp. 99–119. Oxford University Press, New York (2007)
- 7. Still, J.D., Masciocchi, C.M.: A saliency model predicts fixations in web interfaces. In: 5th International Workshop on Model Driven Development of Advanced User Interfaces, Computer-Human Interaction Conference, pp. 25–29 (2010)
- 11. Lim, J.J., Feria, C.: Visual search on a mobile device while walking. In: Mobile HCI Conference, pp. 295–304, San Francisco, CA (2012)
- 13. Marquez, J.J., Pyrzak, G., Hashemi, S., Ahmed, S., McMillin, K., Medwid, J., Chen, D., Esten, H.: Supporting real-time operations and execution through timeline and scheduling aids. In: 43rd International Conference on Environmental Systems, pp. 1–11. ICES (2013)
- 14. Harel, J., Koch, C., Perona, P.: Graph-based visual saliency. In: NIPS, vol. 1, pp. 5–12 (2006)