
Haptic Saliency Model for Rigid Textured Surfaces

  • Anna Metzger
  • Matteo Toscani
  • Matteo Valsecchi
  • Knut Drewing
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10893)

Abstract

When touching an object, we focus on some of its parts rather than exploring the whole surface uniformly, i.e. some parts are more salient than others. Here we investigated how different physical properties of rigid, plastic relief textures determine haptic exploratory behavior. We produced haptic stimuli whose textures were locally defined by random distributions of four independent features: amplitude, spatial frequency, orientation, and isotropy. Participants explored two stimuli one after the other and, to promote exploration, were asked to judge their similarity. We used a linear regression model to relate the features and their gradients to the exploratory behavior (the spatial distribution of touch duration). The model predicts human behavior significantly better than chance, suggesting that exploratory movements are to some extent driven by the low-level features we investigated. Remarkably, the contribution of each predictor changed as a function of the spatial scale at which it was defined, showing that haptic exploration preferences are spatially tuned, i.e. specific features are most salient at different spatial scales.
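As an illustration of the modeling approach summarized above, the sketch below sets up a linear regression from local texture features and their spatial gradients to touch duration per surface patch. This is not the authors' code: the feature values, array sizes, and the use of NumPy's least-squares solver are assumptions made purely for illustration.

    # Minimal sketch (assumed setup, not the authors' implementation):
    # relate local texture features and their gradients to touch duration.
    import numpy as np
    from numpy.linalg import lstsq

    rng = np.random.default_rng(0)

    n_patches = 500                      # surface patches per stimulus (assumed)
    feature_names = ["amplitude", "frequency", "orientation", "isotropy"]

    # Hypothetical local feature maps: one value per patch and feature
    features = rng.random((n_patches, len(feature_names)))

    # Spatial gradients of each feature (e.g. from a Sobel-like operator), flattened
    gradients = rng.random((n_patches, len(feature_names)))

    # Observed touch duration per patch (seconds), the quantity the model predicts
    touch_duration = rng.random(n_patches)

    # Design matrix: features, their gradients, and an intercept column
    X = np.column_stack([features, gradients, np.ones(n_patches)])
    weights, *_ = lstsq(X, touch_duration, rcond=None)

    # Compare predicted and observed touch-duration maps
    predicted = X @ weights
    r = np.corrcoef(predicted, touch_duration)[0, 1]
    print(f"correlation between predicted and observed touch duration: {r:.2f}")

With real data, the fitted weights would indicate how strongly each feature (and its gradient) attracts exploration, and fits at different spatial scales could be compared to probe the spatial tuning reported in the abstract.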

Keywords

Haptic salience · Attention · Haptic exploration


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Anna Metzger (1), corresponding author
  • Matteo Toscani (1)
  • Matteo Valsecchi (1)
  • Knut Drewing (1)

  1. Justus-Liebig University Giessen, Giessen, Germany
