Improved viewpoint entropy to evaluate material appearance under various lighting positions

  • Shoji Yamamoto
  • Yuto Hirasawa
  • Ryota Domon
  • Hiroshi Kintou
  • Norimichi Tsumura
Regular Paper


In this paper, we propose a calculation method for the perception-based selection of a lighting position that emphasizes the material appearance of a CG object. The proposed method is based on conventional viewpoint entropy, which is used to find an appropriate viewing angle. To find an appropriate shot for material appearance, we first identified, using eye-tracking equipment, the important surfaces that carry the glossy-reflection information related to material appearance. Next, we modified the viewpoint-entropy equation by adding a weight coefficient that emphasizes these important surfaces. Since this viewpoint entropy remains independent of object shape, light direction, and viewing direction, our method can extract only the important lighting positions that are most representative of the material appearance. Verification with CG objects of varying shape and material confirmed that our selection of lighting position agrees well with subjective evaluations in which observers chose the scene that best emphasizes material appearance.
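The core idea of the abstract can be sketched as follows. Conventional viewpoint entropy treats the relative projected area of each visible face as a probability and takes the Shannon entropy over faces; the modification described above multiplies each face's contribution by a weight coefficient (larger on surfaces identified as important for gloss). The function name, argument layout, and the specific multiplicative weighting below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def weighted_viewpoint_entropy(projected_areas, weights):
    """Entropy over visible faces, with per-face emphasis weights.

    projected_areas: projected area of each visible face for a given
        lighting/viewing configuration.
    weights: emphasis coefficients (e.g. larger on surfaces that eye
        tracking marks as important for gloss perception).
    """
    a = np.asarray(projected_areas, dtype=float) * np.asarray(weights, dtype=float)
    p = a / a.sum()          # weighted relative visibility of each face
    p = p[p > 0]             # convention: 0 * log 0 = 0
    return float(-(p * np.log2(p)).sum())

# With uniform weights this reduces to conventional viewpoint entropy;
# four equally visible faces give the maximum, log2(4) = 2 bits.
print(weighted_viewpoint_entropy([1, 1, 1, 1], [1, 1, 1, 1]))  # → 2.0
```

A lighting-position search would then evaluate this score over candidate light directions and keep the maximizer; emphasizing one face (e.g. weights `[4, 1, 1, 1]`) skews the distribution and lowers the entropy, which is what lets the weighted score discriminate configurations that reveal the important glossy surfaces.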


Visual entropy · Gloss · Material appearance · Computer graphics



This research was partially supported by the Ministry of Education, Science, Sports and Culture, Japan, through Grants-in-Aid for Scientific Research on Brain and Information Science on SHITSUKAN, 23135530 (2012) and 25135707 (2013), and a Grant-in-Aid for Scientific Research (C), 15K00415 (2015).



Copyright information

© The Optical Society of Japan 2019

Authors and Affiliations

  1. Tokyo Metropolitan College of Industrial Technology, Tokyo, Japan
  2. Graduate School of Advanced Integration Science, Chiba University, Chiba, Japan
  3. Nikon Corporation, Tokyo, Japan
