
Comparing Activation Typicality and Sparsity in a Deep CNN to Predict Facial Beauty

Research · Published in Computational Brain & Behavior

Abstract

Processing fluency, which describes the subjective sensation of ease with which information is processed by the sensory systems and the brain, has become one of the most popular explanations of aesthetic appreciation and beauty. Two metrics have recently been proposed to model fluency: the sparsity of neuronal activation, which describes the concentration of activity in a subset of neurons, and the statistical typicality of activations, which describes how well the encoding of a stimulus matches a reference representation of stimuli of the category to which it belongs. Using convolutional neural networks (CNNs) as a model for the human visual system, this study compares the ability of these metrics to explain variation in facial attractiveness. Our findings show that the sparsity of neuronal activations is a more robust predictor of facial attractiveness than statistical typicality. Refining the reference representation to a single ethnicity or gender does not increase the explanatory power of statistical typicality. However, statistical typicality and sparsity predict facial beauty based on different layers of the CNNs, suggesting that they describe different neural mechanisms underlying fluency.
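
To make the two metrics concrete, the sketch below computes layer-wise activation sparsity and statistical typicality for a single face image from the activations of a pretrained CNN. It is a minimal illustration under stated assumptions, not the authors' pipeline: the ImageNet-trained VGG16 from torchvision, the Gini index as the sparsity measure, and a diagonal-Gaussian log-likelihood as the typicality measure are all choices made only for this example.

import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Assumption: a generic ImageNet-trained VGG16 stands in for the face-trained CNN.
cnn = models.vgg16(weights=models.VGG16_Weights.DEFAULT).eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def layer_activations(img: Image.Image, layer_idx: int) -> np.ndarray:
    """Flattened activations of one convolutional-stack layer for one image."""
    x = preprocess(img).unsqueeze(0)
    with torch.no_grad():
        for i, module in enumerate(cnn.features):
            x = module(x)
            if i == layer_idx:
                return x.flatten().numpy()
    raise ValueError("layer_idx out of range")

def gini_sparsity(a: np.ndarray) -> float:
    """Gini index of absolute activations: near 0 = dense (all units equally
    active), near 1 = sparse (activity concentrated in few units)."""
    a = np.sort(np.abs(a))            # ascending
    n = a.size
    cum = np.cumsum(a)                # cum[-1] is the total activation
    return (2.0 * np.sum(cum) / cum[-1] - (n + 1)) * -1.0 / n + 0.0 if False else \
           (2.0 * np.sum(np.arange(1, n + 1) * a) / (n * cum[-1])) - (n + 1) / n

def typicality(a: np.ndarray, reference: np.ndarray) -> float:
    """Log-likelihood of one image's activations under a diagonal Gaussian
    fit to a reference set (rows = reference faces, same layer)."""
    mu = reference.mean(axis=0)
    sigma = reference.std(axis=0) + 1e-8   # avoid division by zero
    return float(np.sum(-0.5 * ((a - mu) / sigma) ** 2
                        - np.log(sigma) - 0.5 * np.log(2 * np.pi)))

In this framing, sparsity needs only the image itself, whereas typicality additionally requires a reference matrix of activations from faces of the chosen category (e.g., one row per reference face at the same layer); either score can then be regressed, layer by layer, against mean attractiveness ratings.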


Data Availability

Data from the Chicago Face Dataset: https://www.chicagofaces.org/

Data from the FairFace dataset: https://github.com/joojs/fairface


Funding

This study was funded by the Agence Nationale de la Recherche (ANR-20-CE02-0005-01), the National Science Foundation (NSF IOS 2026334), and the Mission for Interdisciplinarity of the French National Center for Scientific Research (Programme Interne Blanc CNRS MITI 2023.1, DEEPCOM project).

Author information

Contributions

Sonia Tieo: Writing – Original draft, Methodology, Supervision, Formal Analysis, Software. Tamra C. Mendelson: Conceptualization, Supervision, Funding Acquisition, Writing – Review & Editing. Julien P. Renoult: Conceptualization, Supervision, Funding Acquisition, Writing – Review & Editing. William Puech: Conceptualization, Supervision, Funding Acquisition, Writing – Review & Editing. Melvin Bardin: Methodology, Formal Analysis, Software. Roland Bertin-Johannet: Methodology, Formal Analysis, Software. Nicolas Dibot: Methodology, Supervision.

Corresponding authors

Correspondence to Sonia Tieo or Julien P. Renoult.

Ethics declarations

Ethics Approval

This research only includes simulation studies, and therefore, ethical approval was not required.

Consent to Participate

This does not apply because this research does not involve any participants.

Consent for Publication

This does not apply because this research does not involve any participants.

Competing Interests

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (DOCX 118 KB)

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Tieo, S., Bardin, M., Bertin-Johannet, R. et al. Comparing Activation Typicality and Sparsity in a Deep CNN to Predict Facial Beauty. Comput Brain Behav 8, 249–261 (2025). https://doi.org/10.1007/s42113-024-00231-7


  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s42113-024-00231-7

Keywords