An Investigation of Social-Behavioral Phenomena in the Peer-Review Processes of Scientific Foundations

  • George Kleiner
  • Maxim Rybachuk
  • Dmitry Ushakov
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1079)


Many issues in the realm of scientific endeavor are decided by members of expert communities in various fields. Decisions that sanction the funding of project proposals are based on a voting process. Such decision-making is particularly common in the evaluation of applications to publicly funded initiatives, in the awarding of higher academic degrees and titles, in competitions to fill personnel vacancies, and in other similar areas.

In such situations, experts (electors) individually decide in favor of a particular applicant based on specific objective criteria, as well as on subjective considerations: the repercussions of the decision in their professional field and its impact on their own reputation. The outcome of such choices may also depend on the psychological qualities and the current mood of the expert. Moreover, the selection of experts and their assignment to particular evaluation projects is often random. As a result, the collective adjudication of such projects is shaped by the interweaving of several objective and subjective factors.

In this paper, the authors examine the competitive selection process for scientific projects applying for funding from scientific foundations. A simulation model of peer review is used, designed to capture a number of the experts' economic and psychological characteristics as well as their group affiliation in the form of scientific schools.

The authors qualitatively analyze how changes in experts' reputations within the scientific community affect their decisions. The results thus reveal the dynamics of the structure of the scientific and expert community. The model is agent-oriented and provides a convenient tool for simulating competitive selection among project funding applications.
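The paper itself presents no code, but the kind of agent-oriented selection process it describes can be illustrated with a minimal sketch. All names, parameters, and weights below are hypothetical assumptions, not the authors' actual model: experts carry a reputation weight and a scientific-school affiliation, each vote combines a proposal's objective quality with an in-group bias and random "mood" noise, and the top-ranked proposals are funded.

```python
import random

random.seed(42)  # reproducible run of this illustrative simulation


class Expert:
    """A reviewer with a school affiliation and a reputation weight (hypothetical)."""

    def __init__(self, school):
        self.school = school
        self.reputation = 1.0

    def score(self, proposal):
        # Objective quality, plus an assumed in-group bonus for proposals
        # from the expert's own scientific school, plus mood noise.
        bias = 0.5 if proposal["school"] == self.school else 0.0
        return proposal["quality"] + bias + random.gauss(0, 0.3)


def run_competition(experts, proposals, n_funded):
    """Rank proposals by their reputation-weighted mean vote; fund the top n."""

    def collective(p):
        total = sum(e.reputation * e.score(p) for e in experts)
        return total / sum(e.reputation for e in experts)

    return sorted(proposals, key=collective, reverse=True)[:n_funded]


experts = [Expert(school=random.choice("AB")) for _ in range(9)]
proposals = [{"id": i, "school": random.choice("AB"),
              "quality": random.uniform(0, 5)} for i in range(20)]
funded = run_competition(experts, proposals, n_funded=5)
print([p["id"] for p in funded])
```

Extending such a sketch toward the paper's setting would mean letting the reputation weights evolve between competition rounds, e.g. increasing the reputation of experts whose votes matched the collective outcome.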


Keywords: Public choice · Alternative choice · Science experts · Psychological characteristics · Agent-oriented modeling · Multi-stage choice · Reputation · Scientific school



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Central Economics and Mathematics Institute of the Russian Academy of Sciences, Moscow, Russia
  2. Financial University Under the Government of the Russian Federation, Moscow, Russia
  3. Institute of Psychology of the Russian Academy of Sciences, Moscow, Russia
