Crowdsourcing Peer Review: As We May Do

  • Michael Soprano
  • Stefano Mizzaro
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 988)


This paper describes Readersourcing 2.0, an ecosystem that implements the Readersourcing approach proposed by Mizzaro [10]. Readersourcing is proposed as an alternative to standard peer review that aims to exploit the otherwise lost opinions of readers. Readersourcing 2.0 implements two different models based on the so-called codetermination algorithms. We describe the requirements, present the overall architecture, and show how the end user can interact with the system. In future work, Readersourcing 2.0 will also be used to study other topics, such as shepherding users to achieve higher-quality reviews and the differences between a review activity carried out with a single-blind or a double-blind approach.
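The abstract mentions codetermination algorithms without detailing them. As an illustrative sketch only (not the authors' actual models), the core idea in the Readersourcing literature is that paper quality and reader reputation determine each other: a paper's score is a reputation-weighted average of its readers' ratings, and a reader's reputation reflects how closely their ratings track the current consensus. Every function name and update rule below is a hypothetical assumption chosen for illustration:

```python
# Hypothetical sketch of a codetermination-style fixed-point loop.
# The update rules below are illustrative assumptions, not the
# algorithms actually implemented in Readersourcing 2.0.

def codetermine(ratings, iterations=50):
    """ratings: dict mapping (reader, paper) -> score in [0, 1]."""
    readers = {r for r, _ in ratings}
    papers = {p for _, p in ratings}
    reputation = {r: 1.0 for r in readers}   # start with equal trust
    quality = {p: 0.5 for p in papers}       # neutral prior score
    for _ in range(iterations):
        # Paper quality: reputation-weighted mean of its ratings.
        for p in papers:
            num = sum(reputation[r] * s for (r, q), s in ratings.items() if q == p)
            den = sum(reputation[r] for (r, q), _ in ratings.items() if q == p)
            quality[p] = num / den if den else 0.5
        # Reader reputation: closeness of their ratings to current scores.
        for r in readers:
            errs = [abs(s - quality[q]) for (rr, q), s in ratings.items() if rr == r]
            reputation[r] = 1.0 - sum(errs) / len(errs) if errs else 1.0
    return quality, reputation
```

Under this sketch, a reader whose ratings disagree with the emerging consensus loses weight in later iterations, so the two quantities are literally co-determined.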


Keywords: Scholarly publishing · Peer review · Crowdsourcing


References

  1. CC Attribution 4.0 International Public License (2010)
  2. OpenReview (2016)
  3. Akst, J.: I hate your paper: many say the peer review system is broken. Here's how some journals are trying to fix it. The Scientist 24, 36 (2010)
  4. Arms, W.Y.: What are the alternatives to peer review? Quality control in scholarly publishing on the web. JEP 8(1) (2002)
  5. De Alfaro, L., Faella, M.: TrueReview: a platform for post-publication peer review. CoRR (2016)
  6. Dow, S., Kulkarni, A., Klemmer, S., Hartmann, B.: Shepherding the crowd yields better work. In: Proceedings of ACM 2012 CSCW, pp. 1013–1022. ACM (2012)
  7. Medo, M., Rushton Wakeling, J.: The effect of discrete vs. continuous-valued ratings on reputation and ranking systems. EPL 91(4), 48004 (2010)
  8. Meyer, B.: Fixing the process of computer science refereeing, October 2010
  9. Mizzaro, S.: Quality control in scholarly publishing: a new proposal. JASIST 54(11), 989–1005 (2003)
  10. Mizzaro, S.: Readersourcing - a manifesto. JASIST 63(8), 1666–1672 (2012)
  11. Soprano, M., Mizzaro, S.: Readersourcing 2.0: RS_PDF, October 2018
  12. Soprano, M., Mizzaro, S.: Readersourcing 2.0: RS_Rate, October 2018
  13. Soprano, M., Mizzaro, S.: Readersourcing 2.0: RS_Server, October 2018
  14. Soprano, M., Mizzaro, S.: Readersourcing 2.0: technical documentation, October 2018
  15. Sun, Y., et al.: Crowdsourcing information extraction for biomedical systematic reviews. CoRR abs/1609.01017 (2016)
  16. The Economist: Some science journals that claim to peer review papers do not do so (2018)
  17. Tomkins, A., Zhang, M., Heavlin, W.D.: Reviewer bias in single- versus double-blind peer review. PNAS 114(48), 12708–12713 (2017)

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Department of Mathematics, Computer Science, and Physics, University of Udine, Udine, Italy
