Crowdsourcing Peer Review: As We May Do

  • Conference paper

Part of the book series: Communications in Computer and Information Science (CCIS, volume 988)

Abstract

This paper describes Readersourcing 2.0, an ecosystem that implements the Readersourcing approach proposed by Mizzaro [10]. Readersourcing is an alternative to standard peer review that aims to exploit the otherwise lost opinions of readers. Readersourcing 2.0 implements two different models based on so-called codetermination algorithms. We describe the requirements, present the overall architecture, and show how the end user interacts with the system. In the future, Readersourcing 2.0 will also be used to study other topics, such as shepherding users towards higher-quality reviews and the differences between review activities carried out with a single-blind or a double-blind approach.
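The two codetermination models are not spelled out on this page, but the underlying idea from Mizzaro [9, 10] is that paper quality and reader reputation determine each other: a paper's score is a reputation-weighted aggregate of its readers' ratings, and a reader's reputation rises or falls with how well their ratings agree with the emerging consensus. The sketch below is a minimal illustrative fixed-point scheme under that assumption only; the function name, the agreement measure, and all parameters are invented here and do not correspond to the models actually implemented in Readersourcing 2.0.

    # Illustrative sketch of a codetermination-style aggregator (an
    # assumption for exposition, not the actual Readersourcing 2.0 models):
    # paper quality and reader reputation are computed jointly.
    from collections import defaultdict

    def codetermine(ratings, iterations=50):
        """ratings: list of (reader, paper, score) triples, scores in [0, 1]."""
        readers = {r for r, _, _ in ratings}
        papers = {p for _, p, _ in ratings}
        reputation = {r: 1.0 for r in readers}  # every reader starts as a peer

        for _ in range(iterations):
            # Paper quality: reputation-weighted mean of the ratings it received.
            num, den = defaultdict(float), defaultdict(float)
            for reader, paper, score in ratings:
                num[paper] += reputation[reader] * score
                den[paper] += reputation[reader]
            quality = {p: num[p] / den[p] for p in papers}

            # Reader reputation: mean agreement with the current quality
            # estimates, floored so that weights never vanish entirely.
            err, cnt = defaultdict(float), defaultdict(int)
            for reader, paper, score in ratings:
                err[reader] += abs(score - quality[paper])
                cnt[reader] += 1
            reputation = {r: max(0.001, 1.0 - err[r] / cnt[r]) for r in readers}

        return quality, reputation

    # Example: carol dissents on p1, so her rating of p2 ends up discounted.
    ratings = [("alice", "p1", 0.9), ("bob", "p1", 0.8), ("carol", "p1", 0.2),
               ("alice", "p2", 0.4), ("carol", "p2", 0.9)]
    quality, reputation = codetermine(ratings)

Note that design details such as whether ratings are collected on a discrete or a continuous scale are known to affect how such reputation schemes behave [7].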


Notes

  1. https://www.google.com/chrome/.

  2. https://rubyonrails.org/.

  3. https://pdfbox.apache.org/.

References

  1. CC Attribution 4.0 International Public License (2010). https://creativecommons.org/licenses/by/4.0/legalcode

  2. OpenReview (2016). https://openreview.net/

  3. Akst, J.: I Hate Your Paper: many say the peer review system is broken. Here's how some journals are trying to fix it. The Scientist 24, 36 (2010). http://www.the-scientist.com/2010/8/1/36/1/

  4. Arms, W.Y.: What are the alternatives to peer review? Quality control in scholarly publishing on the web. JEP 8(1) (2002). https://doi.org/10.3998/3336451.0008.103

  5. De Alfaro, L., Faella, M.: TrueReview: a platform for post-publication peer review. CoRR abs/1608.07878 (2016). http://arxiv.org/abs/1608.07878

  6. Dow, S., Kulkarni, A., Klemmer, S., Hartmann, B.: Shepherding the crowd yields better work. In: Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work (CSCW 2012), pp. 1013–1022. ACM (2012)

  7. Medo, M., Rushton Wakeling, J.: The effect of discrete vs. continuous-valued ratings on reputation and ranking systems. EPL 91(4), 48004 (2010). http://stacks.iop.org/0295-5075/91/i=4/a=48004

  8. Meyer, B.: Fixing the process of computer science refereeing, October 2010. https://cacm.acm.org/blogs/blog-cacm/100030-fixing-the-process-of-computer-science-refereeing/fulltext

  9. Mizzaro, S.: Quality control in scholarly publishing: a new proposal. JASIST 54(11), 989–1005 (2003). https://doi.org/10.1002/asi.10296

  10. Mizzaro, S.: Readersourcing - a manifesto. JASIST 63(8), 1666–1672 (2012). https://onlinelibrary.wiley.com/doi/abs/10.1002/asi.22668

  11. Soprano, M., Mizzaro, S.: Readersourcing 2.0: RS_PDF, October 2018. https://doi.org/10.5281/zenodo.1442597

  12. Soprano, M., Mizzaro, S.: Readersourcing 2.0: RS_Rate, October 2018. https://doi.org/10.5281/zenodo.1442599

  13. Soprano, M., Mizzaro, S.: Readersourcing 2.0: RS_Server, October 2018. https://doi.org/10.5281/zenodo.1442630

  14. Soprano, M., Mizzaro, S.: Readersourcing 2.0: technical documentation, October 2018. https://doi.org/10.5281/zenodo.1443371

  15. Sun, Y., et al.: Crowdsourcing information extraction for biomedical systematic reviews. CoRR abs/1609.01017 (2016)

  16. The Economist: Some science journals that claim to peer review papers do not do so (2018). https://www.economist.com/science-and-technology/2018/06/23/some-science-journals-that-claim-to-peer-review-papers-do-not-do-so

  17. Tomkins, A., Zhang, M., Heavlin, W.D.: Reviewer bias in single- versus double-blind peer review. PNAS 114(48), 12708–12713 (2017). http://www.pnas.org/content/114/48/12708

Author information

Corresponding author

Correspondence to Michael Soprano.

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Soprano, M., Mizzaro, S. (2019). Crowdsourcing Peer Review: As We May Do. In: Manghi, P., Candela, L., Silvello, G. (eds) Digital Libraries: Supporting Open Science. IRCDL 2019. Communications in Computer and Information Science, vol 988. Springer, Cham. https://doi.org/10.1007/978-3-030-11226-4_21

  • DOI: https://doi.org/10.1007/978-3-030-11226-4_21

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-11225-7

  • Online ISBN: 978-3-030-11226-4

  • eBook Packages: Computer Science, Computer Science (R0)
