The Potential of Collaborative Document Evaluation for Science

  • Jöran Beel
  • Béla Gipp
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5362)

Abstract

Peer review and citation analysis are the two most common approaches for evaluating the quality of scientific publications, yet both are criticized for various reasons. This paper outlines the problems of citation analysis and peer review and introduces Collaborative Document Evaluation as a supplement to, or possibly even a substitute for, these approaches. Collaborative Document Evaluation aims to enable the readers of publications to act as peer reviewers and to share their evaluations in the form of ratings, annotations, links, and classifications via the internet. In addition, Collaborative Document Evaluation might well enhance the search for publications. The paper discusses the implications of Collaborative Document Evaluation for the scientific community and asks how incentives for scientists to participate could be created.
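
The paper itself does not specify a data model, but the abstract implies one: reader evaluations composed of ratings, annotations, links, and classifications, attached to a publication and aggregated into a community quality signal. The following Python sketch illustrates one way such records could be represented and averaged; all class names, fields, the rating scale, and the unweighted mean are illustrative assumptions, not details taken from the paper.

    # Hypothetical sketch of a Collaborative Document Evaluation record store.
    # All names, the 1-5 rating scale, and the aggregation scheme are
    # illustrative assumptions; the paper does not define a data model.
    from dataclasses import dataclass, field
    from statistics import mean

    @dataclass
    class Evaluation:
        reviewer: str                                              # reader acting as peer reviewer
        rating: int                                                # assumed scale: 1 (poor) .. 5 (excellent)
        annotations: list[str] = field(default_factory=list)      # free-text comments on the paper
        links: list[str] = field(default_factory=list)             # related publications (URLs/DOIs)
        classifications: list[str] = field(default_factory=list)   # subject tags assigned by the reader

    @dataclass
    class Publication:
        doi: str
        evaluations: list[Evaluation] = field(default_factory=list)

        def add_evaluation(self, ev: Evaluation) -> None:
            self.evaluations.append(ev)

        def average_rating(self) -> float:
            """Naive unweighted mean; a real system might weight by reviewer reputation."""
            return mean(ev.rating for ev in self.evaluations) if self.evaluations else 0.0

    # Usage: a reader rates and tags a paper; the community score is then aggregated.
    paper = Publication(doi="10.1000/example-doi")  # placeholder DOI
    paper.add_evaluation(Evaluation(reviewer="reader-42", rating=4,
                                    annotations=["Clear problem statement."],
                                    classifications=["peer review", "scientometrics"]))
    print(paper.average_rating())  # -> 4.0

The unweighted mean above deliberately sidesteps the question the paper raises: how to motivate scientists to contribute evaluations at all, and how much weight each contribution should carry.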

Keywords

open peer review · citation analysis · alternative research policy



Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Jöran Beel¹
  • Béla Gipp¹

  1. Department of Computer Science, Otto-von-Guericke University, Magdeburg, Germany
