OpenResearch: Collaborative Management of Scholarly Communication Metadata

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10024)


Scholars often need to search for suitable, high-profile scientific events at which to publish their research results. Information about the topical focus and quality of events is not made sufficiently explicit in the existing communication channels where events are announced. Scholars therefore have to spend a lot of time reading and assessing calls for papers and might still not find the right event; in addition, relevant events may be overlooked because of the large number announced every day. We introduce OpenResearch, a crowdsourcing platform that supports researchers in collecting, organizing, sharing, and disseminating information about scientific events in a structured way. It enables quality-related queries over a multidisciplinary collection of events according to a broad range of criteria, such as acceptance rate, sustainability of an event series, and the reputation of the people and organizations involved. Events can be displayed in different views using map extensions, calendars, and timeline visualizations. We have systematically evaluated the timeliness, usability, and performance of OpenResearch.


Keywords: Scientific events · Collaborative knowledge acquisition · Semantic publishing · Semantic wikis · Linked data



Acknowledgments. This work has been partially funded by the European Commission through a grant for the project OpenAIRE (GA no. 643410) and by the German Federal Ministry of Education and Research (BMBF) through a grant for the project LEDS (GA no. 03WKCG11C). We thank Yakun Li for his technical work on



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. Enterprise Information Systems (EIS), University of Bonn, Bonn, Germany
  2. AKSW Group, Leipzig University, Leipzig, Germany
  3. Fraunhofer Institute for Intelligent Analysis and Information Systems (IAIS), Sankt Augustin, Germany
