Evaluating Academic Answer Quality: A Pilot Study on ResearchGate Q&A

  • Lei Li
  • Daqing He
  • Chengzhi Zhang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9751)


Evaluating the quality of academic content on social media is a critical research topic for the further development of scholarly collaboration on the social web. This pilot study used question/answer pairs from the Library and Information Science (LIS) domain on ResearchGate to examine how scholars assess the quality of academic answers on the social web. The study aims to: (1) examine the aspects scholars use when assessing academic answer quality, and distinguish the objective from the subjective aspects; (2) further verify the existence of subjective aspects in quality judgment by measuring the agreement between different evaluators. By synthesizing evaluation criteria for academic content quality from related work, the authors identified nine aspects of quality evaluation and mapped participants' stated reasons for their answer-quality judgments onto this framework. We found that aspects related to the content of the academic text and to the users' beliefs and preferences are the two most commonly used in judging academic answer quality, indicating that not only the text itself but also the evaluator's beliefs and preferences influence the judgment. Another finding is that the agreement level between different evaluators' judgments is very low compared with agreement levels reported for judgments of non-academic text.


Academic answer quality · Academic social Q&A · Social media · ResearchGate



This work is supported by the Major Projects of the National Social Science Fund (No. 13&ZD174) and the National Social Science Fund Project (No. 14BTQ033).



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Department of Information Management, Nanjing University of Science and Technology, Nanjing, China
  2. School of Information Sciences, University of Pittsburgh, Pittsburgh, USA
