Analysis of the Quality of Academic Papers by the Words in Abstracts

  • Tetsuya Nakatoh
  • Kenta Nagatani
  • Toshiro Minami
  • Sachio Hirokawa
  • Takeshi Nanri
  • Miho Funamori
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10274)

Abstract

The investigation of related research is very important for research activities. However, it is not easy to choose appropriate and important academic papers from among the huge number of candidates. A researcher typically searches by combining keywords and then decides which papers to examine using an index that evaluates their quality. The citation count is commonly used as such an index, but it is uninformative for recently published papers. This research attempted to identify good papers using only the words contained in their abstracts. We constructed a classifier by machine learning and evaluated it using cross-validation. The results show that a certain degree of discrimination is possible.
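The approach the abstract describes — word features extracted from abstracts, a support vector machine classifier, and cross-validation — can be sketched as follows. This is a minimal illustration using scikit-learn with an invented toy dataset; the paper's actual corpus, labeling criterion, feature weighting, and SVM configuration are not specified here, so every concrete choice below is an assumption.

```python
# Hedged sketch: classify "good" vs. other papers from abstract words.
# Toy data and the good/other labels are invented for illustration only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

# Stand-in abstracts; label 1 = "good" (e.g. eventually highly cited), 0 = other.
abstracts = [
    "novel method improves accuracy on benchmark datasets",
    "we propose a new framework with strong empirical results",
    "preliminary notes on an unfinished experiment",
    "minor update to an existing software tool",
    "state of the art results via representation learning",
    "short report of inconclusive findings",
]
labels = [1, 1, 0, 0, 1, 0]

# Use the words of each abstract as bag-of-words features.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(abstracts)

# Linear SVM evaluated by 3-fold cross-validation, as in the paper's setup.
clf = LinearSVC()
scores = cross_val_score(clf, X, labels, cv=3)
print("mean CV accuracy:", scores.mean())
```

With a realistic corpus one would replace the toy lists with real abstracts and a citation-based label, and likely tune the vectorizer (e.g. stop words, tf-idf) and the SVM's regularization.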

Keywords

Bibliometrics · Research investigation · SVM · Citation

References

  1. Garfield, E.: Citation indexes for science: a new dimension in documentation through association of ideas. Science 122(3159), 108–111 (1955)
  2. Garfield, E., Sher, I.H., Torpie, R.J.: The Use of Citation Data in Writing the History of Science. Institute for Scientific Information, Philadelphia (1964)
  3. Garfield, E.: The history and meaning of the journal impact factor. J. Am. Med. Assoc. 295(1), 90–93 (2006)
  4. Hirsch, J.E.: An index to quantify an individual’s scientific research output. Proc. Natl. Acad. Sci. USA 102(46), 16569–16572 (2005)
  5. Kostoff, R.N.: Performance measures for government-sponsored research: overview and background. Scientometrics 36(3), 281–292 (1996)
  6. Marshakova-Shaikevich, I.: The standard impact factor as an evaluation tool of science fields and scientific journals. Scientometrics 35(2), 283–290 (1996)
  7. Martin, B.R.: The use of multiple indicators in the assessment of basic research. Scientometrics 36(3), 343–362 (1996)
  8. Nakatoh, T., Hirokawa, S., Minami, T., Nanri, T., Funamori, M.: Assessing the significance of scholarly articles using their attributes. In: 22nd International Symposium on Artificial Life and Robotics (AROB 2017), pp. 742–746 (2017)
  9. Nakatoh, T., Nakanishi, H., Hirokawa, S.: Journal impact factor revised with focused view. In: 7th KES International Conference on Intelligent Decision Technologies (KES-IDT 2015), pp. 471–481 (2015)
  10. Nakatoh, T., Nakanishi, H., Baba, K., Hirokawa, S.: Focused citation count: a combined measure of relevancy and quality. In: IIAI 4th International Congress on Advanced Applied Informatics (IIAI AAI 2015), pp. 166–170 (2015)
  11. Newman, M.E.J.: The structure of scientific collaboration networks. Proc. Natl. Acad. Sci. USA 98(2), 404–409 (2001)
  12. Wuchty, S., Jones, B.F., Uzzi, B.: The increasing dominance of teams in production of knowledge. Science 316(5827), 1036–1039 (2007)
  13. Ashok, V.G., Feng, S., Choi, Y.: Success with style: using writing style to predict the success of novels. In: 2013 Conference on Empirical Methods in Natural Language Processing, pp. 1753–1764 (2013)
  14. Otani, S., Tomiura, Y.: Extraction of key expressions indicating the important sentence from article abstracts. In: IIAI 3rd International Conference on Advanced Applied Informatics, pp. 216–219 (2014)
  15. Zahedi, Z., Costas, R., Wouters, P.: How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications. Scientometrics 101(2), 1491–1513 (2014)
  16. Schulte, J.: Publications on experimental physical methods to investigate ultra high dilutions – an assessment on quality. Homeopathy 104(4), 311–315 (2015)
  17. Zorin, N.A., Nemtsov, A.V., Kalinin, V.V.: Formalised assessment of publication quality in Russian psychiatry. Scientometrics 52(2), 315–322 (2001)
  18. Dasi, F., Navarro-García, M.M., Jiménez-Heredia, M., Magraner, J., Viña, J.R., Pallardó, F.V., Cervantes, A., Morcillo, E.: Evaluation of the quality of publications on randomized clinical trials using the Consolidated Standards of Reporting Trials (CONSORT) statement guidelines in a Spanish tertiary hospital. J. Clin. Pharmacol. 52(7), 1106–1114 (2012)

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Tetsuya Nakatoh (1)
  • Kenta Nagatani (2)
  • Toshiro Minami (3)
  • Sachio Hirokawa (1)
  • Takeshi Nanri (1)
  • Miho Funamori (4)
  1. Research Institute for Information Technology, Kyushu University, Nishi-ku, Japan
  2. Graduate School and Faculty of Information Science and Electrical Engineering, Kyushu University, Fukuoka, Japan
  3. Kyushu Institute of Information Sciences, Fukuoka, Japan
  4. National Institute of Informatics, Tokyo, Japan
