
Scientometric analysis of scientific publications in CSCW


Abstract

Over the last few decades, CSCW research has undergone significant structural changes and has grown steadily, with clear differences from other fields in terms of theory building, methodology, and socio-technicality. This paper provides a quantitative assessment of the scientific literature, mapping the intellectual structure of CSCW research and its development over a 15-year period (2001–2015). A total of 1713 publications were examined in order to derive statistics and depict dynamic changes, shedding new light on the growth, spread, and collaboration of CSCW-devoted outlets. Overall, our study characterizes top (cited and downloaded) papers, citation patterns, prominent authors and institutions, demographics, collaboration patterns, the most frequent topic clusters and keywords, and social mentions by country, discipline, and professional status. The results highlight some areas for improvement in the field as well as many well-established topics that are changing gradually, with an impact on citations and downloads. Statistical models reveal that the field is predominantly influenced by foundational and highly recognized scientists and papers. A small number of uncited papers, year-on-year growth in the number of papers, and an average of more than 39 citations per paper across all venues point to a healthy and evolving field. We discuss the implications of these findings in terms of the influence of CSCW on the larger field of HCI.


Figs. 1–5 (Fig. 1 adapted from Cruz et al. 2012)


References

  • Abt, H. A. (2017). Citations and team sizes. Publications of the Astronomical Society of the Pacific, 129(972), 024008.

  • Ackerman, M. S. (2000). The intellectual challenge of CSCW: The gap between social requirements and technical feasibility. Human–Computer Interaction, 15(2), 179–203.

  • Ackerman, M. S., Dachtera, J., Pipek, V., & Wulf, V. (2013). Sharing knowledge and expertise: The CSCW view of knowledge management. Computer Supported Cooperative Work, 22(4–6), 531–573.

  • Aduku, K. J., Thelwall, M., & Kousha, K. (2017). Do Mendeley reader counts reflect the scholarly impact of conference papers? An investigation of computer science and engineering. Scientometrics, 112(1), 573–581.

  • Aguillo, I. F. (2011). Is Google Scholar useful for bibliometrics? A webometric analysis. Scientometrics, 91(2), 343–351.

  • Antunes, P., & Pino, J. A. (2010). A review of CRIWG research. In Proceedings of the international conference on collaboration and technology (pp. 1–15). Berlin: Springer.

  • Archambault, É., Campbell, D., Gingras, Y., & Larivière, V. (2009). Comparing bibliometric statistics obtained from the Web of Science and Scopus. Journal of the American Society for Information Science and Technology, 60(7), 1320–1326.

  • Bannon, L. (1992). Perspectives on CSCW: From HCI and CMC to CSCW. In Proceedings of the international conference on human–computer interaction (pp. 148–158). St. Petersburg: BCS HICOM.

  • Bannon, L. (1993). CSCW: An initial exploration. Scandinavian Journal of Information Systems, 5(2), 3–24.

  • Bannon, L., & Schmidt, K. (1989). CSCW: Four characters in search of a context. In Proceedings of the first European conference on computer supported cooperative work, Gatwick, London, 13–15 September 1989 (pp. 358–372).

  • Barbosa, S. D. J., Silveira, M. S., & Gasparini, I. (2016). What publications metadata tell us about the evolution of a scientific community: The case of the Brazilian Human–Computer Interaction conference series. Scientometrics, 110(1), 275–300.

  • Bar-Ilan, J., Haustein, S., Peters, I., Priem, J., Shema, H., & Terliesner, J. (2012). Beyond citations: Scholars’ visibility on the social web. arXiv:1205.5611.

  • Bar-Ilan, J., Levene, M., & Lin, A. (2007). Some measures for comparing citation databases. Journal of Informetrics, 1(1), 26–34.

  • Barkhuus, L., & Rode, J. A. (2007). From mice to men – 24 years of evaluation in CHI. In Proceedings of the ACM SIGCHI conference on human factors in computing systems (pp. 1–16).

  • Bartneck, C. (2011). The end of the beginning: A reflection on the first five years of the HRI conference. Scientometrics, 86(2), 487–504.

  • Bartneck, C., & Hu, J. (2009). Scientometric analysis of the CHI proceedings. In Proceedings of the ACM SIGCHI conference on human factors in computing systems (pp. 699–708).

  • Bartneck, C., & Hu, J. (2010). The fruits of collaboration in a multidisciplinary field. Scientometrics, 85(1), 41–52.

  • Bauer, K., & Bakkalbasi, N. (2005). An examination of citation counts in a new scholarly communication environment. D-Lib Magazine, 11, 9.

  • Beaver, D., & Rosen, R. (1978). Studies in scientific collaboration: Part I. The professional origins of scientific co-authorship. Scientometrics, 1(1), 65–84.

  • Bird, S., Wiles, J. L., Okalik, L., Kilabuk, J., & Egeland, G. M. (2009). Methodological consideration of story telling in qualitative research involving Indigenous Peoples. Global Health Promotion, 16(4), 16–26.

  • Blomberg, J., & Karasti, H. (2013). Reflections on 25 years of ethnography in CSCW. Computer Supported Cooperative Work, 22(4–6), 373–423.

  • Bornmann, L. (2015). Alternative metrics in scientometrics: A meta-analysis of research into three altmetrics. Scientometrics, 103(3), 1123–1144.

  • Bush, G. P., & Hattery, L. H. (1956). Teamwork and creativity in research. Administrative Science Quarterly, 1(3), 361–372.

  • Chen, C., Panjwani, G., Proctor, J., Allendoerfer, K., Aluker, S., Sturtz, D., Vukovic, M., & Kuljis, J. (2005). Visualizing the evolution of HCI. In Proceedings of the international BCS human computer interaction conference (pp. 233–250). London: Springer.

  • Cheng, J., & Bernstein, M. S. (2015). Flock: Hybrid crowd-machine learning classifiers. In Proceedings of the 18th ACM conference on computer supported cooperative work & social computing (pp. 600–611).

  • Convertino, G., Kannampallil, T. G., & Councill, I. (2006). Mapping the intellectual landscape of CSCW research. In Poster presented at ACM conference on computer supported cooperative work, 6.

  • Correia, A., Fonseca, B., & Paredes, H. (2013). Exploiting classical bibliometrics of CSCW: Classification, evaluation, limitations, and the odds of semantic analytics. In Proceedings of the first international conference on human factors in computing and informatics (pp. 137–156). Berlin: Springer.

  • Correia A., Fonseca B., Paredes H., Martins P., & Morgado L. (2016). Computer-simulated 3D virtual environments in collaborative learning and training: Meta-review, refinement, and roadmap. In Y. Sivan (Ed.), Handbook on 3D3C Platforms. Progress in IS (pp. 403–440). Cham: Springer.

  • Crabtree, A., Rodden, T., & Benford, S. (2005). Moving with the times: IT research and the boundaries of CSCW. Computer Supported Cooperative Work, 14(3), 217–251.

  • Cruz, A., Correia, A., Paredes, H., Fonseca, B., Morgado, L., & Martins, P. (2012). Towards an overarching classification model of CSCW and groupware: A socio-technical perspective. In Proceedings of the 18th international conference on collaboration and technology (pp. 41–56). Berlin: Springer.

  • Diodato, V. P. (1994). Dictionary of bibliometrics. New York: The Haworth Press.

  • Ellis, C. A., Gibbs, S. J., & Rein, G. (1991). Groupware: Some issues and experiences. Communications of the ACM, 34(1), 39–58.

  • Elo, S., & Kyngäs, H. (2008). The qualitative content analysis process. Journal of Advanced Nursing, 62(1), 107–115.

  • Erdt, M., Nagarajan, A., Sin, S. C. J., & Theng, Y. L. (2016). Altmetrics: An analysis of the state-of-the-art in measuring research impact on social media. Scientometrics, 109(2), 1117–1166.

  • Ferraris, C., & Martel, C. (2000). Regulation in groupware: The example of a collaborative drawing tool for young children. In Proceedings of the sixth IEEE international workshop on groupware (pp. 119–127).

  • Fitzpatrick, G., & Ellingsen, G. (2013). A review of 25 years of CSCW research in healthcare: Contributions, challenges and future agendas. Computer Supported Cooperative Work, 22(4–6), 609–665.

  • Garfield, E. (1972). Citation analysis as a tool in journal evaluation. Science, 178(4060), 471–479.

  • Garfield, E. (1979). Is citation analysis a legitimate evaluation tool? Scientometrics, 1(4), 359–375.

  • Glänzel, W. (2009). History of bibliometrics and its present-day tasks in research evaluation. Leuven: Katholieke Universiteit Leuven.

  • Glänzel, W., & Schoepflin, U. (1995). A bibliometric study on ageing and reception processes of scientific literature. Journal of Information Science, 21(1), 37–53.

  • Glänzel, W., & Schubert, A. (2004). Analysing scientific networks through co-authorship. Handbook of Quantitative Science and Technology Research, 11, 257–279.

  • Glass, G. V. (1976). Primary, secondary, and meta-analysis of research. Educational Researcher, 5(10), 3–8.

  • Greenberg, S. (1991). An annotated bibliography of Computer Supported Cooperative Work. ACM SIGCHI Bulletin, 23(3), 29–62.

  • Greif, I. (1988). Computer-supported cooperative work: A book of readings. San Mateo: Morgan Kaufmann.

  • Grudin, J. (1994). Computer-Supported Cooperative Work: History and focus. IEEE Computer, 27(5), 19–26.

  • Grudin, J. (2012). Punctuated equilibrium and technology change. Interactions, 19(5), 62–66.

  • Grudin, J., & Poltrock, S. E. (1997). Computer-Supported Cooperative Work and groupware. Advances in Computers, 45, 269–320.

  • Grudin, J., & Poltrock, S. E. (2012). Taxonomy and theory in computer supported cooperative work. Handbook of organizational psychology (pp. 1323–1348). Oxford: Oxford University Press.

  • Gupta, A. (2015). Five years of IndiaHCI: A scientometric analysis. In Proceedings of the 7th international conference on HCI (IndiaHCI) (pp. 56–61).

  • Haustein, S., Costas, R., & Larivière, V. (2015). Characterizing social media metrics of scholarly papers: The effect of document properties and collaboration patterns. PLoS ONE, 10(3), e0120495.

  • Heffner, A. (1981). Funded research, multiple authorship, and subauthorship collaboration in four disciplines. Scientometrics, 3(1), 5–12.

  • Heilig, L., & Voß, S. (2014). A scientometric analysis of cloud computing literature. IEEE Transactions on Cloud Computing, 2(3), 266–278.

  • Henry, N., Goodell, H., Elmqvist, N., & Fekete, J. D. (2007). 20 Years of four HCI conferences: A visual exploration. International Journal of Human–Computer Interaction, 23(3), 239–285.

  • Hertzel, D. H. (1987). Bibliometrics, history of the development of ideas in. Encyclopedia of Library and Information Science, 42(7), 144–211.

  • Hess, D. J. (1997). Science studies: An advanced introduction. New York: New York University Press.

  • Holsapple, C. W., & Luo, W. (2003). A citation analysis of influences on collaborative computing research. Computer Supported Cooperative Work, 12(3), 351–366.

  • Horn, D. B., Finholt, T. A., Birnholtz, J. P., Motwani, D., & Jayaraman, S. (2004). Six degrees of Jonathan Grudin: A social network analysis of the evolution and impact of CSCW research. In Proceedings of the ACM conference on computer supported cooperative work (pp. 582–591).

  • Hu, Z., & Wu, Y. (2014). Regularity in the time-dependent distribution of the percentage of never-cited papers: An empirical pilot study based on the six journals. Journal of Informetrics, 8(1), 136–146.

  • Hughes, J., King, V., Rodden, T., & Andersen, H. (1994). Moving out from the control room: Ethnography in system design. In Proceedings of the ACM conference on computer supported cooperative Work (pp. 429–439).

  • Hughes, J., Randall, D., & Shapiro, D. (1991). CSCW: Discipline or paradigm. In Proceedings of the second European conference on computer-supported cooperative work (pp. 24–27).

  • Iglič, H., Doreian, P., Kronegger, L., & Ferligoj, A. (2017). With whom do researchers collaborate and why? Scientometrics, 112(1), 153–174.

  • Jacovi, M., Soroka, V., Gilboa-Freedman, G., Ur, S., Shahar, E., & Marmasse, N. (2006). The chasms of CSCW: A citation graph analysis of the CSCW conference. In Proceedings of the 20th anniversary ACM conference on computer supported cooperative work (pp. 289–298).

  • Jacsó, P. (2008). Google Scholar revisited. Online Information Review, 32(1), 102–114.

  • Jacsó, P. (2010). Comparison of journal impact rankings in the SCImago Journal & Country Rank and the Journal Citation Reports databases. Online Information Review, 34(4), 642–657.

  • Jacsó, P. (2012). Google Scholar metrics for publications: The software and content features of a new open access bibliometric service. Online Information Review, 36(4), 604–619.

  • Jain, A. K., & Dubes, R. C. (1988). Algorithms for clustering data. Upper Saddle River: Prentice-Hall Inc.

  • Jirotka, M., Lee, C. P., & Olson, G. M. (2013). Supporting scientific collaboration: Methods, tools and concepts. Computer Supported Cooperative Work, 22(4–6), 667–715.

  • Johnson, D. P. (2008). Contemporary sociological theory: An integrated multi-level approach. Berlin: Springer.

  • Katz, J. S., & Martin, B. R. (1997). What is research collaboration? Research Policy, 26(1), 1–18.

  • Kaye, J. J. (2009). Some statistical analyses of CHI. In CHI extended abstracts on human factors in computing systems (pp. 2585–2594). New York, NY: ACM.

  • Keegan, B., Horn, D., Finholt, T. A., & Kaye, J. (2013). Structure and dynamics of coauthorship, citation, and impact within CSCW. arXiv:1307.7172.

  • Keele, S. (2007). Guidelines for performing systematic literature reviews in software engineering. EBSE Technical Report, Ver. 2.3.

  • Kienle, A., & Wessner, M. (2006). The CSCL community in its first decade: Development, continuity, connectivity. International Journal of Computer-Supported Collaborative Learning, 1(1), 9–33.

  • Kittur, A., Nickerson, J. V., Bernstein, M., Gerber, E., Shaw, A., Zimmerman, J., Lease, M., & Horton, J. (2013). The future of crowd work. In Proceedings of the ACM conference on computer supported cooperative work (pp. 1301–1318).

  • Kling, R. (1991). Cooperation, coordination and control in computer-supported work. Communications of the ACM, 34(12), 83–88.

  • Krasner, H., & Greif, I. (1986). CSCW’86: Proceedings. In Proceedings of the first conference on computer-supported cooperative work, 3–5 December 1986, Austin, Texas.

  • Krippendorff, K. (1980). Content analysis: An introduction to its methodology. London: The Sage Commtext Series, Sage Publications Ltd.

  • Kumar, S. (2014). Author productivity in the field Human Computer Interaction (HCI) research. Annals of Library and Information Studies, 61(4), 273–285.

  • Kuutti, K. (1991). The concept of activity as a basic unit of analysis for CSCW research. In Proceedings of the second European conference on computer-supported cooperative work (pp. 249–264).

  • Lee, S., & Bozeman, B. (2005). The impact of research collaboration on scientific productivity. Social Studies of Science, 35(5), 673–702.

  • Lee, C. P., Dourish, P., & Mark, G. (2006). The human infrastructure of cyberinfrastructure. In Proceedings of the 20th anniversary ACM conference on computer supported cooperative work (pp. 483–492).

  • Lee, H. E., Park, J. H., & Song, Y. (2014). Research collaboration networks of prolific institutions in the HCI field in Korea: An analysis of the HCI Korea conference proceedings. In Proceedings of HCI Korea (pp. 434–441).

  • Li, X., Thelwall, M., & Giustini, D. (2011). Validating online reference managers for scholarly impact measurement. Scientometrics, 91(2), 461–471.

  • Liu, Y., Goncalves, J., Ferreira, D., Xiao, B., Hosio, S., & Kostakos, V. (2014). CHI 1994–2013: Mapping two decades of intellectual progress through co-word analysis. In Proceedings of the 32nd annual ACM conference on human factors in computing systems (pp. 3553–3562).

  • Malone, T. W., & Crowston, K. (1994). The interdisciplinary study of coordination. ACM Computing Surveys (CSUR), 26(1), 87–119.

  • Manten, A. A. (1970). Statistical analysis of a scientific discipline: Palynology. Earth-Science Reviews, 6(3), 181–218.

  • Mao, J., Cao, Y., Lu, K., & Li, G. (2017a). Topic scientific community in science: A combined perspective of scientific collaboration and topics. Scientometrics, 112(2), 851–875.

  • Mao, K., Capra, L., Harman, M., & Jia, Y. (2017b). A survey of the use of crowdsourcing in software engineering. Journal of Systems and Software, 126, 57–84.

  • McGrath, J. E. (1984). Groups: Interaction and performance. Englewood Cliffs, NJ: Prentice-Hall.

  • Meho, L. I., & Rogers, Y. (2008). Citation counting, citation ranking, and h-index of Human–Computer Interaction researchers: A comparison of Scopus and Web of Science. Journal of the American Society for Information Science and Technology, 59(11), 1711–1726.

  • Meho, L. I., & Yang, K. (2007). Impact of data sources on citation counts and rankings of LIS faculty: Web of Science versus Scopus and Google Scholar. Journal of the American Society for Information Science and Technology, 58(13), 2105–2125.

  • Melin, G., & Persson, O. (1996). Studying research collaboration using co-authorships. Scientometrics, 36(3), 363–377.

  • Mentzas, G. N. (1993). Coordination of joint tasks in organizational processes. Journal of Information Technology, 8(3), 139.

  • Mikki, S. (2009). Google Scholar compared to Web of Science. A literature review. Nordic Journal of Information Literacy in Higher Education, 1(1), 41–51.

  • Mingers, J., & Leydesdorff, L. (2015). A review of theory and practice in scientometrics. European Journal of Operational Research, 246(1), 1–19.

  • Mittleman, D., Briggs, R., Murphy, J., & Davis, A. (2008). Toward a taxonomy of groupware technologies. In Proceedings of the 14th international workshop on groupware: Design, implementation, and use (pp. 305–317).

  • Mubin, O., Al Mahmud, A., & Ahmad, M. (2017). HCI down under: Reflecting on a decade of the OzCHI conference. Scientometrics, 112(1), 367–382.

  • Mulrow, C. D. (1994). Rationale for systematic reviews. British Medical Journal, 309(6954), 597.

  • Nalimov, V. V., & Mulchenko, B. M. (1969). Scientometrics. Studies of science as a process of information. Moscow: Science.

  • Narin, F., Stevens, K., & Whitlow, E. (1991). Scientific co-operation in Europe and the citation of multinationally authored papers. Scientometrics, 21(3), 313–323.

  • Neuhaus, C., Neuhaus, E., Asher, A., & Wrede, C. (2006). The depth and breadth of Google Scholar: An empirical study. Portal: Libraries and the Academy, 6(2), 127–141.

  • Nichols, D. M., & Cunningham, S. J. (2015). A scientometric analysis of 15 years of CHINZ conferences. In Proceedings of the 15th New Zealand conference on human–computer interaction (pp. 73–80).

  • Oulasvirta, A. (2006). A bibliometric exercise for SIGCHI conference on human factors in computing systems. Retrieved November, 2016 from http://www.hiit.fi/node/290.

  • Padilla, S., Methven, T. S., & Chantler, M. J. (2014). Is British HCI important? A topic-based comparison with CHI. In Proceedings of the 28th international BCS human computer interaction conference on HCI (pp. 365–370).

  • Panciera, K., Halfaker, A., & Terveen, L. (2009). Wikipedians are born, not made: A study of power editors on Wikipedia. In Proceedings of the ACM international conference on supporting group work (pp. 51–60).

  • Peters, H., & Van Raan, A. (1991). Structuring scientific activities by co-author analysis: An exercise on a university faculty level. Scientometrics, 20(1), 235–255.

  • Pinelle, D., & Gutwin, C. (2000). A review of groupware evaluations. In Proceedings of the IEEE 9th international workshops on enabling technologies: Infrastructure for collaborative enterprises (pp. 86–91).

  • Piwowar, H. (2013). Altmetrics: Value all research products. Nature, 493(7431), 159.

  • Pope, C., Ziebland, S., & Mays, N. (2000). Qualitative research in health care: Analysing qualitative data. British Medical Journal, 320(7227), 114.

  • Price, D. S. (1963). Little science, big science. New York City: Columbia University Press.

  • Price, D. S. (1976). A general theory of bibliometric and other cumulative advantage processes. Journal of the American Society for Information Science, 27(5), 292–306.

  • Priem, J., & Hemminger, B. H. (2010). Scientometrics 2.0: New metrics of scholarly impact on the social Web. First Monday, 15(7).

  • Pritchard, A. (1969). Statistical bibliography or bibliometrics. Journal of Documentation, 25, 348.

  • Pumareja, D., & Sikkel, K. (2002). An evolutionary approach to groupware implementation: The context of requirements engineering in the socio-technical frame. CTIT Technical Reports Series, 0230(30), 1–27.

  • Qi, M., Zeng, A., Li, M., Fan, Y., & Di, Z. (2017). Standing on the shoulders of giants: The effect of outstanding scientists on young collaborators’ careers. Scientometrics, 111(3), 1839–1850.

  • Rolland, B., Paine, D., & Lee, C. P. (2014). Work practices in coordinating center enabled networks (CCENs). In Proceedings of the 18th ACM international conference on supporting group work (pp. 194–203).

  • Sandelowski, M. (1995). Sample size in qualitative research. Research in Nursing & Health, 18(2), 179–183.

  • Schmidt, K. (2011). The concept of ‘work’ in CSCW. Computer Supported Cooperative Work, 20(4–5), 341–401.

  • Schmidt, K., & Bannon, L. (1992). Taking CSCW seriously. Computer Supported Cooperative Work, 1(1–2), 7–40.

  • Schmidt, K., & Bannon, L. (2013). Constructing CSCW: The first quarter century. Computer Supported Cooperative Work, 22(4–6), 345–372.

  • Stapić, Ž., De-Marcos, L., Strahonja, V., García-Cabot, A., & López, E. G. (2016). Scrutinizing systematic literature review process in software engineering. TEM Journal: Technology, Education, Management, Informatics, 5(1), 104–116.

  • Suchman, L. (1989). Notes on computer support for cooperative work. Working paper WP-12, University of Jyväskylä, Department of Computer Science.

  • Sugimoto, C. R., Work, S., Larivière, V., & Haustein, S. (2017). Scholarly use of social media and altmetrics: A review of the literature. Journal of the Association for Information Science and Technology, 68(9), 2037–2062.

  • Tague-Sutcliffe, J. (1992). An introduction to informetrics. Information Processing and Management, 28(1), 1–3.

  • Tang, K. Y., Tsai, C. C., & Lin, T. C. (2014). Contemporary intellectual structure of CSCL research (2006–2013): A co-citation network analysis with an education focus. International Journal of Computer-Supported Collaborative Learning, 9(3), 335–363.

  • Thelwall, M., Haustein, S., Larivière, V., & Sugimoto, C. R. (2013). Do altmetrics work? Twitter and ten other social web services. PLoS ONE, 8(5), e64841.

  • Thelwall, M., & Kousha, K. (2017). ResearchGate versus Google Scholar: Which finds more early citations?. Scientometrics, 112(2), 1125–1131.

  • Thelwall, M., & Wilson, P. (2016). Mendeley readership altmetrics for medical articles: An analysis of 45 fields. Journal of the Association for Information Science and Technology, 67(8), 1962–1972.

  • Van den Besselaar, P., & Heimeriks, G. (2006). Mapping research topics using word-reference co-occurrences: A method and an exploratory case study. Scientometrics, 68(3), 377–393.

  • Van Raan, A. (1997). Scientometrics: State-of-the-art. Scientometrics, 38(1), 205–218.

  • Wade, N. (1975). Citation analysis: A new tool for science administrators. Science, 188(4187), 429–432.

  • Wagner, C. S., Whetsell, T., & Leydesdorff, L. (2016). Growth of international cooperation in science: Revisiting six case studies. arXiv:1612.07208.

  • Wainer, J., & Barsottini, C. (2007). Empirical research in CSCW – a review of the ACM/CSCW conferences from 1998 to 2004. Journal of the Brazilian Computer Society, 13(3), 27–35.

  • Wallace, J. R., Oji, S., & Anslow, C. (2017). Technologies, methods, and values: Changes in empirical research at CSCW 1990–2015. UWSpace. http://hdl.handle.net/10012/12396.

  • Wang, W., Yu, S., Bekele, T. M., Kong, X., & Xia, F. (2017). Scientific collaboration patterns vary with scholars’ academic ages. Scientometrics, 112(1), 329–343.

  • Wania, C. E., Atwood, M. E., & McCain, K. W. (2006). How do design and evaluation interrelate in HCI research? In Proceedings of the 6th conference on designing interactive systems (pp. 90–98).

  • Zahedi, Z., Costas, R., & Wouters, P. (2014). How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications. Scientometrics, 101(2), 1491–1513.

  • Ziman, J., & Schmitt, R. W. (1995). Prometheus bound: Science in a dynamic steady state. American Journal of Physics, 63(5), 476–477.

  • Zuckerman, H. (1987). Citation analysis and the complex problem of intellectual influence. Scientometrics, 12(5–6), 329–338.

Author information

Correspondence to António Correia.

Appendix

Comparison of data sources for citation analysis

Some preliminary work has described GS as a freely available service that covers more publications than the “very expensive, subscription-based” WoS and Scopus databases. The latter two are useful and scientifically valid data sources for bibliometric analysis (Mikki 2009), but they mainly cover citations to journal papers and conference proceedings. GS additionally indexes book chapters, books, theses and dissertations, workshop papers, and technical reports, among many other kinds of scholarly documents. GS allows a researcher with limited access to commercial databases to perform a bibliometric exercise without geographic or linguistic barriers (Meho and Rogers 2008). Further advantages of the GS algorithm include document detection and filtering, free access to files and websites of institutions and researchers, and indexing of the same set of documents covered by proprietary databases. In this sense, GS is “perhaps one of the largest scientific bibliographic databases” (Aguillo 2011), with fast and broad full-text search capabilities (Jacsó 2012). Notwithstanding these advantages, the lack of control over its contents makes it a “noisy database” that requires complex and time-consuming data cleaning for evaluation purposes. The negative aspects of GS are mainly associated with its software, including inadequate clustering of identical citations that results in duplicates, failure to detect all authors, inflated citation counts, and other problems caused by automatic indexing (Jacsó 2008). Its search engine is designed to return only the most significant results, offering little control for systematic searching (Mikki 2009).
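
To make the data-cleaning burden concrete, here is a minimal sketch (not the procedure used in this study) of how duplicate GS records could be merged before counting citations. The record fields, the similarity threshold, and the sample titles are assumptions chosen for illustration only.

```python
# Illustrative only: merging near-duplicate bibliographic records such as those
# returned by Google Scholar. Field names and the 0.95 threshold are assumptions.
from difflib import SequenceMatcher

def normalize(title: str) -> str:
    """Lower-case a title and keep only alphanumeric characters and spaces."""
    return "".join(ch for ch in title.lower() if ch.isalnum() or ch.isspace()).strip()

def deduplicate(records: list[dict]) -> list[dict]:
    """Merge records whose normalized titles are near-identical,
    keeping the highest citation count among the duplicates."""
    unique: list[dict] = []
    for rec in records:
        title = normalize(rec["title"])
        for kept in unique:
            if SequenceMatcher(None, title, normalize(kept["title"])).ratio() > 0.95:
                kept["citations"] = max(kept["citations"], rec["citations"])
                break
        else:
            unique.append(dict(rec))
    return unique

if __name__ == "__main__":
    sample = [
        {"title": "The intellectual challenge of CSCW", "citations": 120},
        {"title": "The Intellectual Challenge of CSCW.", "citations": 118},
    ]
    print(deduplicate(sample))  # one merged record with 120 citations
```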

Compared with WoS, GS results resemble those of the subscription-based tool while covering a broader universe of metadata for multi-language documents, and books receive high citation rates in GS. GS extracts citations automatically from reference lists, whereas citation data in WoS is handled manually to some extent (Mikki 2009). Older papers that never appeared on the web may not be indexed by GS (Neuhaus et al. 2006). In addition, the lower coverage of books and conference papers in WoS can be limiting for bibliometrics. This is particularly noticeable in the sample chosen for this study, where only GS provided data for all publications. According to the results of Meho and Yang (2007), all of these services are valuable for bibliometric studies, with a small overlap in citations and an analogous mode of operation. GS and WoS rank groups of scholars in a similar way, and both services return results quickly. GS provides more citations than WoS and Scopus and identifies a larger number of unique citations, which can help provide evidence of broader intellectual and international impact. In addition, Bauer and Bakkalbasi (2005) did not find significant differences between WoS and Scopus. A positive relationship between their rankings was documented by Archambault et al. (2009), who argue that “the outputs (papers) and impacts (citations) of countries obtained from the two databases are extremely correlated” despite differences in scope and volume of coverage. Criticism of Scopus in the literature has centered on its systematic coverage of Elsevier’s journals and on the fact that it provides citation data only from 1996 onward (Aguillo 2011). Although some findings suggest that, for HCI, more valid citation data can be obtained from Scopus than from WoS (Meho and Rogers 2008), the depth and length of coverage have been disappointing for various Scopus journals. Furthermore, Scopus covers a short time span and has critical gaps in its coverage of lower-quality publications (Jacsó 2012).
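
As a rough illustration of the kind of agreement check behind the correlation claims above, the stdlib-only sketch below computes Spearman’s rank correlation between citation counts retrieved for the same papers from two databases. The counts shown are invented, not data from this study.

```python
# Illustrative only: rank agreement between two citation sources (e.g., WoS vs. Scopus).
def ranks(values):
    """Average ranks (1-based), with ties sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i+1..j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical citation counts for the same five papers in two databases.
wos = [12, 40, 3, 25, 0]
scopus = [15, 44, 5, 27, 1]
print(f"Spearman rho = {spearman(wos, scopus):.3f}")  # close to 1.0 when rankings agree
```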

The ACM-DL contains an archive of over 400,000 full-text articles, with more than 18,000 new full-text entries added each year, ranging from journals and technical magazines to conference proceedings published by the Association for Computing Machinery. The ACM-DL also provides easy access to bibliometric data (e.g., citation counts) and altmetrics (e.g., number of downloads). More recently, SpringerLink also introduced altmetrics to measure the social media impact of the publications it covers. Both services are kept up to date and provide mechanisms for information seeking. A comparison between GS and ResearchGate revealed that the latter “found less citations than did Google Scholar but more than both Web of Science and Scopus” (Thelwall and Kousha 2017). At the same time, Mendeley was characterized as a “useful tool for tracking the impact of both conference papers and journal articles in computer science” (Aduku et al. 2017).
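
Once citation and download counts have been collected from sources such as those above, per-venue indicators of the kind reported in the paper (mean citations, share of never-cited papers, mean downloads) reduce to simple aggregation. The sketch below shows one way to compute them; the venue names and counts are hypothetical.

```python
# Illustrative only: summarizing hypothetical citation and download counts per venue.
from statistics import mean

papers = [
    {"venue": "CSCW", "citations": 52, "downloads": 310},
    {"venue": "CSCW", "citations": 0, "downloads": 45},
    {"venue": "GROUP", "citations": 17, "downloads": 120},
]

by_venue: dict[str, list[dict]] = {}
for p in papers:
    by_venue.setdefault(p["venue"], []).append(p)

for venue, items in sorted(by_venue.items()):
    cites = [p["citations"] for p in items]
    downloads = [p["downloads"] for p in items]
    uncited = sum(1 for c in cites if c == 0) / len(cites)
    print(f"{venue}: mean citations {mean(cites):.1f}, "
          f"mean downloads {mean(downloads):.1f}, uncited share {uncited:.0%}")
```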


Cite this article

Correia, A., Paredes, H. & Fonseca, B. Scientometric analysis of scientific publications in CSCW. Scientometrics 114, 31–89 (2018). https://doi.org/10.1007/s11192-017-2562-0
