Abstract
This study explores the patterns of exchange of research knowledge among Education Research, Cognitive Science, and what we call “Border Fields.” We analyze a set of 32,121 articles from 177 selected journals, drawn from five sample years between 1994 and 2014. We profile the references those articles cite and the papers that cite them. We characterize connections among the fields in sources indexed by Web of Science (WoS) (e.g., peer-reviewed journal articles and proceedings) and connections in sources that are not (e.g., conference talks, chapters, books, and reports). We note five findings. First, over time the percentage of Education Research papers that extensively cite Cognitive Science has increased, but the reverse is not true. Second, a high percentage of Border Field papers extensively cite, and are cited by, the other fields. Border Field authors’ most cited papers overlap those most cited by Education Research and Cognitive Science, whereas the most cited papers of Education Research and Cognitive Science show fewer commonalities. This is consistent with the Border Fields serving as a bridge between the two fields. Third, over time the Border Fields have moved closer to Education Research than to Cognitive Science, and their publications increasingly cite, and are cited by, other Border Field publications. Fourth, Education Research is especially strongly represented in the literature published outside WoS-indexed publications. Fifth, the rough patterns observed among these three fields using a more restricted dataset drawn from the WoS are similar to those observed with the dataset lying outside the WoS, but Education Research shows a far heavier influence than WoS records alone would indicate.
Notes
We note that Volume 117, Issue 3, of Scientometrics contains some 44 papers, in contrast to Volume 1, Issue 1, of 40 years earlier, which contained 7.
We acknowledge that this subsumes differences among ED research communities. For example, Discipline-Based Education Research (DBER) concerned with undergraduate physics education would not necessarily be unified with, say, Medical Education Research, much less with K-12 teaching or Special Education studies.
To illustrate, here are three sample cited references appearing in a WoS abstract record:
- Honda H., 1990, ATTENTION PERFORM, V13, P567
- Cohen A., 1990, LANGUAGE LEARNING
- Derwing TM, 1989, LANG LEARN, V39, P157, https://doi.org/10.1111/j.1467-1770.1989.tb00420.x
Thus, WoS provides first author, publication year, and abbreviated source name (plus, when available, volume and first page). Note that Cited Reference content varies somewhat in format and in what is included (e.g., only one of these three has a DOI).
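For readers who work with these records programmatically, the following minimal Python sketch shows one way such a Cited Reference string can be split into its parts. The function name and field layout are our own illustrative choices, not a WoS-supplied parser; real records vary enough that production code would need more robust handling.

```python
import re

# Minimal sketch (our own field layout, not a WoS-supplied parser):
# split a Cited Reference string on commas and classify each piece.
def parse_cited_ref(ref: str) -> dict:
    parts = [p.strip() for p in ref.rstrip(";. ").split(",")]
    rec = {"author": parts[0], "year": None, "source": None,
           "volume": None, "page": None, "doi": None}
    for p in parts[1:]:
        if re.fullmatch(r"\d{4}", p):                 # publication year
            rec["year"] = int(p)
        elif p.startswith("V") and p[1:].isdigit():   # volume, e.g. V39
            rec["volume"] = p[1:]
        elif p.startswith("P") and p[1:].isdigit():   # first page, e.g. P157
            rec["page"] = p[1:]
        elif "doi.org/" in p:                         # optional DOI
            rec["doi"] = p.split("doi.org/")[-1]
        elif rec["source"] is None:                   # abbreviated source name
            rec["source"] = p
    return rec

print(parse_cited_ref("Derwing TM, 1989, LANG LEARN, V39, P157, "
                      "https://doi.org/10.1111/j.1467-1770.1989.tb00420.x."))
# {'author': 'Derwing TM', 'year': 1989, 'source': 'LANG LEARN',
#  'volume': '39', 'page': '157', 'doi': '10.1111/j.1467-1770.1989.tb00420.x'}
```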
These text analyses are done using VantagePoint software (www.theVantagePoint.com).
That is, from other fields, such as chemistry or sociology; these are identified via the second-stage process (next paragraph), using a thesaurus that associates sources (journals) with WCs.
The WoS Helpdesk notes the factors considered in assigning sources to WCs: journal subject matter and scope; author and editorial board affiliations; funding acknowledgements; cited and citing relationships to other journals; journal sponsor; and the journal’s categorizations in other bibliographic databases.
Cited References per article average 42.8 (median 39) overall. In 1994 the average was 34.5 (median 31) for 3679 articles; by 2014 it had risen by almost 50%, to 50.9 (median 46) for 10,086 articles.
Interestingly, citation propensity differs only modestly by (Cited Reference) field:
- 43.7 (median 40) for 20,843 ED articles;
- 46.3 (median 42) for 26,575 CogSci articles;
- 47.7 (median 44) for 18,997 Border Field articles; and
- 44.0 (median 40) for 30,435 Other articles.
In contrast, Youtie et al. (2017) used a more selective denominator: only the cited journal items appearing in the 177 journals and categorized as ED, CogSci, or Border (i.e., not drawing on the much wider set of cited sources classified herein, and not including “Other”). We note this to explain why the values differ (although our foci here are somewhat different).
Somewhat surprisingly, we found considerable inconsistency in how cited names appeared. Given the size of the files (hundreds of thousands of authors), full cleaning was not feasible. We applied VantagePoint’s “person.fuz” fuzzy-matching routine to consolidate name variations, and we also searched for name variants for the top authors. For instance, Michael I. Posner is the author most cited by the Gen0 CogSci articles; we collapsed these variants of his name to obtain his count: Posner, M I; Posner M. I.; Posner M.; Posner, M; Posner M., I; Posner M, I; Posner Michael I.; Posner M.I. Table 4 offers full names where available.
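To convey the flavor of such consolidation, here is a minimal Python sketch. It is not VantagePoint’s proprietary “person.fuz” routine; we simply normalize each name to surname plus initials and fuzzy-match the result with the standard library’s difflib, using an illustrative similarity threshold.

```python
import re
from collections import defaultdict
from difflib import SequenceMatcher

# Minimal sketch of name-variant consolidation (not VantagePoint's
# "person.fuz"): normalize to "surname initials", then fuzzy-match.
def name_key(raw: str) -> str:
    # "Posner M.I." / "Posner, M I" -> "posner mi"
    tokens = re.sub(r"[.,]", " ", raw.lower()).split()
    surname, initials = tokens[0], "".join(t[0] for t in tokens[1:])
    return f"{surname} {initials}"

def consolidate(names, threshold=0.9):
    groups = defaultdict(list)
    canonical = []  # canonical keys seen so far
    for name in names:
        key = name_key(name)
        match = next((k for k in canonical
                      if SequenceMatcher(None, k, key).ratio() >= threshold),
                     None)
        if match is None:
            canonical.append(key)
            match = key
        groups[match].append(name)
    return groups

variants = ["Posner, M I", "Posner M. I.", "Posner M.",
            "Posner, M", "Posner Michael I.", "Posner M.I."]
for key, members in consolidate(variants).items():
    print(key, "->", members)   # all six variants collapse to one group
```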
The authors highly cited by ED Gen0 papers include a number of organizations, in contrast to those cited by CogSci and Border Field papers. This is not a problem for our purpose of comparing top sources, but note that such entries are apt to cover materials generated by multiple authors (e.g., different National Academies committees). The data denote cited first authors only.
We did not want to average in percentages based on only one or two categorized papers (e.g., a single citation in, say, CogSci counting as 100%); a minimal sketch of this cutoff follows.
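As a concrete illustration, here is a minimal Python sketch. The cutoff of three follows from excluding papers with only one or two categorized cited items; the names are our own.

```python
# Minimal sketch; the cutoff of 3 reflects excluding papers with only
# 1 or 2 categorized cited items, and all names are illustrative.
MIN_CATEGORIZED = 3

def field_shares(ref_fields):
    """ref_fields: field labels of one paper's categorized cited items."""
    n = len(ref_fields)
    if n < MIN_CATEGORIZED:
        return None  # excluded, so a lone CogSci citation never counts as 100%
    return {f: 100 * ref_fields.count(f) / n for f in set(ref_fields)}

print(field_shares(["CogSci"]))                        # None (excluded)
print(field_shares(["CogSci", "ED", "ED", "Border"]))  # {'ED': 50.0, ...}
```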
This set of categorized reference sources is smaller than the full set used in the prior analyses (i.e., those generating Figs. 1, 2, 3). Those analyses used all reference sources categorized collectively by the three thesauri together. That allowed a given cited source to be classified into multiple categories if, say, the 177 thesaurus located it in a Border Field whereas the Auto-categorizer also associated it with ED. Classifying WoS sources is best done based on the WoS Categorization of sources (journals), so the Auto-categorizer is applied only to sources not covered by the first two thesauri; a minimal sketch of this precedence follows. See the Supplemental Materials for details.
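The sketch below illustrates that precedence. The dictionary names and contents are placeholders standing in for the actual thesauri, and the auto-categorizer is reduced to a stub.

```python
# Minimal sketch of the staged source categorization described above.
# Dictionary names and contents are illustrative placeholders, not the
# actual thesauri.
THESAURUS_177 = {"COGNITIVE SCIENCE": "CogSci",
                 "J LEARN SCI": "Border"}       # the 177 selected journals
WOS_WC_THESAURUS = {"EDUC RES REV": "ED"}       # WoS source -> WC-based field

def auto_categorize(source: str) -> str:
    # Stand-in for the Auto-categorizer, applied only to sources
    # the first two thesauri do not cover.
    return "Other"

def categorize(source: str) -> str:
    if source in THESAURUS_177:       # first: the 177-journal thesaurus
        return THESAURUS_177[source]
    if source in WOS_WC_THESAURUS:    # second: WoS Categories for WoS sources
        return WOS_WC_THESAURUS[source]
    return auto_categorize(source)    # last resort: Auto-categorizer

for s in ["J LEARN SCI", "EDUC RES REV", "SOME CONF TALK"]:
    print(s, "->", categorize(s))
```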
Should others wish to pursue this, we are glad to share the data with finer categorizations, as licensing permits.
References
Abramo, G., D’Angelo, C. A., & Di Costa, F. (2009). Research collaboration and productivity: Is there correlation? Higher Education, 57(2), 155–171.
Anderson, J. R. (2002). Spanning seven orders of magnitude: A challenge for cognitive modeling. Cognitive Science, 26(1), 85–112.
Bechara, A., Damasio, A. R., Damasio, H., & Anderson, S. W. (1994). Insensitivity to future consequences following damage to human prefrontal cortex. Cognition, 50(1–3), 7–15.
Bordons, M., & Barrigon, S. (1992). Bibliometric analysis of publications of Spanish pharmacologists in the SCI (1984–89). 2. Contribution to subfields other than pharmacology and pharmacy (ISI). Scientometrics, 25(3), 425–446.
Bornmann, L. (2017). Measuring impact in research evaluations: A thorough discussion of methods for, effects of and problems with impact measurements. Higher Education, 73(5), 775–787. https://doi.org/10.1007/s10734-016-9995-x.
Borrego, M., & Bernhard, J. (2011). The emergence of engineering education research as an internationally connected field of inquiry. Journal of Engineering Education, 100(1), 14–47. https://doi.org/10.1002/j.2168-9830.2011.tb00003.x.
Bransford, J. D., Brown, A. L., & Cocking, R. R. (1999). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.
Bruer, J. T. (1997). Education and the brain: A bridge too far. Educational Researcher, 26(8), 4–16.
Budd, J. M. (1988). A bibliometric analysis of higher education literature. Research in Higher Education, 28(2), 180–190. https://doi.org/10.1007/BF00992890.
Burt, R. S. (2004). Structural holes and good ideas. American Journal of Sociology, 110(2), 349–399.
Carley, S., Solomon, G., Youtie, J., & Porter, A. L. (2016). The credibility of policy reporting across learning disciplines: A case study of ‘How People Learn’. In American Evaluation Association meeting, Atlanta.
Chen, C. (2003). Mapping scientific frontiers: The quest for knowledge visualization. London: Springer.
Collins, H., Evans, R., & Gorman, M. (2007). Trading zones and interactional expertise. Studies in History and Philosophy of Science Part A, 38(4), 657–666.
De Bellis, N. (2009). Bibliometrics and citation analysis. Lanham, MD: Scarecrow Press.
Fernandez-Cano, A., & Bueno, A. (1999). Synthesizing scientometric patterns in Spanish educational research. Scientometrics, 46(2), 349–367. https://doi.org/10.1007/bf02464783.
Frodeman, R., Klein, J. T., & Mitcham, C. (2010). The Oxford handbook of interdisciplinarity. Oxford: Oxford University Press.
Garfield, E. (1979). Is citation analysis a legitimate evaluation tool? Scientometrics, 1(4), 359–375. https://doi.org/10.1007/bf02019306.
Goldstone, R. L., & Leydesdorff, L. (2006). The import and export of cognitive science. Cognitive Science, 30(6), 983–993.
Hicks, D. (1999). The difficulty of achieving full coverage of international social science literature and the bibliometric consequences. Scientometrics, 44(2), 193–215.
Holbrook, J. B. (2013). What is interdisciplinary communication? Reflections on the very idea of disciplinary integration. Synthese, 190(11), 1865–1879.
Jimenez-Fanjul, N., Maz-Machado, A., & Bracho-Lopez, R. (2013). Bibliometric analysis of the mathematics education journals in the SSCI. International Journal of Research in Social Sciences. http://www.ijsk.org/uploads/3/1/1/7/3117743/3_social_journals.pdf.
Kirby, J. A., Hoadley, C. M., & Carr-Chellman, A. (2005). Instructional systems design and the learning sciences: A citation analysis. Educational Technology Research and Development, 53, 37–47.
Klavans, R., & Boyack, K. W. (2009). Toward a consensus map of science. Journal of the American Society for Information Science and Technology, 60(3), 455–476.
Kosmuetzky, A., & Kruecken, G. (2014). Growth or steady state? A bibliometric focus on international comparative higher education research. Higher Education, 67(4), 457–472.
Kwon, S., Solomon, G. E. A., Youtie, J., & Porter, A. L. (2017). A measure of interdisciplinary knowledge flow between specific fields: Implications of interdisciplinarity for impact and funding. PLoS ONE. https://doi.org/10.1371/journal.pone.0185583.
Leydesdorff, L., & Rafols, I. (2009). A global map of science based on the ISI Subject Categories. Journal of the American Society for Information Science and Technology, 60(2), 348–362.
Macauley, P., Evans, T., Pearson, M., & Tregenza, K. (2005). Using digital data and bibliometric analysis for researching doctoral education. Higher Education Research & Development, 24(2), 189–199. https://doi.org/10.1080/07294360500063076.
Martin, S., Diaz, G., Sancristobal, E., Gil, R., Castro, M., & Peire, J. (2011). New technology trends in education: Seven years of forecasts and convergence. Computers & Education, 57(3), 1893–1906. https://doi.org/10.1016/j.compedu.2011.04.003.
Porter, A. L., & Rafols, I. (2009). Is science becoming more interdisciplinary? Measuring and mapping six research fields over time. Scientometrics, 81(3), 719–745. https://doi.org/10.1007/s11192-008-2197-2.
Porter, A. L., Roessner, J. D., & Heberger, A. E. (2008). How interdisciplinary is a given body of research? Research Evaluation, 17(4), 273–282.
Porter, A. L., Schoeneck, D. J., Roessner, D., & Garner, J. (2010). Practical research proposal and publication profiling. Research Evaluation, 19(1), 29–44.
Porter, A. L., Schoeneck, D. J., Solomon, G., Lakhani, H., & Dietz, J. (2013). Measuring and mapping interdisciplinarity: Research & evaluation on education in science & engineering (“REESE”) and STEM. In American Education Research Association annual meeting, San Francisco.
Price, D. J. D. (1963). Little science, big science. New York: Columbia University Press.
Price, D. J. D. (1965). Networks of scientific papers. Science, 149, 510–515.
Roessner, D., Porter, A. L., Nersessian, N. J., & Carley, S. (2013). Validating indicators of interdisciplinarity: Linking bibliometric measures to studies of engineering research labs. Scientometrics, 94(2), 439–468.
Schmidt, E. K., & Graversen, E. K. (2018). Persistent factors facilitating excellence in research environments. Higher Education, 75(2), 341–363.
Schunn, C. D., Crowley, K., & Okada, T. (1998). The growth of multidisciplinarity in the Cognitive Science Society. Cognitive Science, 22(1), 107–130.
Shimelis, G. A., & Rorissa, A. (2013). A bibliometric mapping of the structure of STEM education using co-word analysis. Journal of the Association for Information Science and Technology, 64(12), 2513–2536. https://doi.org/10.1002/asi.22917.
Small, H., & Griffith, B. C. (1974). Structure of scientific literatures: 1. Identifying and graphing specialties. Science Studies, 4(1), 17–40.
Solomon, G. E. A., et al. (2014). Who influences whom: The effects of disciplinary background and affiliation on the diffusion of knowledge in the REESE program. In Annual meeting of the American Education Research Association, Philadelphia, PA.
Steinhardt, I., Schneijderberg, C., Gotze, N., Baumann, J., & Krucken, G. (2017). Mapping the quality assurance of teaching and learning in higher education: The emergence of a specialty? Higher Education, 74(2), 221–237.
Stirling, A. (2007). A general framework for analysing diversity in science, technology and society. Journal of the Royal Society Interface, 4(15), 707–719.
Tushman, M. (1977). Special boundary roles in the innovation process. Administrative Science Quarterly, 22(4), 587–605.
Xian, H., & Madhavan, K. (2014). Anatomy of scholarly collaboration in engineering education: A big-data bibliometric analysis. Journal of Engineering Education, 103(3), 486–514.
Youtie, J., Melkers, J., & Kay, L. (2013). Bibliographic coupling and network analysis to assess knowledge coalescence in a research center environment. Research Evaluation, 22(3), 145–156.
Youtie, J., Solomon, G. E., Carley, S., Kwon, S., & Porter, A. L. (2017). Crossing borders: A citation analysis of connections between cognitive science and educational research… and the fields in between. Research Evaluation, 26(3), 242–255. https://doi.org/10.1093/reseval/rvx020.
Zawacki-Richter, O., Anderson, T., & Tuncay, N. (2010). The growing impact of open access distance education journals: A bibliometric analysis. International Journal of E-learning & Distance Education. http://www.ijede.ca/index.php/jde/article/view/661.
Zoller, F. A., Zimmerling, E., & Boutellier, R. (2014). Assessing the impact of the funding environment on researchers’ risk aversion: The use of citation statistics. Higher Education, 68(3), 333–345. https://doi.org/10.1007/s10734-014-9714-4.
Acknowledgements
This work was supported by a grant from the US National Science Foundation, Directorate for Education and Human Resources (DRL-1348765) to Search Technology Inc. While serving at the National Science Foundation, G.S. was supported by the IR/D program. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect those of the National Science Foundation.
Cite this article
Porter, A.L., Schoeneck, D.J., Youtie, J. et al. Learning about learning: patterns of sharing of research knowledge among Education, Border, and Cognitive Science fields. Scientometrics 118, 1093–1117 (2019). https://doi.org/10.1007/s11192-019-03012-3