Identifying research fields within business and management: a journal cross-citation analysis
A discipline such as business and management (B&M) is very broad, encompassing many fields that range from fairly scientific ones, such as management science or economics, to softer ones, such as information systems. There are at least three reasons why it is important to identify these sub-fields accurately: first, to give insight into the structure of the subject area and to identify perhaps unrecognised commonalities; second, to normalise citation data, since it is well known that citation rates vary significantly between disciplines; and third, because journal rankings and lists tend to split their classifications into different subjects. For example, the Association of Business Schools list, which is a standard in the UK, has 22 different fields, but at present these are created in an ad hoc manner with no underlying rigour. The purpose of this paper is to identify possible sub-fields in B&M rigorously, based on actual citation patterns. We examined 450 B&M journals included in the ISI Web of Science and analysed the cross-citation rates between them, enabling us to generate sets of coherent and consistent sub-fields that minimise the extent to which journals appear in several categories. Implications and limitations of the analysis are discussed.
Keywords: subject fields; cross-citations; business and management; factor analysis
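The cross-citation approach described in the abstract can be illustrated in miniature. The paper itself derives sub-fields via factor analysis of the journal cross-citation matrix; the sketch below uses a simpler stand-in (Salton's cosine between citing patterns, plus a similarity threshold) on a hypothetical four-journal matrix. All journal names and citation counts here are illustrative, not the paper's 450-journal dataset.

```python
from math import sqrt

# Hypothetical cross-citation matrix: cites[i][j] = number of citations
# from journal i to journal j. Counts are invented for illustration.
journals = ["MgmtSci", "OR", "MISQ", "ISR"]
cites = [
    [50, 40, 2, 1],   # MgmtSci cites mainly MgmtSci and OR
    [35, 60, 1, 2],   # OR has a similar citing pattern
    [1, 2, 45, 30],   # MISQ cites mainly MISQ and ISR
    [2, 1, 25, 55],   # ISR has a similar citing pattern
]

def cosine(u, v):
    """Salton's cosine between two citation-pattern vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

# Single-link grouping with a similarity threshold: journals whose citing
# patterns are sufficiently alike fall into the same candidate sub-field.
THRESHOLD = 0.8
groups = []
for i in range(len(journals)):
    placed = False
    for g in groups:
        if any(cosine(cites[i], cites[j]) >= THRESHOLD for j in g):
            g.append(i)
            placed = True
            break
    if not placed:
        groups.append([i])

subfields = [[journals[i] for i in g] for g in groups]
# Here the two management-science journals and the two information-systems
# journals separate cleanly into two groups.
```

A factor-analytic treatment, as in the paper, would instead extract latent factors from the (normalised) cross-citation matrix and assign journals to factors by their loadings; the threshold grouping above merely conveys the intuition that journals citing the same literature belong to the same sub-field.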
- Association of Business Schools (2010). Academic journal quality guide. Association of Business Schools.
- Boyack K and Klavans R (2011). Multiple dimensions of journal specificity: Why journals can't be assigned to disciplines. In: E Noyons, P Ngulube and J Leta (eds). The 13th Conference of the International Society for Scientometrics and Informetrics (Vol. I). ISSI, Leiden University and the University of Zululand: Durban, South Africa, pp 123–133.
- Hair J, Anderson R, Tatham R and Black W (1998). Multivariate Data Analysis. Prentice Hall: New Jersey.
- Leydesdorff L (2004). Top-down decomposition of the Journal Citation Report of the Social Science Citation Index: Graph- and factor-analytical approaches. Scientometrics 60 (2): 159–180.
- Leydesdorff L (2006). Can scientific journals be classified in terms of aggregated journal-journal citation relations using the Journal Citation Reports? Journal of the American Society for Information Science and Technology 57 (5): 601–613.
- Mingers J (2014). Problems with SNIP. Journal of Informetrics 8 (4): 890–894.
- Moed H (2010c). The source-normalized impact per paper (SNIP) is a valid and sophisticated indicator of journal citation impact. arXiv preprint, arxiv.org.
- RAE (2004). RAE2008: Initial decisions by the UK funding bodies. Report, HEFCE, http://www.rae.ac.uk/pubs/2004/01/.
- RAE (2005). Guidance on submissions. Report, HEFCE, http://www.rae.ac.uk/pubs/2005/03/.
- RAE (2006). RAE2008 Panel criteria and working methods. Report No. 01/2006, HEFCE, http://www.rae.ac.uk/pubs/2006/01/.
- RAE (2009). RAE2008 subject overview reports: I 36 business and management studies. Report, HEFCE, http://www.rae.ac.uk/pubs/2009/ov/.
- Rowlinson M, Harvey C, Kelly A, Morris H and Todeva E (2013). Accounting for research quality: Research audits and the journal rankings debate. Critical Perspectives on Accounting, available online 25 June, doi:10.1016/j.cpa.2013.05.012.
- Salton G and McGill M (1987). Introduction to Modern Information Retrieval. McGraw-Hill: New York.
- Van Raan A (2003). The use of bibliometric analysis in research performance assessment and monitoring of interdisciplinary scientific developments. Technology Assessment—Theory and Practice 1 (12): 20–29.
- Zhao H and Lin X (2010). A comparison of mapping algorithms for author co-citation data analysis. Proceedings of the American Society for Information Science and Technology, Vol. 47 (1), pp 1–3.