Journal of the Operational Research Society, Volume 66, Issue 8, pp 1370–1384

Identifying research fields within business and management: a journal cross-citation analysis

General Paper

Abstract

A discipline such as business and management (B&M) is very broad and contains many fields, ranging from fairly scientific ones such as management science or economics to softer ones such as information systems. There are at least three reasons why it is important to identify these sub-fields accurately. First, to give insight into the structure of the subject area and to identify perhaps unrecognised commonalities; second, to normalise citation data, since it is well known that citation rates vary significantly between disciplines; and third, because journal rankings and lists tend to split their classifications into different subjects (for example, the Association of Business Schools list, which is a standard in the UK, has 22 different fields). Unfortunately, at the moment these classifications are created in an ad hoc manner with no underlying rigour. The purpose of this paper is to identify possible sub-fields in B&M rigorously, based on actual citation patterns. We have examined 450 B&M journals included in the ISI Web of Science and analysed the cross-citation rates between them, enabling us to generate sets of coherent and consistent sub-fields that minimise the extent to which journals appear in several categories. Implications and limitations of the analysis are discussed.
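The analysis the abstract outlines, grouping journals by factor-analysing their cross-citation profiles, can be illustrated with a minimal sketch. Everything below is hypothetical: the journal names, the citation counts, and the use of scikit-learn's FactorAnalysis as the factoring routine are assumptions made for illustration only, not the authors' actual data or method.

```python
# Minimal sketch (hypothetical data): group journals into candidate sub-fields
# by factor-analysing their row-normalised cross-citation profiles.
import numpy as np
from sklearn.decomposition import FactorAnalysis

journals = ["J. Oper. Res. Soc.", "Manage. Sci.", "MIS Quart.", "Acad. Manage. J."]

# citations[i, j] = citations from journal i to journal j (invented counts)
citations = np.array([
    [120,  45,   3,   5],
    [ 50, 300,  10,  20],
    [  4,  12,  90,  15],
    [  6,  25,  18, 200],
], dtype=float)

# Row-normalise so each journal's citing profile sums to 1 (removes size effects).
profiles = citations / citations.sum(axis=1, keepdims=True)

# Extract a small number of latent factors; journals loading mainly on the
# same factor are treated as belonging to the same candidate sub-field.
fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(profiles)

for name, score in zip(journals, scores):
    dominant = int(np.argmax(np.abs(score)))  # assign journal to its dominant factor
    print(f"{name}: factor {dominant}, loadings {np.round(score, 2)}")
```

In the full study the matrix would cover all 450 Web of Science B&M journals, and journals loading strongly on more than one factor would flag exactly the cross-category overlap that the paper seeks to minimise.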

Keywords

subject fields; cross-citations; business and management; factor analysis


Copyright information

© Operational Research Society Ltd. 2014

Authors and Affiliations

  1. University of Kent, Canterbury, UK
  2. University of Amsterdam, Amsterdam, The Netherlands
