Management International Review, Volume 52, Issue 3, pp. 317–340

Ranking International Business Institutions and Faculty Members Using Research Publication as the Measure

Update and Extension of Prior Research

Authors

  • Somnath Lahiri
    • Department of Management and Quantitative Methods, College of Business, Illinois State University
  • Vikas Kumar
    • Discipline of International Business, School of Business, University of Sydney
Research Article

DOI: 10.1007/s11575-011-0116-x

Cite this article as:
Lahiri, S. & Kumar, V. Manag Int Rev (2012) 52: 317. doi:10.1007/s11575-011-0116-x

Abstract

  • This study measures and ranks the productivity of academic institutions and faculty members based on the number of publications appearing in the top three core international business journals between 2001 and 2009.

  • This research serves as a useful update and extension of studies by Morrison and Inkpen (1991), Inkpen and Beamish (1994), and Kumar and Kundu (2004), which examined the top three international business journals, namely, Management International Review, Journal of International Business Studies, and Journal of World Business.

  • Copenhagen Business School, University of Miami, and University of Leeds (among institutions), and Yadong Luo, Peter J. Buckley, and Alain Verbeke (among authors) occupy the top three positions.

Keywords

International business · Leading journals · Publication · Ranking · Authors

Introduction

Scholarly interest in international business (IB) is evident in the increasing numbers of (a) researchers who contribute to the field; (b) new topics and research questions that merit the attention of these scholars; (c) business schools that emphasize the inclusion of IB courses in their curricula; and (d) journals that focus on IB or international management (Chan et al. 2005; Kumar and Kundu 2004; Morgan and Fai 2007). As institutions across the world accord ever greater weight to publications in IB journals when determining faculty members' eligibility for merit pay increases, tenure, and promotion, IB research publications have grown steadily more important to business school faculty (Chan et al. 2005; Griffith et al. 2008).

Recognizing the mounting importance of IB-focused research, several scholars have previously attempted to rank business schools based on how prolific their faculty members were in publishing in top IB journals. Such studies include, among others, rankings of (a) authors and universities publishing IB articles (Morrison and Inkpen 1991; Trevino et al. 2010); (b) authors, institutions, and discipline content of articles published in the Journal of International Business Studies (JIBS) between 1970 and 1994 (Inkpen and Beamish 1994); (c) authors and disciplines of articles published in JIBS between 1984 and 1993 (Chandy and Williams 1994); (d) international business schools based on the measure of faculty publication (Kumar and Kundu 2004); and (e) schools based on publication data between 1995 and 2004 for four leading international business journals (Chan et al. 2006).

The role and importance of ranking studies is well documented in literature spanning a variety of disciplinary areas, from Sociology (Espeland and Sauder 2007; Sauder and Espeland 2009), Logistics (Carter et al. 2009), Strategy (Baden-Fuller et al. 2000), Finance (Zivney and Bertin 1992), Economics (Grove and Wu 2007), Management (Devinney et al. 2008; Pisani 2009; Wedlin 2007; Werner 2002), Marketing (Caruana et al. 2009; Linton 2004; Mitra and Golder 2008), Information Systems (Willcocks et al. 2008), Education (Sweitzer and Volkwein 2009), and Research Methods (Mills et al. 2006), to International Business (Macharzina et al. 2004; Macharzina et al. 1993). In addition to their broad disciplinary appeal, ranking studies have long appeared in prestigious journals. Examples include Allison and Stewart (1974) in American Sociological Review; Graves et al. (1982) in American Economic Review; Chung and Cox (1990) in Journal of Finance; Tracy and Waldfogel (1997) in Journal of Business; Starbuck (2005) in Organization Science; and Mitra and Golder (2008) in Journal of Marketing.

Following the globalization of business schools, ranking studies have also examined specific geographical domains as the context of their analyses. Examples include Baden-Fuller et al. (2000) on research rankings of European business schools, Lahiri (2011) on India-focused publications in leading IB journals, Macharzina et al. (2004) on the evaluation of German research output in business administration, and Mudambi et al. (2008) on research rankings of Asia Pacific business schools. Globalization of business has led to an increase in international business/management-related articles in general, across business and management journals, and in specialized IB journals in particular. This has led some ranking studies to focus on single journals [e.g., Coudounaris et al. (2009) on Management International Review, Inkpen and Beamish (1994) on Journal of International Business Studies], and others to include the broad range of business and management journals that international business/management research has permeated [e.g., Lu (2003); Pisani (2009); Trevino et al. (2010); Werner (2002); and Werner and Brouthers (2002)].

The plethora of studies listed above is indicative of academics' continued interest in bibliometric studies to evaluate and assess institutions and academic disciplines as well as individuals. Rankings are important because they (i) reflect and create reputations for schools and individuals (Baden-Fuller et al. 2000); (ii) impact the morale and earnings of schools (Kogut 2008); and (iii) pose a threat to individuals' perceptions of their schools' identities (Elsbach and Kramer 1996). Given persistent budgetary constraints on school funding, rankings are bound to gain further importance as a performance evaluation tool for the efficient allocation of funds (Macharzina et al. 2004).

With these considerations in mind, we have conducted this study, which updates previous findings by analyzing all the articles published in three core IB journals between 2001 and 2009, a period not previously examined. By adopting a timeframe of nine years (2001–2009), this study supplements the findings of Kumar and Kundu (2004), who selected 1991–2000 as their research window. The timeframes of the two studies are not only comparable but also of sufficient length to eliminate outliers (Macharzina et al. 2004). We acknowledge that other journals such as the Journal of International Management, Journal of International Marketing, International Business Review, International Marketing Review, and Multinational Business Review have entered the IB research space and have been increasing in prominence and impact over the years, and that including them in our analysis would have provided a more general picture of IB research productivity. However, they would have been of little value for the purposes of comparison with previous findings. Further, the three selected journals have been in existence for over thirty years, significantly longer than the average lifespan of the newer IB journals mentioned above.

Apart from updating prior findings, this study extends the research by measuring and ranking the adjusted appearances of institutions and authors, in addition to calculating and ranking their total appearances. That is, this study not only analyses and ranks the appearance of the various institutions (universities or business schools) represented in the sampled publications, it also examines and ranks the appearance of the faculty members representing those institutions. By including both institutions and faculty members within its scope, this study updates and extends previous ranking-based research on the core IB journals. Updating previous ranking studies is useful for comparative purposes across time periods, especially in verifying consistency in research productivity. For example, Macharzina et al. (2004), in their update of the Macharzina et al. (1993) study, did not find much change in the ranking of German business schools. On the other hand, in an update of productivity in transportation and logistics journals, Carter et al. (2009) identified the presence of non-North American universities in the top five rankings for the first time. Our findings with regard to school rankings in this update are substantially different from previous findings, indicating the increased global interest in and importance of core international business publications.

Research Methodology

Consistent with the work of Kumar and Kundu (2004), this study considers three core journals in the field of international business: Management International Review (MIR), Journal of International Business Studies (JIBS), and Journal of World Business (JWB) (known as Columbia Journal of World Business until 1996). The rationale for selecting these particular journals has been explained by the authors (Kumar and Kundu 2004, pp. 216–217). Specifically, the relevant journals have now been in existence for almost thirty years and publish articles focusing on a wide variety of business topics that are international or global in nature. For example, in a study of the diffusion of international management research in the top 20 management journals, Pisani (2009) notes that during 2002–2006 over 70% of the articles published in MIR and JIBS focused on international dimensions (12 distinct categories) of management. Moreover, all three journals have been suggested as core IB journals in prior research (Acedo and Casillas 2005; DuBois and Reeb 2000). The inclusion of these three established journals enables better representation of IB research publications and thereby assists in generalizing the various findings. Since the publication of the study by Kumar and Kundu (2004), several studies have considered these three journals together when conducting ranking-based research on publications appearing in leading IB journals (Griffith et al. 2008; Lahiri 2011; Quer et al. 2007; Xu et al. 2008). We acknowledge that IB research has permeated many other journals in addition to MIR, JIBS, and JWB, and that recent studies (e.g., Trevino et al. 2010) have focused on IB articles published in a range of premier business and management journals in their rankings of institutions and scholars. Given that our aim in this paper is to extend and update the previous findings of Kumar and Kundu (2004), we have refrained from adding new journals in this study.

To initiate our research, every article published between 2001 and 2009 (a 9-year window) in the three core IB journals was downloaded using the bibliographic database ProQuest. As the focus was on research articles only, certain categories of publication were not considered. These included editorials, obituaries, errata, biblio services, book reviews, and thank-you notes to reviewers. However, research notes and guest editors' introductions to special issues were included. A total of 1098 articles (318 in MIR, 498 in JIBS, 282 in JWB) formed the final sample. The number of authors contributing to each article mostly ranged between 1 and 3 (991 articles in total). However, 107 articles were co-authored by more than three authors. The maximum number of authors for any single article was 49 (Ralston et al. 2009), followed by Waldman et al. (2006), which had 41 authors. Hofstede et al. (2002) was co-authored by 17 authors, and Fu et al. (2004) had 15 authors. For each article, we reviewed the names of the contributing faculty members and their institutional affiliation(s). After all the publications had been tabulated, the number of times any particular author or institution appeared in each of the three journals was recorded.

Absolute Productivity of Institutions

For assessing and ranking institutional productivity, a procedure for computing the absolute (i.e., total or raw) appearance of each institution was applied. We considered the academic or non-academic institutions of which the contributing authors were members at the time of publication. Total appearance refers to the number of times an institution appeared in the research sample. Each time an appearance was observed, a credit of 1 (one) was accorded to the institution. If an article was co-authored by more than one author from the same institution, the institution was credited with more than one appearance (i.e., 2 or more, as the case may be). Following prior research (Coudounaris et al. 2009; Kumar and Kundu 2004; Quer et al. 2007), no distinction was made regarding the order of appearance of institutions; each appearance counted as one credit. Further, no distinction was made based on the journal name; all three journals were assumed to be equally important. Individual appearance scores resulting from publications within a particular journal (say MIR) were added to represent the summated score for that journal. The total appearance score for a particular institution in the sample was calculated by aggregating the summated scores for all three journals.¹
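The counting rule above can be illustrated with a minimal sketch. The article data and institution names below are invented for illustration; this is not the authors' code or data.

```python
from collections import Counter

# Hypothetical sample: each article is a list of (author, institution)
# affiliations at the time of publication. All names are invented.
articles = [
    [("Author A", "School X"), ("Author B", "School X")],  # two co-authors, same school
    [("Author C", "School Y")],
    [("Author A", "School X"), ("Author D", "School Z")],
]

def total_appearances(articles):
    """One credit per author appearance, regardless of author order or
    journal; two co-authors from the same school yield two credits."""
    counts = Counter()
    for article in articles:
        for _author, institution in article:
            counts[institution] += 1
    return counts

print(total_appearances(articles))
# School X receives 3 credits; School Y and School Z receive 1 each.
```

Summing such per-journal tallies gives the total appearance scores reported in Table 1.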

Table 1 highlights and ranks the total appearance of the top 25 institutions during 2001–2009 in JIBS, MIR, and JWB.
Table 1

Total appearance of institutions during 2001–2009 in JIBS, MIR, and JWB

Name of institution                Rank   Total appearance   JIBS   MIR   JWB
Chinese University of Hong Kong      1          57            34      8    15
University of Leeds                  2          51            23     23     5
Copenhagen Business School           3          49            17     24     8
Rutgers University                   4          38            23      9     6
University of South Carolina         4          38            19      8    11
University of London                 4          38            16     15     7
University of Western Ontario        7          33            10     16     7
Michigan State University            8          32            21      1    10
University of Miami, USA             9          30            19      6     5
University of Hong Kong             10          28            20      6     2
Indiana University                  11          27            22      5     0
City University of Hong Kong        12          25            21      2     2
University of Reading               12          25            12     13     0
Northeastern University             14          23             6      3    14
University of Queensland            15          21             2      9    10
Texas (A&M) University              15          21            13      6     2
York University                     15          21            10      9     2
University of Calgary               18          20             8      9     3
Simon Fraser University             18          20             8      3     9
Ohio State University               20          19            13      2     4
Temple University                   20          19            10      8     1
National University of Singapore    22          18             9      7     2
University of Oklahoma              22          18            11      0     7
Tilburg University                  22          18            15      3     0
University of Cambridge             25          17             3      9     5

As seen in Table 1, the Chinese University of Hong Kong ranks first, followed by the University of Leeds, Copenhagen Business School, and Rutgers University. The total appearance of institutions ranges from 17 to 57 (i.e., a range of 40), with the maximum range in JIBS (32), followed by MIR (24) and JWB (15). Salient differences are observed with respect to the results of the previous study by Kumar and Kundu (2004). In that study, which used 1991–2000 as its timeframe, the four top-ranked universities were the University of Western Ontario, University of South Carolina, University of Texas at Austin, and Michigan State University. The Chinese University of Hong Kong (ranked 1st in the current study) was ranked 12th in Kumar and Kundu (2004). Similarly, the University of Leeds (currently 2nd) ranked 29th, and Rutgers University (currently 4th) ranked 17th in the earlier study. Copenhagen Business School, ranked 3rd in the current study, did not appear in the list of the top 50 schools in Kumar and Kundu (2004). Further, in Kumar and Kundu (2004), total appearance ranged from 7 to 34 (i.e., a range of 27). The average total appearance of the top four institutions in this study is 48.75, a 75.67% rise over 27.75, the average total appearance in the earlier study. From a geographic standpoint, the top four ranked institutions in the current study are from Hong Kong, the UK, Denmark, and the USA, whereas in the earlier study the top-ranked institution was Canada-based and the remaining three were from the USA.

Updating Table 2 of Kumar and Kundu (2004, p. 220), Table 2 compares the top ten universities based on publications in JIBS across four studies, beginning with Morrison and Inkpen (1991).
Table 2

Comparison of the top ten universities in Journal of International Business Studies across four studies

Study by Inkpen and Beamish (1994), 1970–1982 (total appearance):
Columbia University (14); University of South Carolina* (13); Georgia State University (12); Michigan State University* (11); University of Wisconsin (11); New York University (10); Harvard University (9); Ohio State University (9); Pennsylvania (9); University of Texas at Austin (8)

Study by Morrison and Inkpen (1991), 1980–1989 (total appearance):
University of South Carolina* (31); Pennsylvania (18); New York University (15); Rutgers University (14); McGill University (11); Michigan State University* (11); Columbia University (11); Western Ontario (10); Ohio State University (8); University of Southern California (5)

Study by Kumar and Kundu (2004), 1991–2000 (total appearance):
University of Western Ontario (25); University of South Carolina* (23); Georgetown University (15); University of Texas at Austin (14); Michigan State University* (14); University of Pennsylvania (10); University of Hawaii (9); Thunderbird, The American Graduate School of International Management (9); Chinese University of Hong Kong (9); Temple University (9)

Present study (2010), 2001–2009 (total appearance):
Chinese University of Hong Kong (34); Rutgers University (23); University of Leeds (23); Indiana University (22); City University of Hong Kong (21); Michigan State University* (21); University of Hong Kong (20); University of Miami (19); University of South Carolina* (19); Copenhagen Business School (17)

*Institutions ranked in the top ten in each of the four studies

As is evident in the above table, the Chinese University of Hong Kong made the maximum contribution to JIBS during 2001–2009 (34 total appearances), compared to the University of Western Ontario in 1991–2000, which made 25 total appearances. The three institutions that immediately follow the Chinese University of Hong Kong in this study are Rutgers University, the University of Leeds, and Indiana University. Comparing Table 2 of Kumar and Kundu (2004) with the current study indicates that institutional representation in JIBS underwent significant change between 1991–2000 and 2001–2009. Specifically, with the exception of two universities (University of South Carolina and Michigan State University), the two lists feature entirely different institutions. Interestingly, the University of South Carolina and Michigan State University are the only two institutions to have featured in the top ten lists of all four ranking-based studies: Inkpen and Beamish (1994), Morrison and Inkpen (1991), Kumar and Kundu (2004), and the current study. It is also noteworthy that five of the top ten universities in the current study (2001–2009 timeframe) are from outside North America (three from Hong Kong and two from Europe), a significant increase from just one (from Hong Kong) in the top ten during the 1991–2000 timeframe. This finding of an increased focus on IB research at Hong Kong universities is in line with the rankings of Asia Pacific business schools in Mudambi et al. (2008, pp. 177–178), where four and five of the top ten schools for the 1990–2006 period are from Hong Kong, based on raw publication counts in the top 24 leading business journals and the top seven management journals, respectively.

Absolute Productivity of Authors

To assess faculty (i.e., author) productivity, a similar procedure for computing the total appearance of each author was adopted. Each time an appearance was observed, a credit of 1 (one) was accorded to the author, even if several authors contributed to a particular article. This procedure was followed for all three journals. No distinction was made regarding the order of appearance of authors; each appearance counted as one credit. Further, no distinction was made based on the journal name; all three journals were assumed to be equally important. Individual total appearance scores from the three journals were summed to calculate the aggregate score for each author. As with institutions, faculty members affiliated with different campuses of the same university were considered separately.

Table 3 highlights and ranks the top 25 individual authors during 2001–2009 in JIBS, MIR, and JWB.
Table 3

Total appearance of authors during 2001–2009 in JIBS, MIR, and JWB

Name of author        Rank   Total appearance   JIBS   MIR   JWB
Yadong Luo              1          24            13      6     5
Peter J. Buckley        2          21            12      7     2
Paul W. Beamish         3          16             3      9     4
Alain Verbeke           4          14             7      7     0
David A. Griffith       5          12             7      0     5
Igor Filatotchev        6          11             6      2     3
Klaus E. Meyer          7          10             6      2     2
Alan M. Rugman          7          10             5      4     1
S. Tamer Cavusgil       7          10             7      1     2
Torben Pedersen         7          10             5      4     1
Trevor Buck             7          10             5      2     3
Oded Shenkar           12           9             8      1     0
Jonathan P. Doh        12           9             3      2     4
Jeremy Clegg           12           9             3      5     1
Lorraine Eden          15           8             6      2     0
Sumit K. Kundu         15           8             3      4     1
Bent Petersen          15           8             2      5     1
John H. Dunning        18           7             6      1     0
Mike Peng              18           7             4      0     3
Marjorie A. Lyles      18           7             6      1     0
Masaaki Kotabe         18           7             5      1     1
Yigang Pan             18           7             3      3     1
Andrew Delios          18           7             3      4     0
Ram Mudambi            18           7             3      4     0
Pervez N. Ghauri       18           7             2      2     3

Table 3 indicates that Yadong Luo contributed the most by making the maximum number of appearances (24), followed by Peter Buckley (21), Paul Beamish (16), Alain Verbeke (14), and others. The range of total appearances is 17 (7–24), with JIBS, MIR, and JWB exhibiting individual ranges of 11, 9, and 5, respectively. Yadong Luo was the most prolific author in both JIBS and JWB, whereas in MIR it was Paul Beamish who made the maximum contribution, appearing nine times. The average total appearance score of the top four authors is 18.75. As of January 2010, the four top authors were affiliated with the University of Miami, University of Leeds, University of Western Ontario, and University of Calgary, respectively. Since Kumar and Kundu (2004) did not consider author contribution in their research, we cannot compare the current findings with similar results from their study. However, our findings support and confirm the ranking of the most prolific MIR authors by Coudounaris et al. (2009): our top four authors rank in the top five in terms of publishing articles in MIR during the 1993–2007 period. This is indicative of these authors' continued productivity, as well as their interest in simultaneously publishing in all three core IB journals.

Adjusted Productivity of Institutions

In addition to total, raw, or absolute counts, most recent ranking-based studies employ adjusted measures or counts that correct for the over-crediting of contributions by institutions and authors (Coudounaris et al. 2009; Lahiri 2011; Quer et al. 2007; Xu et al. 2008). For example, if two faculty members from the same institution contribute to a particular article, then the procedure for obtaining the total count credits two points to that institution. However, in order to be accurate and fair, each appearance should give the institution half a point, for a total of one point rather than two. Similarly, if two authors contribute to a particular article, then each should be eligible for half a point, based on the assumption that each contributed half of the total effort needed to produce the publication. The procedure for obtaining the total count would grant one point to each author (as opposed to half a point), which amounts to over-crediting.

To keep abreast of recent research trends and to overcome one limitation of Kumar and Kundu (2004), we calculated the adjusted appearance relating to each article to assess the adjusted productivity of the contributing institutions. For adjusted appearance, a sole-authored article resulted in a score of 1 for the affiliated institution. An article by two authors earned an adjusted score of 0.5 for each affiliated institution, a three-author article resulted in a score of 0.33, and so on. This methodology for calculating adjusted appearances is similar to that adopted by Macharzina et al. (2004, p. 342).² Adjusted appearance scores from the three journals were summed to calculate the aggregate score for each institution in the research sample.
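The fractional-credit rule can be sketched as follows. The data and names are invented for illustration; note also that this sketch keeps exact fractions, whereas the published tables record thirds truncated to two decimals (e.g., 0.33).

```python
from collections import defaultdict

def adjusted_appearances(articles):
    """Split one point per article equally among its n authors
    (0.5 each for two authors, one third each for three, and so on),
    crediting each author's share to his or her institution."""
    scores = defaultdict(float)
    for article in articles:
        share = 1.0 / len(article)
        for _author, institution in article:
            scores[institution] += share
    return dict(scores)

# Hypothetical three-author article: School X hosts two of the three
# authors, so it receives 2/3 of a point and School Y receives 1/3.
sample = [[("Author A", "School X"), ("Author B", "School X"), ("Author C", "School Y")]]
print(adjusted_appearances(sample))
```

The same function, applied per author instead of per institution, yields the adjusted author scores used later in the paper.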

Table 4 shows the adjusted appearance of the top 25 Institutions during 2001–2009 in JIBS, MIR, and JWB.
Table 4

Adjusted appearance of institutions during 2001–2009 in JIBS, MIR, and JWB

Name of institution                Rank   Adjusted appearance   JIBS    MIR    JWB
Copenhagen Business School           1          24.79            7.65  12.14   5
University of Miami                  2          19.91           11.58   3.83   4.5
University of Leeds                  3          18.85            8.66   7.86   2.33
Chinese University of Hong Kong      4          18.56           10.57   2.66   5.33
University of South Carolina         5          17.98            9.16   3.66   5.16
Rutgers University                   6          16.16            9.5    4.16   2.5
University of Western Ontario        7          14.42            4.68   6.83   2.91
University of London                 8          12.9             4.71   4.69   3.5
Simon Fraser University              9          10.97            4.52   2.2    4.25
Northeastern University             10          10.83            3.25   1.5    6.08
University of Reading               11          10.81            5.83   4.98   0
University of Hong Kong             12          10.33            7.75   2.17   0.41
Indiana University                  13           9.9             7.82   2.08   0
Ohio State University               14           9.66            6.83   1.5    1.33
University of Cambridge             15           9.19            1.33   6.11   1.75
University of Melbourne             16           9.16            4.66   2.5    2
Erasmus University                  17           9.11            5.95   1.66   1.5
Michigan State University           18           9.01            5.1    0.5    3.41
Tilburg University                  19           8.68            6.18   2.5    0
National University of Singapore    20           8.21            3.72   3.16   1.33
City University of Hong Kong        21           7.87            6.21   0.83   0.83
University of Queensland            22           7.67            0.66   3.91   3.1
Hong Kong Baptist University        23           7.66            1.41   3.25   3
York University                     24           7.58            3.75   3.08   0.75
University of Oklahoma              25           7.49            5.26   0      2.23

Table 4 indicates that the adjusted appearances range from 7.49 to 24.79, with the top four institutions being Copenhagen Business School, University of Miami, University of Leeds, and the Chinese University of Hong Kong. These four institutions also rank in the top ten based on absolute counts of publications in JIBS, MIR, and JWB (see Table 1), with three of them occupying the top three ranks based on absolute counts. This shows that there is some degree of correlation between adjusted and absolute counts. The range of adjusted appearance in JIBS is 10.92 (0.66–11.58), while in MIR it is 12.14 (0–12.14) and in JWB 6.08 (0–6.08). The University of Miami achieved the maximum adjusted appearance score (11.58) in JIBS, while Copenhagen Business School obtained the highest score (12.14) in MIR. Northeastern University, with a score of 6.08, made the highest contribution in JWB. The average adjusted appearance score of the top four institutions in the overall sample is 20.53.

To aid simultaneous comparison of the total and adjusted appearances of institutions and their relative ranking, Table 5 highlights and ranks the top 25 institutions across the three journals. The ranking is ordered first by the number of adjusted appearances, followed by the number of total appearances, as has been done in prior research (Coudounaris et al. 2009; Lahiri 2011; Xu et al. 2008).
Table 5

Total and adjusted appearance of institutions during 2001–2009 in JIBS, MIR, and JWB

JIBS
Name of institution                Rank   Total appearance   Adjusted appearance
University of Miami                  1          19                 11.58
Chinese University of Hong Kong      2          34                 10.57
Rutgers University                   3          23                  9.5
University of South Carolina         4          19                  9.16
University of Leeds                  5          23                  8.66
Indiana University                   6          22                  7.82
University of Hong Kong              7          20                  7.75
Copenhagen Business School           8          17                  7.65
Ohio State University                9          13                  6.83
Harvard Business School             10          12                  6.56
City University of Hong Kong        11          21                  6.21
Tilburg University                  12          15                  6.18
University of Pennsylvania          13          10                  6.05
Erasmus University                  14          12                  5.95
University of Reading               15          12                  5.83
University of Oklahoma              16          11                  5.26
University of Minnesota             17          12                  5.2
Michigan State University           18          21                  5.1
Texas (A & M) University            19          13                  5.03
New York University                 20          10                  4.85
University of London                21          16                  4.71
University of Western Ontario       22          10                  4.68
University of Melbourne             23           9                  4.66
INSEAD, Singapore                   24           8                  4.66
Simon Fraser University             25           8                  4.52
Hong Kong Polytechnic University    26           8                  4.33

MIR
Name of institution                Rank   Total appearance   Adjusted appearance
Copenhagen Business School           1          24                 12.14
University of Leeds                  2          23                  7.86
University of Western Ontario        3          16                  6.83
University of Cambridge              4           9                  6.11
University of Reading                5          13                  4.98
University of London                 6          15                  4.69
University of Hohenheim              7           6                  4.33
Rutgers University                   8           9                  4.16
University of Queensland             9           9                  3.91
University of Miami                 10           6                  3.83
University of South Carolina        11           8                  3.66
University of Bath                  12           7                  3.5
Nanyang Technological University    13           5                  3.42
Hong Kong Baptist University        14           6                  3.25
National University of Singapore    15           7                  3.16
York University                     16           9                  3.08
Thunderbird                         17           6                  3
Temple University                   18           8                  2.96
Saint Louis University              19           7                  2.83
University of New South Wales       20           6                  2.75
Chinese University of Hong Kong     21           8                  2.66
Florida International University    21           7                  2.66
University of Portland              21           4                  2.66
University of Calgary               24           9                  2.56
Yonsei University                   25           6                  2.5
Florida Atlantic University         26           5                  2.5
University of Melbourne             26           5                  2.5

JWB
Name of institution                          Rank   Total appearance   Adjusted appearance
Northeastern University                        1          14                  6.08
Chinese University of Hong Kong                2          15                  5.33
University of South Carolina                   3          11                  5.16
Copenhagen Business School                     4           8                  5
University of Miami                            5           5                  4.5
Simon Fraser University                        6           9                  4.25
Katholieke Universiteit, Belgium               7           8                  3.66
University of London                           8           7                  3.5
Michigan State University                      9          10                  3.41
University of Queensland                      10          10                  3.1
Yonsei University                             11           6                  3.08
Hong Kong Baptist University                  12           4                  3
University of Western Ontario                 13           7                  2.91
Bradford University                           14           5                  2.86
Boğaziçi University                           15           8                  2.5
University of Mississippi                     16           7                  2.5
Rutgers University                            17           6                  2.5
University of Nebraska-Lincoln                17           6                  2.5
Hong Kong Polytechnic University              19           5                  2.33
University of Leeds                           19           5                  2.33
University of Manchester                      19           5                  2.33
Loyola Marymount University                   22           4                  2.33
University of Oklahoma                        23           7                  2.23
Georgetown University                         24           4                  2.125
University of North Carolina at Chapel Hill   25           5                  2
Villanova University                          25           5                  2

Table 5 indicates that the four top-ranked institutions in JIBS are the University of Miami, Chinese University of Hong Kong, Rutgers University, and University of South Carolina, whereas the top four in MIR are Copenhagen Business School, University of Leeds, University of Western Ontario, and University of Cambridge. The four top-ranked institutions in JWB are Northeastern University, Chinese University of Hong Kong, University of South Carolina, and Copenhagen Business School. Not a single institution was consistently present in the top five of all three journals. This suggests that a journal's editorial home base may influence which regions' institutions it attracts submissions from. For example, three of the top four institutions publishing in JIBS (editorial home base in the USA) are US institutions, while three of the top four institutions publishing in MIR (editorial home base in Germany) are European institutions. JWB presents a more balanced picture in this regard.
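The composite ordering described above (adjusted appearances first, with ties broken by total appearances) can be sketched with a composite sort key. The records below are invented for illustration.

```python
# Invented records: (institution, total appearance, adjusted appearance).
records = [
    ("School A", 23, 7.86),
    ("School B", 16, 6.83),
    ("School C", 24, 12.14),
    ("School D", 9, 6.83),
]

# Rank by adjusted appearance (descending); where adjusted scores tie,
# break the tie by total appearance (descending).
ranked = sorted(records, key=lambda r: (-r[2], -r[1]))
print([name for name, _total, _adjusted in ranked])
# ['School C', 'School A', 'School B', 'School D']
```

Schools B and D tie on the adjusted score (6.83), so B's higher total appearance (16 vs. 9) places it first.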

Adjusted Productivity of Authors

To assess the adjusted productivity of authors, the adjusted appearance of each faculty member was computed, as was done for the institutions. Any author who published an article as a single author was accorded a score of 1. An article co-authored by two authors counted as 0.5 for each author; an article co-authored by three authors counted as 0.33 for each author, and so on [see Macharzina et al. (2004, p. 342)]. Individual adjusted appearance scores from respective publications were summed to calculate the aggregate adjusted score for each author. Table 6 shows the adjusted appearance of the top 25 authors during 2001–2009 in JIBS, MIR, and JWB.
Table 6

Adjusted appearance of authors during 2001–2009 in JIBS, MIR, and JWB

Name of author | Rank | Total adjusted appearance | JIBS | MIR | JWB
Yadong Luo | 1 | 17.24 | 8.74 | 4 | 4.5
Peter J. Buckley | 2 | 9.30 | 5.66 | 2.31 | 1.33
Alain Verbeke | 3 | 7.33 | 4 | 3.33 | 0
Klaus E. Meyer | 4 | 6.75 | 4.5 | 0.75 | 1.5
Paul W. Beamish | 5 | 6.14 | 1.16 | 3.65 | 1.33
John H. Dunning | 6 | 5.83 | 4.83 | 1.0 | 0
Alan M. Rugman | 7 | 5 | 2.5 | 2 | 0.5
David A. Griffith | 8 | 4.64 | 2.23 | 0 | 2.41
Oded Shenkar | 9 | 4.49 | 3.99 | 0.5 | 0
Hemant Merchant | 10 | 4 | 0 | 3 | 1
Michael J. Enright | 11 | 3.7 | 1.2 | 2.5 | 0
Trevor Buck | 12 | 3.66 | 1.58 | 0.5 | 1.58
Torben Pedersen | 13 | 3.53 | 1.7 | 1.5 | 0.33
Shih-Fen S. Chen | 14 | 3.5 | 3.5 | 0 | 0
Paul D. Ellis | 14 | 3.5 | 2.5 | 0.5 | 0.5
Jean-François Hennart | 14 | 3.5 | 2.5 | 1 | 0
Bent Petersen | 17 | 3.49 | 0.66 | 2.5 | 0.33
Jonathan P. Doh | 18 | 3.41 | 1 | 0.75 | 1.66
Mike W. Peng | 19 | 3.33 | 2.33 | 0 | 1
Alvaro Cuervo-Cazurra | 19 | 3.33 | 2.83 | 0.5 | 0
Andrew Delios | 21 | 3.16 | 1.33 | 1.83 | 0
Sumit K. Kundu | 22 | 3.15 | 1.16 | 1.66 | 0.33
Igor Filatotchev | 23 | 3.11 | 1.83 | 0.5 | 0.78
Ram Mudambi | 24 | 3.08 | 1.33 | 1.75 | 0
Pervez N. Ghauri | 24 | 3.08 | 0.75 | 0.83 | 1.5

According to Table 6, total adjusted appearances range from 3.08 to 17.24, with the top four authors being Yadong Luo (17.24), Peter J. Buckley (9.30), Alain Verbeke (7.33), and Klaus E. Meyer (6.75). Adjusted appearance scores range from 0 to 8.74 in JIBS, from 0 to 4.0 in MIR, and from 0 to 4.5 in JWB. Yadong Luo attained the maximum adjusted appearance score in all three journals: 8.74 in JIBS, 4.0 in MIR, and 4.5 in JWB. The average score of the top four authors is approximately 10.16. As was the case for institutions, the author rankings based on adjusted and absolute scores exhibit relatively high correlation: three of the top four authors based on adjusted scores (Table 6) are also in the top four positions based on absolute scores (Table 3).

To aid a simultaneous comparison of the total and adjusted appearance of authors and their relative ranking, Table 7 highlights and ranks the top 25 authors across the three journals. As with institutions, the ranking is ordered first by the number of adjusted appearances, followed by the number of total appearances.
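The ordering rule just described (adjusted appearance first, total appearance as tie-breaker) amounts to a two-key descending sort. A sketch with hypothetical names and figures:

```python
# Hypothetical (name, total appearance, adjusted appearance) records.
authors = [
    ("Author X", 4, 3.5),
    ("Author Y", 3, 3.5),
    ("Author Z", 5, 2.5),
]
# Sort by adjusted appearance (descending), breaking ties on total appearance.
ranked = sorted(authors, key=lambda rec: (-rec[2], -rec[1]))
for rank, (name, total, adjusted) in enumerate(ranked, start=1):
    print(rank, name, total, adjusted)
# Author X outranks Author Y: equal adjusted scores, but more total appearances.
```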
Table 7

Total and adjusted appearance of authors during 2001–2009 in JIBS, MIR, and JWB

JIBS

Name of author | Rank | Total appearance | Adjusted appearance
Yadong Luo | 1 | 13 | 8.74
Peter J. Buckley | 2 | 12 | 5.66
John H. Dunning | 3 | 6 | 4.83
Klaus E. Meyer | 4 | 6 | 4.5
Alain Verbeke | 5 | 7 | 4
Oded Shenkar | 6 | 8 | 3.99
Shih-Fen S. Chen | 7 | 4 | 3.5
Alvaro Cuervo-Cazurra | 8 | 4 | 2.83
David A. Ralston | 9 | 5 | 2.77
Bernard Yeung | 10 | 5 | 2.66
Alan M. Rugman | 11 | 5 | 2.5
Jean-François Hennart | 12 | 4 | 2.5
Paul D. Ellis | 13 | 3 | 2.5
Lorraine Eden | 14 | 6 | 2.33
Chuck C. Y. Kwok | 15 | 5 | 2.33
Mike Peng | 16 | 4 | 2.33
Anoop Madhok | 17 | 3 | 2.33
David A. Griffith | 18 | 7 | 2.22
William Newburry | 19 | 4 | 2.16
Marjorie A. Lyles | 20 | 6 | 2.11
Chung-Ming Lau | 21 | 5 | 2
Michael A. Witt | 22 | 3 | 2
Witold J. Henisz | 22 | 3 | 2
Ravi Ramamurty | 24 | 2 | 2
Tamir Agmon | 24 | 2 | 2
Ingmar Bjorkman | 26 | 6 | 1.94

MIR

Name of author | Rank | Total appearance | Adjusted appearance
Yadong Luo | 1 | 6 | 4
Paul W. Beamish | 2 | 9 | 3.65
Alain Verbeke | 3 | 7 | 3.33
Hemant Merchant | 4 | 4 | 3
Lei Li | 5 | 4 | 2.66
Bent Petersen | 6 | 5 | 2.5
Christos N. Pitelis | 6 | 3 | 2.5
Michael J. Enright | 6 | 3 | 2.5
Peter J. Buckley | 9 | 7 | 2.31
Alan M. Rugman | 10 | 4 | 2
Alfredo J. Mauri | 11 | 3 | 2
Eric W. K. Tsang | 12 | 2 | 2
Jan Hendrik Fisch | 12 | 2 | 2
Lilach Nachum | 12 | 2 | 2
Andrew Delios | 15 | 4 | 1.83
Volker Mahnke | 15 | 4 | 1.83
Hongxin Zhao | 15 | 4 | 1.83
Denice E. Welch | 18 | 3 | 1.83
Farok J. Contractor | 18 | 3 | 1.83
Robert Pearce | 18 | 3 | 1.83
Ram Mudambi | 21 | 4 | 1.75
Markus Venzin | 22 | 4 | 1.66
Sumit K. Kundu | 22 | 4 | 1.66
Torben Pedersen | 24 | 4 | 1.5
Julian Birkinshaw | 25 | 3 | 1.5
Nicholas A. Athanassiou | 25 | 3 | 1.5

JWB

Name of author | Rank | Total appearance | Adjusted appearance
Yadong Luo | 1 | 5 | 4.5
Eunmi Chang | 2 | 3 | 2.5
Snejina Michailova | 2 | 3 | 2.5
David A. Griffith | 4 | 5 | 2.41
Maddy Janssens | 5 | 4 | 2.16
Daniel J. McCarthy | 6 | 4 | 2
Sheila M. Puffer | 6 | 4 | 2
Chris Rowley | 8 | 4 | 1.75
Michael R. Czinkota | 9 | 3 | 1.75
Jonathan P. Doh | 10 | 4 | 1.66
Michael G. Harvey | 11 | 4 | 1.58
Trevor Buck | 12 | 3 | 1.58
Pervez Ghauri | 13 | 3 | 1.5
Klaus E. Meyer | 14 | 2 | 1.5
Richard B. Peterson | 14 | 2 | 1.5
Thang V. Nguyen | 14 | 2 | 1.5
Paul W. Beamish | 17 | 4 | 1.33
David M. Schweiger | 18 | 3 | 1.33
Mila B. Lazarova | 18 | 3 | 1.33
Paula Caligiuri | 18 | 3 | 1.33
Rosalie L. Tung | 18 | 3 | 1.33
Yongsun Paik | 18 | 3 | 1.33
Deli Yang | 23 | 2 | 1.33
Peter J. Buckley | 23 | 2 | 1.33
Pawan Budhwar | 25 | 3 | 1.16
Seung-Hyun Lee | 26 | 3 | 1.08

Table 7 suggests that the four top-ranked authors in JIBS are Yadong Luo, Peter J. Buckley, John H. Dunning, and Klaus E. Meyer, whereas the top four in MIR are Yadong Luo, Paul W. Beamish, Alain Verbeke, and Hemant Merchant. The four top-ranked authors in JWB are Yadong Luo, Eunmi Chang, Snejina Michailova, and David A. Griffith. The most consistent author is Yadong Luo, who ranked first in each of the three journals. Any regional association between an author's home institution and the journal in which he or she publishes is harder to discern than the institution-level association noted above.

Consistency of Institutions Across Journals

To exhibit the consistency of institutional appearances, Table 8 highlights institutions that featured in the top ten in all three journals.
Table 8

Consistency of institutions across the top three international business journals

Institutions ranked in top ten in all three journals | 2001–2009
Copenhagen Business School | JIBS, MIR, JWB
University of Miami | JIBS, MIR, JWB

Institutions ranked in top ten in two journals |
University of Leeds | JIBS, MIR
Chinese University of Hong Kong | JIBS, JWB
University of South Carolina | JIBS, JWB
Rutgers University | JIBS, MIR
University of Queensland | MIR, JWB

The table shows that there are only two such institutions: Copenhagen Business School (total and adjusted appearance 49 and 24.79, respectively) and the University of Miami (total and adjusted appearance 30 and 19.91, respectively). Table 8 also highlights five institutions that feature in the top ten of two of the three journals. Their total and adjusted appearances are as follows: University of Leeds (51, 18.85); Chinese University of Hong Kong (57, 18.56); University of South Carolina (38, 17.98); Rutgers University (38, 16.16); and University of Queensland (21, 7.67). In the study by Kumar and Kundu (2004, p. 223), only one institution (University of Texas at Austin) was consistent across the three journals, and the institutions that featured in two of the three journals were the University of Western Ontario, Michigan State University, University of South Carolina, and University of Hawaii. It is noteworthy that only one university, the University of South Carolina, maintained a top-ten position in both 1991–2000 (Kumar and Kundu 2004) and 2001–2009 (current study). Such volatility in rankings is difficult to explain; however, the movement of highly productive faculty, funding criteria tied more directly to IB journal publications, and changes in the overall focus of a department or school are among the many factors that offer a partial explanation.
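The consistency analysis in Tables 8 and 9 amounts to intersecting per-journal top-ten lists. A sketch with placeholder institution names (not the actual top-ten sets):

```python
# Hypothetical top-ten membership per journal.
top10 = {
    "JIBS": {"Inst A", "Inst B", "Inst C"},
    "MIR":  {"Inst A", "Inst B", "Inst D"},
    "JWB":  {"Inst B", "Inst C", "Inst D"},
}
# Institutions in the top ten of all three journals.
in_all_three = set.intersection(*top10.values())
# Institutions in the top ten of exactly two journals.
in_exactly_two = {
    inst
    for inst in set.union(*top10.values())
    if sum(inst in members for members in top10.values()) == 2
}
print(sorted(in_all_three))    # ['Inst B']
print(sorted(in_exactly_two))  # ['Inst A', 'Inst C', 'Inst D']
```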

Consistency of Authors Across Journals

To exhibit the consistency of author appearances, Table 9 highlights those authors who featured in the top ten in all three journals.
Table 9

Consistency of authors across the top three international business journals

Authors ranked in top ten in all three journals | 2001–2009
Yadong Luo | JIBS, MIR, JWB

Authors ranked in top ten in two journals |
Peter J. Buckley | JIBS, MIR
Alain Verbeke | JIBS, MIR

The table indicates that only one author, Yadong Luo, featured in the top ten of all three journals; his total and adjusted appearance scores are 24 and 17.24, respectively. Two authors, Peter J. Buckley (scores 21 and 9.30) and Alain Verbeke (scores 14 and 7.33), featured in the top ten of two of the three journals, in both cases JIBS and MIR. Since Kumar and Kundu (2004) did not consider the contribution of individual authors, we cannot compare our findings with their study. However, our rankings are quite consistent with those based on publications in MIR alone during 1993–2007, as presented in Coudounaris et al. (2009).

To depict a comprehensive picture of how rankings have changed since the study by Kumar and Kundu (2004), Table 10 presents the change in rank of the institutions identified in the current study as the top 25 based on absolute appearance.
Table 10

Comparison of institutional ranking across two studies

Name of university | Rank (2001–2009) | No. of articles (2001–2009) | Rank (1991–2000) | No. of articles (1991–2000) | Change in rank between the time periods
Chinese University of Hong Kong | 1 | 57 | 12 | 14 | +11
University of Leeds | 2 | 51 | 29 | 9 | +27
Copenhagen Business School | 3 | 49 | – | – | Not in top 50
Rutgers University | 4 | 38 | 17 | 11 | +13
University of South Carolina | 4 | 38 | 2 | 31 | −2
University of London | 4 | 38 | – | – | Not in top 50
University of Western Ontario | 7 | 33 | 1 | 34 | −6
Michigan State University | 8 | 32 | 4 | 22 | −4
University of Miami | 9 | 30 | – | – | Not in top 50
University of Hong Kong | 10 | 28 | 14 | 13 | +4
Indiana University | 11 | 27 | 23 | 10 | +12
City University of Hong Kong | 12 | 25 | – | – | Not in top 50
University of Reading | 12 | 25 | 17 | 11 | +5
Northeastern University | 14 | 23 | 10 | 15 | −4
University of Queensland | 15 | 21 | – | – | Not in top 50
Texas A&M University | 15 | 21 | 29 | 9 | +14
York University | 15 | 21 | – | – | Not in top 50
University of Calgary | 18 | 20 | – | – | Not in top 50
Simon Fraser University | 18 | 20 | 23 | 10 | +5
Ohio State University | 20 | 19 | 43 | 7 | +23
Temple University | 20 | 19 | 14 | 13 | −4
National University of Singapore | 22 | 18 | 43 | 7 | +21
University of Oklahoma | 22 | 18 | 9 | 16 | −13
Tilburg University | 22 | 18 | – | – | Not in top 50
University of Cambridge | 25 | 17 | – | – | Not in top 50
Erasmus University | 26 | 17 | – | – | Not in top 50

From the table it is evident that the rankings of 10 universities have improved relative to 1991–2000, while those of 6 institutions have declined. Ten universities did not feature in the top 50 list of Kumar and Kundu (2004), so their change in ranking cannot be ascertained; these universities have nonetheless made a substantial scholarly contribution by securing a place in the current top 25 list.

Conclusion

This study has updated and extended prior research by measuring and ranking the productivity of academic institutions and faculty members based on the number of their publications appearing in the top three core international business journals between 2001 and 2009. By covering a nine-year timeframe, it extends the research initiated by Morrison and Inkpen (1991). In particular, it updates the findings of Kumar and Kundu (2004), which built on earlier research, including that of Morrison and Inkpen (1991) and Inkpen and Beamish (1994). By comparing the current findings with those of two earlier time periods, this study continues the ranking-based research lineage initiated about two decades ago. In measuring and ranking the contributions of individual faculty members, it also extends previous research: the inclusion of total and adjusted scores of faculty members as authors of IB publications provides a more complete picture of scholarly productivity across institutions around the world (Lahiri 2011; Quer et al. 2007; Xu et al. 2008).

In comparing the current findings with those of previous similar research, this study found that the top-ranked institutions in 2001–2009 differ considerably from those of 1991–2000. While the four top institutions of 1991–2000 were the University of Western Ontario, University of South Carolina, University of Texas at Austin, and Michigan State University, those of the current study are the Chinese University of Hong Kong, University of Leeds, Copenhagen Business School, and Rutgers University. This comparison, however, is based on the total count of appearances. Based on adjusted counts, the top-ranked institutions in this study are Copenhagen Business School, University of Miami, University of Leeds, and Chinese University of Hong Kong. One may surmise that these universities are poised to make leading contributions to IB research in the future.

The top authors during the 2001–2009 period, based on total counts, are Yadong Luo, Peter J. Buckley, and Paul W. Beamish, whereas based on adjusted counts the top authors for that period are Yadong Luo, Peter J. Buckley, and Alain Verbeke.

Utmost care should be taken in deriving implications or forming opinions from the institution and author rankings presented in this paper. Some general trends are nevertheless worth mentioning. The rankings clearly show the growing prominence of institutions outside North America, especially in Asia, in publishing IB research. Given the nature of the discipline, such a shift in rankings towards non-North American universities is only natural. With increasing trade and investment across the globe, more nation states and private enterprises are likely to fund IB-related research, which will lead to more research publications. In addition, the success of the US business school model, with its heavy "publish or perish" emphasis on research output, has prompted many European and Asian schools to adopt a broadly similar model. This growing similarity in research requirements and funding support has aided the movement of IB scholars to erstwhile non-conventional schools in Europe, Asia, and Australia. All of these changes are producing a much stronger community of IB schools and scholars across different geographies.

As is the case with ranking studies in general, our rankings have limitations. One limitation is the non-inclusion of other leading IB or IM journals such as Journal of International Management, International Business Review, Journal of International Marketing, and the like. More recently, there has also been an increase in IB- or IM-related articles in general business, management, and strategy journals such as Organization Science, Strategic Management Journal, and Academy of Management Review, among others (Pisani 2009; Trevino et al. 2010), which we do not include in our analyses. This exclusion was deliberate: we wanted to focus on three core IB journals so as to extend previous research and enable a focused comparison of findings across different time periods. We recognize that recent studies (e.g., Trevino et al. 2010) that include a large number of journals to rank IB institutions and scholars criticize the use of only a select number of IB journals in attempting to understand IB productivity. While we appreciate and commend the tremendous effort involved in investigating IB research in non-IB journals, for the purposes of our study we believe a core set of IB journals is sufficient. Including non-IB journals raises several problems: defining appropriate selection criteria for IB research; accommodating the significant differences in background and in the preferred theoretical and methodological orientation of the different journals (differences that go beyond impact factor scores); assessing how those differences influence the ultimate findings; and comparing findings with prior studies. Because our study aimed to compare findings particularly with those of Kumar and Kundu (2004), the sample of three core IB journals was the most appropriate. The other limitation lies in the inability to identify the specific department or college of faculty members. Although some publications mention the specific school with which the authors are affiliated (e.g., Copenhagen Business School), most mention only the name of the respective university. We hope this research will encourage scholars to conduct similar update studies in the future.

Footnotes
1

Faculty members at different campuses of a university were counted separately. For example, an article written by a faculty member affiliated with the University of Texas at Arlington was credited to the University of Texas at Arlington and not to the University of Texas at Dallas. Publications often mention author affiliations in different forms. For example, the affiliation of authors William Newburry and Liuba Belkin is given as Rutgers Business School in Newburry et al. (2006), whereas that of Farok J. Contractor is given as the Department of Management and Global Business, Rutgers University, in Contractor et al. (2005). Similarly, the affiliation of Mike Wright is given as Business School, Nottingham University in Strange et al. (2009), while that of Chengqi Wang is Nottingham University Business School, University of Nottingham in Wang et al. (2009). In this research, no distinction was made between (a) University of Leeds and Leeds University, (b) Rutgers Business School and Rutgers University, (c) University of Nottingham and Nottingham University, (d) Bocconi University and Università Luigi Bocconi, and so on. For consistency, appearances of institutions such as University of London, London Business School, London School of Economics and Political Science, and King's College were classified under one common institution: University of London.

 
2

If a publication mentioned that a particular author was affiliated with more than one institution (e.g., John Cantwell's affiliation with both Rutgers Business School and the University of Reading in Cantwell et al. 2004), then the necessary deductions were made in computing each institution's adjusted appearance.
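The footnote does not specify the exact deduction rule; a minimal sketch, assuming an author's fractional article credit is split evenly across all listed affiliations (placeholder institution names):

```python
def institution_credit(author_fraction, affiliations):
    """Split one author's fractional article credit (e.g., 0.5 for a
    two-author paper) evenly across that author's listed affiliations.
    The even split is an assumption, not the authors' documented rule."""
    share = author_fraction / len(affiliations)
    return {inst: share for inst in affiliations}

# A two-author paper gives each author 0.5; with two listed affiliations,
# each institution would receive 0.25 under an even split.
print(institution_credit(0.5, ["Institution A", "Institution B"]))
```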

 
3

As of December 2010, author Klaus E. Meyer is affiliated with the University of Bath, UK. The affiliations of the other top authors have been mentioned previously.

 

Copyright information

© Gabler Verlag 2011