A new approach to data access and research transparency (DART)

Abstract

Recent debates on transparency and replicability suggest that JIBS needs to update its approach to data access and research transparency (DART). We propose a series of initiatives, knowing full well that there is a balance to be struck. There are clear benefits on the one hand, chief among these the potential for learning and knowledge accumulation, and equally manifest challenges on the other: the imperative to respect privacy, confidentiality, and intellectual property rights. Without addressing these challenges, will there be the high-quality data on which the benefits depend? We present access and transparency objectives, set out how an actionable and effective approach towards DART will be implemented, and address the ethical, legal, and organizational challenges of concern to us as a scholarly community.

INTRODUCTION

The need for access to research data and for transparency of the research process has been discussed extensively across the social sciences (Miguel et al., 2014; Nosek et al., 2015). Many academic journals have revised their policies, recognizing the value of cumulative knowledge creation and providing a good foundation for it by calling for enhanced evidence trails and reanalysis of data. There are clear advantages to having open access to data, but there are also justifiable ethical and legal concerns about data sharing.

We introduce a new Journal of International Business Studies (JIBS) approach for Data Access and Research Transparency (DART).1 The overarching aim of the new approach is to further cumulative knowledge by encouraging the sharing of research data. However, the sharing of data is not always feasible or appropriate. Therefore, we aim for an approach that enhances transparency in an actionable, sensitive, and pragmatic way, while at the same time enabling researchers to pursue a wide range of research methodologies.

We begin by setting out the motivations behind the new approach. We then look at current practices in the social sciences broadly, and in international business and management specifically. We also outline our expectations of the impact of the new approach on the scientific process of knowledge creation, and in tandem the legal, ethical, and organizational ramifications of sharing complete datasets, knowing full well that doing so can sometimes be challenging, or even inadvisable. Our synthesis is intended to help authors make informed decisions about data management. We also discuss how authors can take a stance on DART and communicate it to editors and reviewers. The members of the JIBS editorial team see the new DART approach as a logical extension of the Academy of International Business (AIB) Journals Code of Ethics instituted nearly a decade ago and revised in 2018 (Eden, 2010).2 The new JIBS DART approach has been endorsed by the Executive Board of the AIB.

HOW EDITORIAL POLICY IS CHANGING

Transparency of Research

Transparency is a key attribute of scholarly research (Eden, 2010), from data production to analysis. For quantitative empirical studies – including hypothesis-testing studies – we see full transparency as comprising an easy-to-follow explanation of the way in which data are generated or collected, together with an explanation of the procedures used to reach conclusions. At the far ends of the continuum are “no disclosure” and “full transparency”. Full transparency means the full disclosure of all data, clear access to the codes used to run the analysis, and a full explanation of the way the data have been collected, such that all tables and figures in a published article can be reproduced.

What full transparency means needs to be qualified depending on the type of data used (for qualitative data, see Bluhm et al., 2011; Elman & Kapiszewski, 2014; Monroe, 2018; Pratt, Kaplan, & Whittington, 2020). Different organizations and journals have undertaken different transparency-enhancing initiatives. For instance, the American Economic Association has launched an online registry for randomized controlled trials, and Perspectives on Psychological Science and many other journals offer authors the possibility of peer review of their research design – with approval basically amounting to conditional acceptance – even before data collection (Miguel et al., 2014). Other journals give editors the latitude to ask authors to provide raw output files, or even raw data and associated statistical programming files, which would permit reproducing results. These policies have teeth, as non-compliance is a potential ground for rejection of a manuscript.

Transparency has to do with process. The peer-review process has a long history (Lee et al., 2013), and is aptly described as “a system of institutionalized vigilance” (Merton, 1973: 339). Editors and reviewers expect authors to provide a detailed description of the methodology they use, for example, sampling procedures, data sources, variable measurements, and descriptive statistics. As shown in Table 1, the editors of some journals also request intermediate data-processing documents such as codes or outputs from analytical software, or questionnaires in the original language. Though these materials are rarely published, their very submission as part of the review process enhances the credibility of the published results.

Table 1 Examples of data access and research transparency (DART) policies in business and management journals

Leading by example, some international business scholars voluntarily make available the datasets or indices used in their studies. Dow and Karunaratna (2006) make their data on country-level determinants of psychic distance available on a personal website.3 Berry, Guillen, & Zhou (2010) make available on university websites their data on cross-national distance,4 and Schotter and Beamish (2013) their data on the hassle factor, a multi-item measure of the difficulty of doing business in a country.5 The original individual-level data used by Schwartz (2006) to develop dimensions of national cultural value orientation are available for a small fee from the Israeli Social Science Data Center. These authors share country-level indicators. However, many international business research questions concern firm-, team-, or individual-level constructs. For reasons discussed below, the sharing of such micro-level data is less common.

JIBS Current Editorial Policy

Transparency is addressed in the AIB Journals Code of Ethics (Eden, 2018).6 Items 3.5.2 and 3.5.3 require authors to be fully transparent in the handling of data.

Authors should report their findings fully and should not omit data that are relevant within the context of the research question(s). Results should be reported whether they support or contradict expected outcomes. Authors should take care to present relevant qualifications to their research or to the findings and interpretations of them. Underlying assumptions, theories, methods, measures, and research designs relevant to the findings and interpretations of their work should be disclosed. (item 3.5.2)

The manuscript should contain sufficient detail and references to permit peers with access to the same dataset to repeat the work. (item 3.5.3)

JIBS and most of its peer journals adhere to good research practices, and enforce them in part through the review process. The AIB Journals Code of Ethics does not require data sharing, but emphasizes transparency. Item 3.1.6 addresses the disclosing of previous usage of data.

The manuscript should identify the origin, and originality, of any proprietary, non‐standard datasets used in the paper, for example, a primary dataset created by the Author using a survey. If the proprietary dataset has been used elsewhere by this or another Author, the manuscript should cite these other works, whether published or not. (item 3.1.6)

This item also addresses the use and reuse of materials with only the most subtle of changes (Honig et al., 2018; Kirkman & Chen, 2011), a practice bordering on self-plagiarism, or what Broad (1981) refers to pejoratively as “salami publications”: manuscripts that are strikingly similar to other papers in hypotheses, samples, methodologies, and results – and often in authorship – all the while attempting to avoid textual overlap.7 When a dataset has been used in a previous study, regardless of whether a different theory is used to analyze the data, it must be made clear what value is added beyond earlier published work.

The reuse of data also creates challenges for the efficacy of the double-blind review, which requires authors to remove any information in a manuscript that might allow a reviewer to infer who the author is. Thus multiple uses of the same data can conflict with the requirement to cite earlier work. To address this dilemma, some journals require authors to submit in a separate document, to be seen only by the editor, an explanation of any similarities or differences in key variables between those in the submission and those in earlier published work using the same dataset (see Colquitt, 2013, for data overlap policies at AMJ). For an example of what such a document looks like, see the Journal of Applied Psychology submission guidelines drawn up by the American Psychological Association (APA).8 At present, JIBS does not formally require that such a document be included in the submission package, but the JIBS editors do welcome such information and often request it during the review process.

New DART Requirements

The DART approach we propose builds on the AIB Journals Code of Ethics. We do not call for radical changes. There are pressing legitimacy concerns that need to be addressed now; further down the road, other issues of concern to authors, publishers, editorial boards, universities, and research foundations will be taken up. JIBS DART policies are a work in progress.

We list below the ways in which the JIBS editorial guidelines will be modified. The new approach will be posted on the JIBS website as part of the Statement of Editorial Policy, and will also be made part of the online submission system.

1. JIBS is the official publication of the Academy of International Business. Authors who wish to submit work for consideration are required to comply with the AIB Journals Code of Ethics. As spelled out in clauses 3.1.6, 3.5.2, and 3.5.3, at manuscript submission an author must inform the editor of (1) all prior use of any data, whether in part or in whole, be it by a single author or with any number of co-authors, and (2) how the manuscript makes a contribution over and above any earlier published work in which the data were used – for example, through the use of different theories or methodologies. The required information may be provided in a cover letter for the editor’s eyes only. Failure to provide such information may be grounds for manuscript rejection.

2. It will be normal practice for JIBS handling editors to ask, on their own initiative or at the request of a reviewer, for supplementary information such as research documents and raw output files – although not the actual raw data. An editor might request additional descriptive statistics or robustness tests, the programming code used to run regressions, output files from statistical analysis software, or questionnaires in their original language. Authors will have the option of sharing the data and/or raw output files with only the editor, and only if and when this is requested for a revision. In the rare case that the editor needs to delegate checking the data, a non-anonymous consulting editor could take on this role. Authors should disclose in their paper the original language in which the research was undertaken (this applies specifically to qualitative research), and describe the steps they have taken to maintain its meaning when reporting it in English. Again, failure to comply may be grounds for manuscript rejection. Editors who request and receive access to an author’s raw output files will not use this information in their own work or share it with others.

3. Authors will be required to include in their submission a data sharing “comply or explain” statement. There are a number of legitimate reasons to delay making data fully available, and reasons why full compliance may not be achievable, or even desirable (as explained in detail below). Some reasons are spelled out in the statement, and it is possible to add others:

   a. I will make the data fully available upon acceptance.

   b. I will make the data fully available after an embargo period (maximum 5 years).

   c. I will not make the data fully available due to:

      (i) protection of the personal data of research subjects;

      (ii) third-party property rights;

      (iii) national security;

      (iv) other reasons (please specify).

If an author has reasons to “opt out” other than (i), (ii), or (iii), these will need to be spelled out. Assuming a manuscript is accepted for publication, the author will be asked to confirm the intentions declared in the statement filed at the time of submission. Each article published in JIBS will be followed by an editorial note that informs readers of the extent to which data are available, and of the reasons for less than full availability. Authors who make their data fully available will be asked to store them in a reputable data repository, and to provide information on how to access these data. Information on how to access the data, including their Digital Object Identifier (DOI), will appear below the article, enabling citations to the dataset.

To disseminate the new approach as widely as possible, JIBS will sponsor professional development workshops and other activities at future annual AIB conferences.

What Other Journals Do

JIBS is introducing a new DART approach against the backdrop of growing concerns across the social sciences about the validity of research outcomes. Many of the top social science journals require the authors of articles accepted for publication to provide all data and the associated codes needed to reproduce tables and figures. Leading journals such as the American Economic Review, the American Journal of Political Science, and the American Sociological Review have made full transparency compulsory. The American Political Science Association adopted DART policies in 2012 (Lupia & Elman, 2014; Monroe, 2018). Many other journals require partial transparency. The Journal of Applied Psychology, for example, requires authors to make data available upon request. We provide in the Appendix a detailed overview of the DART policies of these and other journals.

Many of the top journals in economics, sociology, and political science have instituted a “comply or explain” policy similar to the approach we propose above. Currently, most business and management journals require a description of the research process, but not the sharing of data (see Table 1). Neither the Journal of Management (JoM) nor the Strategic Management Journal (SMJ) calls for full transparency. JoM editors may ask for more detail during the review process. At SMJ, data sharing is entirely voluntary, although it is encouraged. The Journal of Marketing has no formal policy, but requires authors to retain the data backing their results for 5 years after paper acceptance.9 The Academy of Management Journal (AMJ), to the best of our knowledge, has no explicit DART policy, although it requires adherence to the Academy of Management Code of Ethics and, along with many other top management journals, is a member of the Committee On Publication Ethics (COPE).10 Finally, Administrative Science Quarterly, as far as we have been able to determine, does not have a DART policy, but does have an extensive Code of Ethics addressing the issue of transparency.

The picture that emerges is that top economics, sociology, psychology, and political science journals are leading the quest for more transparency. Business and management journals are engaged in a catching-up process, but struggle to address the possible side effects of full disclosure. We discuss those below.

WHY WE NEED AN EDITORIAL POLICY CHANGE

Why does the transparency of research data need to be enhanced? Serious concerns about the non-replicability of research findings have come to the fore in economics (e.g., Camerer et al., 2016), political science (e.g., Lupia & Elman, 2014), psychology (e.g., Makel & Plucker, 2017), sociology (e.g., Freese & King, 2018), management (e.g., Aguinis, Ramani, & Alabduljader, 2018; Honig et al., 2018), strategy (e.g., Bettis, Helfat, & Shaver, 2016b), and entrepreneurship (Anderson, Wennberg, & McMullen, 2019). The authors we cite above advance two sets of arguments as to why lack of transparency and replicability should be of utmost concern to scholars. First, the creation of a cumulative body of scientific knowledge is based on continuous learning (Popper, 1962), and this requires the possibility of independent verification of scholarly work through re-testing hypotheses using the same or alternative data and methodologies. Replication as a quality-related criterion is especially relevant for quantitative research, but rarely applicable to qualitative research (Pratt et al., 2020). For qualitative research, the rich reporting of evidence inside the article can assist the reader in verifying quality. Second, stakeholder pressures on science in general, and on the social sciences in particular, imply that transparency is – even more than in the past – expected, and sometimes even legally imposed.

Enabling Cumulative Knowledge and Learning

Scientific knowledge is the result of a cumulative process of generating and connecting data gleaned from individual studies, and replication is an essential part of the cycle (e.g., Aguinis, Cascio & Ramani, 2017; Bettis, Ethiraj, Gambardella, Helfat, & Mitchell, 2016a; Tsang & Kwan, 1999; Walker, Brewer, Lee, Petrovsky, & van Witteloostuijn, 2019). Researchers are only able to build on prior work if they can fully understand the underlying empirical mechanics. Independent researchers with access to all data and codes necessary to rerun statistical analyses can both reproduce the results and extend the analysis. Statistics and robustness tests can bolster validity (cf. Meyer, van Witteloostuijn, & Beugelsdijk, 2017), but empirical support from other researchers is even more powerful, as the external validity of a line of research arises not from standalone studies, but from related ones (Rousseau, Manning, & Denyer, 2009; van Witteloostuijn, 2016). Full transparency is a powerful mechanism to reduce hypothesizing after results are known (HARKing), p-hacking, and other questionable research practices that inflate levels of significance in empirical studies (Meyer et al., 2017; Nielsen, Eden, & Verbeke, 2020).

A lack of replication studies is a substantive concern in many fields, including business and management (see, e.g., Bettis et al., 2016b; Honig et al., 2018; van Witteloostuijn, 2016). We need to be able to assess the validity of past research to push forward with new research. In fact, attempts at replication often fail to confirm widely accepted findings (Chang & Li, 2018; Duvendack, Palmer-Jones, & Reed, 2017): when Aarts et al. (2015) looked at the replication of 100 studies published in 2008 in highly ranked psychology journals, they uncovered a confirmation rate of only about 39%. Similarly, Camerer et al. (2018) find a significant effect in the same direction as the original study for 62% of the studies they examine, with replication effect sizes averaging about 50% of the original effect sizes. Credibility of research results is fundamental to building a cumulative stock of knowledge, and by extension to developing theories that explain and predict phenomena that matter to individuals, firms, and governments.

Replication studies have recently received more attention, notably in psychology (e.g., Nosek et al., 2015). The Strategic Management Journal published a special issue on replication in 2016 to emphasize a shift in editorial policy in favor of replication studies (Bettis, Helfat, & Shaver, 2016b). Lack of transparency also inhibits meta-analyses, which require clarity on how the original data have been handled and analyzed, including correlation matrices and descriptive statistics (although not the datasets themselves). Yet many published articles do not include such basic data information and so cannot be included in meta-analyses. The DART policies we propose should facilitate such studies.

Still another benefit of full transparency is that datasets can be used in training PhD students and junior researchers. Replication of high-quality studies can showcase best practice methodologies that aspiring researchers should emulate. Indeed, regardless of how experienced we are, we can all learn from such studies, and in the end the quality of manuscripts submitted – to JIBS and other journals – will be the better for it.

Legitimacy of Scholarship in Society

A significant part of contemporary academic research is at least partly funded by taxpayers. The academic community has an obligation to stand behind its “product”. The funders of research, public or private, want to know if their money has been well spent – if the research yielded trustworthy results and has practical implications. If research methods are flawed, the policy recommendations based on them will be too. Intellectual property is another issue. Some argue that the results of publicly funded research should be public property, and that governments should impose rules and regulations to make sure they are shared (Tenopir et al., 2011).

What about us? Despite the popular notion of the absent-minded, unselfish academic, we live in the real world like everyone else. There are conflicting demands on our time. We grapple with institutional and academic demands, and answer to the universities where we work. We have obligations toward students and fellow scholars, journal editorial boards and reviewers, journalists and politicians, not to forget practicing managers (Rynes, Colbert, & O’Boyle, 2018; Wiklund, Wright, & Zahra, 2018). Yes, there is pressure. The overwhelming majority manage to handle it. Some do not. There have been scandals, such as the manipulation of results by US food and nutrition researcher Brian Wansink (New York Times, 2018), and the fabrication of fake datasets by Dutch psychologist Diederik Stapel (New York Times, 2011). Papers by Ulrich Lichtenthaler, James E. Hunton, and David de Geest have been retracted by major business and management journals, inter alia because they exaggerated effect sizes or levels of significance.11 An overview of these cases can be found on www.retractionwatch.com and its associated searchable database www.retractiondatabase.org. JIBS has not been pulled into any such scandal. That does not mean that it, or any other journal, should feel unconcerned. The issue must be confronted.

These legitimacy and transparency pressures are not uniformly spread across the globe, but are contingent on the institutional setting in which scholars operate (cf. DiMaggio & Powell, 1983). They arise from evolving country-specific norms as reflected in the guidelines of major research agencies. US, Canadian, British, Chinese, and Dutch scholars are responsible for more than 90 percent of all the articles that were published in JIBS between 1970 and 2016 (Verbeke & Calma, 2017: Table 2), so we summarize in Table 2 the DART policies of the major research agencies of those countries and of the European Union.12

Table 2 Data transparency policies of research sponsors

The US National Science Foundation (US-NSF) and the European Research Council (ERC) of the European Union expect authors to make data available, but allow them to opt out. The German Research Foundation (DFG) and the Japan Society for the Promotion of Science (JSPS) also allow opting out if done on the grounds of data protection rights or copyright. The Economic and Social Research Council (ESRC) of Britain and the Netherlands Organisation for Scientific Research (NWO) require that all data be made available. The Social Sciences and Humanities Research Council of Canada (SSHRC) and the National Natural Science Foundation of China (C-NSF) do not explicitly take a position on making data available, but rather call on authors to follow the standards set in their fields. The extent to which such funding agencies actually enforce what they publish on their websites is hard to assess, but they do have different policies, which shows that DART requirements vary across the world.

The JIBS approach towards DART must be adaptable if it is to withstand differing and ever-changing environments. In the next section, we describe various legal, ethical, and organizational challenges posed by DART.

FULL DATA TRANSPARENCY CHALLENGES

Legal and Ethical Concerns

Full data transparency is an ideal of positive science, but as we acknowledge above, it is not always achievable, or even desirable. In particular, high-quality datasets need to be created in the first place, so we need to look at what promotes – and what may undermine – that process. Three stumbling blocks come up again and again when IB scholars discuss DART (see Lupia & Alter, 2014, for similar observations in political science): (i) legal constraints, (ii) development sunk costs, and (iii) privacy and confidentiality.13

First, some datasets are created by researchers themselves, whereas others are in the public domain, and still others are licensed. The owners of licensed datasets set the terms of use, which can range from quite liberal to not allowing any sharing or disclosure of the original data. Data ownership is protected by intellectual property laws, meaning that the owners have the last word.

Next, the creation of a dataset is a laborious and time-consuming task. Some datasets are the joint effort of many people in large organizations; sometimes a single researcher or a small group may devote years to the effort. A researcher may hope to recoup these sunk costs by publishing several papers using the dataset, knowing full well that the realization of a stream of research takes considerable time – the time to write papers, of course, but also the time it takes for them to go through the publishing process, involving revisions and possibly rejections and further revisions. The investment is large, and it is risky. There is no guarantee that competing papers will not be published in the meantime. Researchers therefore see the creation of a dataset as an investment in their career, and they are likely to see a policy that calls for early sharing of data as detrimental to that career. There would be nothing to stop others from using the fruits of their labor to write competing articles. Indeed, a database developer might argue that full transparency beginning with the first paper actually increases the likelihood of competing papers. If DART were to undermine the hand-collection of data, this could hardly be called scientific progress.

Moreover, ethical considerations compel researchers to ensure the privacy of those from whom they obtain data, and rightfully so. In particular, they may promise confidentiality to would-be survey respondents or to interviewees to obtain informed consent and to ensure unbiased responses. The obligation to protect vulnerable subjects is universally accepted, and new and strict privacy laws, like those in the EU, explicitly impose this obligation upon the academic community. Monroe (2018) discusses the ethical ramifications of DART policies in political science, in particular the importance of confidentiality when working with refugees and with individuals subject to repressive regimes. Management researchers face similar ethical issues. The wide-ranging implications of sharing data include, to name just two, the possibility of repercussions for interviewees who divulge organizational information and the potential for negative responses from the “accused” organizations and individuals when revelations of unethical behavior are made public.

The legal constraints, sunk costs, and confidentiality concerns we describe above are each associated in a different way with data commonly used in international business research. A lack of established processes and institutional support for data management is a shared concern of all scholars, regardless of the types of data with which we work. Below, we address these organizational challenges for different types of data commonly used in IB research.

1. Data downloaded from publicly available databases: Country-level data may be available from public databases such as those of the World Bank and the International Monetary Fund. In such cases, there are no substantive barriers to making data available, and a precise reference in the bibliography often suffices. The version number and date of retrieval must be specified, because databases of this kind are frequently updated.

2. Data hand-collected and coded (including coded archival data and original survey data): Time, effort, and financial resources go into original data collection. Even in the case of publicly available data, selecting and entering data can be arduous if what is needed is not available in pre-processed electronic form. This is especially the case for primary data, whether based on the coding of documents (e.g., company annual reports) or generated by questionnaire survey. Collecting data incurs sunk costs, which researchers usually want to amortize through a series of related research projects. They therefore need mechanisms to prevent others from pre-empting them. There is no doubt that a first-mover advantage is critical, because journals generally seek novelty and show little interest in publishing a second paper with the same analysis, even from the author who compiled the original dataset.

   In addition, scholars often need to protect participating research subjects and make confidentiality promises when collecting survey data, not only to secure satisfactory response rates and unbiased responses, but also to satisfy ethics committees and the law. Similar confidentiality challenges arise in handling data whose use has been expressly authorized, such as firm archival information, turnover and profit data, and employees’ personal demographic information. The need for transparency must therefore be balanced with ethical demands for confidentiality. Hence the best solution may be not to require the sharing of original field data, but to ask for sufficiently detailed documentation and – if feasible – for an anonymized version of the dataset (see the sketch following this list), to allow scrutiny of results by the acting editor or critical reviewers.

3. Data used under license from a commercial data provider: Many studies in finance, economics, and strategy use commercial databases such as those of Bureau van Dijk (e.g., Estrin, Meyer, & Pelletier, 2018; Useche, Miguelez, & Lissoni, 2019), Toyo Keizai (e.g., Delios & Beamish, 1999; Stallkamp et al., 2018), and WIND (a database of Chinese firms used by, e.g., Liang, Ren, & Sun, 2015), to mention a few. In such cases, the copyright is held by a privately owned organization that provides access to its data by subscription to authors’ universities. Sharing the data with other scholars would violate that copyright and the contract under which the researchers or their university accessed the data. This said, while providing public access to the original data is not legally possible, scholars are usually not prohibited from sharing intermediate outputs related to their data processing, including downloading routines, descriptive statistics, software codes for regression analysis, and software output. The inclusion of descriptive statistics in a paper is established practice, but sharing other intermediate outputs of the research process is not yet common in our field. One of the main challenges with these databases is that many scholars who use them have limited substantive knowledge of the firms they cover, which places an extra burden on editors and reviewers to ascertain that the results of hypothesis testing represent more than statistical noise.

4. Data available only on a remote computer: Some data providers do not allow datasets to be removed from their premises but permit researchers to analyze data on the organization’s computers, while others have researchers submit their codes so that in-house staff can analyze the data for them. This is done to preserve the confidentiality of the data. Providers who permit dataset access with these kinds of usage restrictions include the US Bureau of Economic Analysis (BEA), whose data on, inter alia, the foreign operations of US multinational firms have been used by Berry & Kaul (2015) and by Feinberg & Gupta (2004). Other examples are a number of national banks, and national and pan-European statistical offices – e.g., the publishers of the Community Innovation Survey used by Laursen & Salter (2006) and Schubert, Baier, and Rammer (2018), and the German IWH Research Institute, whose survey data on foreign affiliates have been used by Beugelsdijk & Jindra (2018) and Santangelo, Meyer, & Jindra (2016). In these cases, researchers cannot share the raw data, because they themselves do not have access to them.

5. Data generated by lab experiments: The number of studies in JIBS based on experimental designs has increased in recent years. Examples include Buckley, Devinney, & Louviere’s (2007) study of location choices by managers, Magnusson, Westjohn, & Sirianni’s (2018) experiment on stereotyping, and Allred, Findley, Nielson, & Sharman’s (2017) field experiment on shell companies and auditing. The established standards for experimental studies have changed rapidly, and the JIBS editorial team endorses new standards established in neighboring fields, specifically pre-registration (Nosek et al., 2018), but does not impose them at this time. Allred, Findley, Nielson, & Sharman (2017: 599) write in their JIBS article that they pre-registered their study at www.egap.org.

6. Data generated through qualitative research (e.g., interviews or observations): Scholars conducting qualitative research in particular experience significant tensions between the quest for transparency and their ethical obligations vis-à-vis research subjects (Pratt et al., 2020). Unlike much quantitative research using secondary data, qualitative field research is intrinsically built on the notion of confidentiality. In business and management, data collection is often done via interviews and other forms of direct interaction with employees, managers, and other firm stakeholders. A subject’s full confidence in the researcher is integral to data reliability. Should a policy be imposed that requires all data to be made available, there is a real risk of response and selection biases in interviews, surveys, participant-observation sessions, and critical discourse analyses (e.g., Monroe, 2018; Pratt et al., 2020). The number of voluntary participants would also, in all likelihood, decline. In the end, it might make this kind of research impossible.
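To make concrete what an anonymized version of a field dataset (see point 2 above) might involve, consider the following minimal sketch in Python. The file and column names are hypothetical placeholders, and real anonymization would have to be tailored to the re-identification risks of the specific data:

```python
# anonymize.py - produce a shareable version of a hypothetical survey dataset.
import hashlib

import pandas as pd

survey = pd.read_csv("interview_survey.csv")  # hypothetical field data

# 1. Drop direct identifiers outright (names, emails, firm names).
anonymized = survey.drop(columns=["respondent_name", "email", "firm_name"])

# 2. Replace the respondent ID with a salted one-way hash, so repeated
#    observations can still be linked without revealing identities.
SALT = "project-specific-secret"  # kept by the research team, never shared
anonymized["respondent_id"] = [
    hashlib.sha256((SALT + str(rid)).encode()).hexdigest()[:12]
    for rid in survey["respondent_id"]
]

# 3. Coarsen quasi-identifiers that could enable re-identification:
#    here, exact employee counts become size bands.
anonymized["firm_size_band"] = pd.cut(
    survey["firm_employees"],
    bins=[0, 50, 250, 1000, float("inf")],
    labels=["micro/small", "medium", "large", "very large"],
)
anonymized = anonymized.drop(columns=["firm_employees"])

anonymized.to_csv("survey_anonymized.csv", index=False)
```

Even a simple script like this, shared alongside documentation, allows an editor to scrutinize the structure of the analysis without the shared file being traceable back to individual respondents.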

Research institutions scrutinize plans for collecting data from human subjects, and ethics review boards are charged with approving research protocols before data are collected. This practice has evolved rapidly in recent years, and is now mandated in many countries by funding agencies or by law. At the same time, the 2018 changes in EU data protection and privacy law compel European universities to develop data management plans, under which institutional ethics committees normally require researchers to specify in detail the kinds of data to be collected, and where and how they will be obtained, analyzed, and stored. Such a plan typically includes a description of interview protocols, as well as how informants are to be recruited, a description of possible risks and benefits to them, the procedures that will safeguard their privacy and confidentiality, the consent forms they will be asked to sign, and details of any compensation to be provided. The plan should also describe data retention practices.

Researchers working with qualitative field data thus face distinct challenges and should adopt best practices with respect to establishing the trustworthiness of their findings and the cumulative creation of knowledge (Pratt et al., 2020). As a basic requirement, scholars in the qualitative research tradition need to organize and describe the research process they have followed in a way that allows others to determine how conclusions have been derived (Cuervo-Cazurra et al., 2016). However, qualitative research itself follows a variety of different research ontologies and processes. In IB, case studies have traditionally been most common, but other forms of qualitative work are gaining ground (Birkinshaw, Brannen, & Tung, 2011; Welch, Piekkari, Plakoyiannaki, & Paavilainen-Mäntymäki, 2011), e.g., ethnographic studies such as those by Moore (2011) and Westney & Van Maanen (2011). Authors who wish to submit to JIBS a manuscript with qualitative field data should normally have their data collection and analysis procedures approved by the ethics board of their institution, and they should explain their data management decisions in the comply-or-explain statement.

We think that international business and management research thrives in good part on its diversity of data and methodologies. At the same time, methodological diversity makes it challenging to develop an all-encompassing DART policy. We do not want policies that might stymie the creation of data, or that would discourage researchers from using a particular methodology or from addressing a particular topic.

Organizational Challenges

To implement a successful DART policy, we need practical solutions to the issues we have discussed.14 Full transparency requires user-friendly datasets in permanent repositories (Alvarez, Key, & Núñez, 2018), preferably with a Digital Object Identifier (DOI) assigned to the data. After all, datasets need to be found before they can be used. Authors need to select meaningful filenames, provide data in standard formats, and include variable definitions and codebooks. Storage, too, needs to be addressed. Traditionally, academic data have been stored on personal websites or on university servers, as illustrated by the examples given earlier, but neither is a fully satisfactory long-term solution: there are many reasons why a personal website might not be maintained, and a change in a researcher’s university affiliation might also result in a loss of continuity.

Every published article is identified and linked to the web by a DOI. Similarly, a DOI can provide a stable and permanent link to data. Just as a journal article DOI is included in the formal bibliography, so should the data DOI be – and not just among the acknowledgements or in an endnote. This benefits researchers who develop a dataset, since articles associated by a bibliographic link with publicly available data receive more citations (Drachen & Ellegaard, 2016). Other researchers may then cite the dataset separately from the article, acknowledging a dual contribution. Datasets should be cited as in the following example, using data from the late Raymond Vernon: Multinational Enterprise Project (published in 2015), https://doi.org/10.7910/DVN/27846, Harvard Dataverse, V5. In addition to the author(s), year, database name, and DOI, the citation should specify the provider of access to the data and its metadata (in this example, Harvard Dataverse) and the data version (especially in the case of data subject to change).
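Because a dataset DOI resolves just like an article DOI, citation metadata can even be retrieved programmatically. The following Python sketch – an illustration only, assuming network access and relying on the standard DOI content-negotiation protocol supported by DataCite registrars – fetches structured metadata for the Vernon dataset cited above:

```python
import requests

# DOI of the Raymond Vernon "Multinational Enterprise Project" dataset
# on Harvard Dataverse, as cited in the text.
DOI = "10.7910/DVN/27846"

# DOI content negotiation: requesting CSL JSON from the resolver returns
# structured citation metadata instead of the repository landing page.
response = requests.get(
    f"https://doi.org/{DOI}",
    headers={"Accept": "application/vnd.citationstyles.csl+json"},
    timeout=30,
)
response.raise_for_status()
meta = response.json()

# The fields a proper data citation should carry: author(s), year, title,
# access provider, version, and the DOI itself. Exact field availability
# depends on the metadata the repository registered when minting the DOI.
for field in ("author", "issued", "title", "publisher", "version", "DOI"):
    print(field, "->", meta.get(field))
```

This is why depositing data in a DOI-granting repository, as required above, makes datasets citable – and machine-discoverable – in the same way as articles.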

Many storage facilities provide DOIs (see https://repositoryfinder.datacite.org for an overview of 570 repositories in the humanities and social sciences as of August 1, 2019), but they are not all equally reliable. Elsevier uses its own system (www.elsevier.com/solutions/mendeley), while other journals, and individuals as well, have an alliance with the Dryad Digital Repository (https://datadryad.org) or the Harvard Dataverse housed at the Harvard Institute for Quantitative Social Science (https://dataverse.org).

Web of Science, generally known as the source of journal impact factors, launched a Data Citation Index in 2012 to provide a single point of access to research data in repositories across disciplines and around the world. It recommends that authors rely on a data repository commonly used in their discipline; for business and management scholars publishing in a variety of business and management journals, it may be difficult to decide which one to use.15 Web of Science therefore also recommends that authors follow the recommendations of the journal in which they are publishing (Clarivate Analytics Web of Science, 2017). The Web of Science Data Citation Index provides a list of 21 storage providers that are considered reliable, including the previously mentioned Dryad and Harvard Dataverse, but also Peking University’s Open Research Data Platform and the UK Data Archive.

In addition to raw data, some authors share a code file that documents the entire process from the raw data, through all intermediate steps – transformation of the data, treatment of outliers, alternative estimation techniques – to the final regression results. Such code sharing, together with a clear and detailed description of the statistical software used (as we set out above), helps with replication. The more clarity provided, the more effective DART policies will be.
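To illustrate, such a replication code file might look like the following deliberately simplified Python sketch (using pandas and statsmodels); the file names, variables, and outlier rule are hypothetical placeholders rather than prescriptions:

```python
# replication.py - the full pipeline from raw data to the reported tables.
# All file and variable names below are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

# Step 1: load the raw data exactly as obtained from the provider.
raw = pd.read_csv("raw_firm_data.csv")

# Step 2: document every transformation. Here: drop incomplete rows and
# winsorize the dependent variable at the 1st/99th percentiles as an
# illustrative treatment of outliers.
df = raw.dropna(subset=["roa", "firm_size", "host_distance"]).copy()
lo, hi = df["roa"].quantile([0.01, 0.99])
df["roa"] = df["roa"].clip(lower=lo, upper=hi)

# Step 3: estimation - the main specification plus one alternative with
# heteroskedasticity-robust standard errors, so readers can rerun both.
main = smf.ols("roa ~ firm_size + host_distance", data=df).fit()
robust = smf.ols("roa ~ firm_size + host_distance", data=df).fit(cov_type="HC1")

# Step 4: write out the results exactly as they appear in the paper.
with open("table2_main_results.txt", "w") as f:
    f.write(main.summary().as_text())
with open("table3_robustness.txt", "w") as f:
    f.write(robust.summary().as_text())
```

Sharing such a file alongside the data – or, for licensed data, by itself – lets readers retrace every step from raw input to published table.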

DATA MANAGEMENT STEWARDSHIP

Data access and research transparency expectations in social science are rapidly evolving. There have been advances in infrastructure – the establishment of data repositories – and new protocols and standards on how data should be stored in them. Science foundations around the world have implemented a variety of DART requirements, and many initiatives at the national and supra-national level aim to provide individual researchers with the necessary tools and information to respond to the new opportunities. We have seen flagship journals in economics, sociology, psychology, and political science put in place full access requirements. Journals in business and management are catching up quickly in initiating and implementing DART policies. This editorial outlines a route forward for JIBS.

International business is multidisciplinary, characterized by a variety of research questions, epistemologies, and methodologies. This is reflected in the types of data used, including experimental, survey, interview, and archival data (both copyrighted and publicly available) in quantitative and qualitative form. This adds to the complexity of developing a DART policy. The new approach outlined in this editorial aims to enhance transparency while being flexible and method-sensitive in its implementation.

Notes

1. The DART acronym was coined by a group of political scientists associated with the American Political Science Association. For more information, see https://www.dartstatement.org/ and Lupia and Elman (2014).

2. https://documents.aib.msu.edu/policies/AIB-Journals-Code-of-Ethics-20180209.pdf.

3. See http://dow.net.au/?page_id=29, accessed 31 October 2019.

4. See https://lauder.wharton.upenn.edu/resources-publications, accessed 31 October 2019.

5. See https://www.ivey.uwo.ca/internationalbusiness/research/hasslefactor/, accessed 31 October 2019.

6. Available at https://resource-cms.springernature.com/springer-cms/rest/v1/content/13353644/data/v4, accessed 31 October 2019.

7. The Academy of Management discusses the fine slicing of data in a series of Ethics of Research and Publishing videos; see the Ethicist blog at https://aom.org/ethics.

8. https://www.apa.org/pubs/journals/apl/data-transparency-appendix-example.aspx, accessed 24 January 2019.

9. Based on personal communication with Neil Morgan, editor of the Journal of Marketing.

10. https://aom.org/About-AOM/AOM-Code-of-Ethics.aspx.

11. A retraction by Schminke and Ambrose from Management and Organization Review because of duplication was used by the editors and authors as an opportunity to discuss how this could have happened. The case now serves as a useful illustration of how transparency can be used as a learning opportunity (see Tsui & Lewin, 2014, for the discussion).

12. Many of these agencies participate in the Global Research Council (GRC), a virtual organization comprising the heads of science and engineering funding agencies from around the world, including the Americas, Europe, Asia–Pacific, Sub-Saharan Africa, and the Middle East and Northern Africa. The GRC was established in 2012 on the initiative of the United States National Science Foundation. It endorsed an action plan regarding open access research in 2013, stating that open access needs to be stimulated while taking institutional and national differences into account. See https://www.globalresearchcouncil.org/fileadmin//documents/GRC_Publications/grc_action_plan_open_access_FINAL.pdf, accessed 19 January 2019.

13. We introduced the discussion on data access and transparency at the JIBS editorial review board meeting at the Academy of International Business conference in Minneapolis-St. Paul in 2018.

14. In addition to the organizational aspects, there are legal issues that we do not discuss here, because our discussions with specialists have not given us clear guidance on the legal aspects of DART. The core legal issue concerns the physical storage of the data and the legal implications of the location of the server associated with the data repository. For example, under the US Patriot Act, the US government has access to all data stored on a server located inside the US, even if a non-American (e.g., Chinese) author has used, say, a Dutch repository whose servers are located in the United States.

15. One institutionalized source of trust in data repositories is provided by the CoreTrustSeal, a community-based non-profit organization providing certification of data repositories and operating under Dutch law (see https://www.coretrustseal.org).

REFERENCES

1. Aarts, A. A., et al. 2015. Estimating the reproducibility of psychological science. Science, 349(6251). https://doi.org/10.1126/science.aac4716.

2. Aguinis, H., Cascio, W. F., & Ramani, R. S. 2017. Science’s reproducibility and replicability crisis: International business is not immune. Journal of International Business Studies, 48: 653–663.

3. Aguinis, H., Ramani, R. S., & Alabduljader, N. 2018. What you see is what you get? Enhancing methodological transparency in management research. Academy of Management Annals, 12: 83–110.

4. Allred, B. B., Findley, M. G., Nielson, D., & Sharman, J. C. 2017. Anonymous shell companies: A global audit study and field experiment in 176 countries. Journal of International Business Studies, 48(5): 596–619.

5. Alvarez, R. M., Key, E. M., & Núñez, L. 2018. Research replications: Practical considerations. Political Science and Politics, 51(2): 422–426.

6. Anderson, B. S., Wennberg, K., & McMullen, J. S. 2019. Enhancing quantitative theory-testing entrepreneurship research. Journal of Business Venturing, 34(5): 105928.

7. Berry, H., Guillen, M., & Zhou, N. 2010. An institutional approach to cross-national distance. Journal of International Business Studies, 41: 1460–1480.

8. Berry, H., & Kaul, A. 2015. Global sourcing and foreign knowledge seeking. Management Science, 61(5): 1052–1071.

9. Bettis, R. A., Ethiraj, S., Gambardella, A., Helfat, C. E., & Mitchell, W. 2016a. Creating repeatable cumulative knowledge in strategic management. Strategic Management Journal, 37(2): 257–261.

10. Bettis, R. A., Helfat, C. E., & Shaver, M. J. 2016b. Special issue: Replication in strategic management. Strategic Management Journal, 37(11): 2191–2388.

11. Beugelsdijk, S., & Jindra, B. 2018. Product innovation and decision-making autonomy in subsidiaries of multinational enterprises. Journal of World Business, 53: 529–539.

12. Birkinshaw, J., Brannen, M. Y., & Tung, R. L. 2011. From a distance and generalizable to up close and grounded: Reclaiming a place for qualitative methods in international business research. Journal of International Business Studies, 42(5): 573–581.

13. Bluhm, D. J., Harman, W., Lee, T. W., & Mitchell, T. R. 2011. Qualitative research in management: A decade of progress. Journal of Management Studies, 48(8): 1866–1891.

14. Broad, W. J. 1981. The publishing game: Getting more for less. Science, 211(4487): 1137–1139.

15. Buckley, P., Devinney, T., & Louviere, J. J. 2007. Do managers behave the way theory suggests? A choice-theoretic examination of foreign direct investment location decision making. Journal of International Business Studies, 38(7): 1069–1094.

16. Camerer, C. F., et al. 2016. Evaluating replicability of laboratory experiments in economics. Science, 351(6280): 1433–1436.

17. Camerer, C. F., et al. 2018. Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour, 2: 637–644.

18. Chang, A. C., & Li, P. 2018. Is economics research replicable? Sixty published papers from thirteen journals say “often not”. Critical Finance Review. https://doi.org/10.1561/104.00000053.

19. Clarivate Analytics Web of Science. 2017. Recommended practices to promote scholarly data citation and tracking. White paper. Accessed 1 December 2018.

20. Colquitt, J. A. 2013. Data overlap policies at AMJ. Academy of Management Journal, 56(2): 331–333.

21. Cuervo-Cazurra, A., Andersson, U., Brannen, M.-Y., Nielsen, B. B., & Reuber, A. R. 2016. Can I trust your findings? Ruling out alternative explanations in international business research. Journal of International Business Studies, 47(8): 881–897.

22. Delios, A., & Beamish, P. W. 1999. Ownership strategies of Japanese firms: Transactional, institutional, and experience influences. Strategic Management Journal, 20: 915–933.

23. DiMaggio, P. J., & Powell, W. W. 1983. The iron cage revisited: Institutional isomorphism and collective rationality in organizational fields. American Sociological Review, 48(2): 147–160.

24. Dow, D., & Karunaratna, A. 2006. Developing a multidimensional instrument to measure psychic distance stimuli. Journal of International Business Studies, 37(5): 578–602.

25. Drachen, T. M., & Ellegaard, O. 2016. Sharing data increases citations. LIBER Quarterly: Journal of the Association of European Research Libraries, 26(2): 67–82.

26. Duvendack, M., Palmer-Jones, R., & Reed, W. R. 2017. What is meant by “replication” and why does it encounter resistance in economics? American Economic Review, 107(5): 46–51.

27. Eden, L. 2010. Letter from the editor-in-chief: Scientists behaving badly. Journal of International Business Studies, 41(4): 561–566.

28. Eden, L. 2018. Academy of International Business Code of Ethics; updated version of the journals code of ethics first published in 2010. https://aib.msu.edu/ethics/.

29. Elman, C., & Kapiszewski, D. 2014. Data access and research transparency in the qualitative tradition. Political Science and Politics, 47(1): 43–47.

30. Estrin, S., Meyer, K. E., & Pelletier, A. 2018. Emerging economy MNEs: How does home country munificence matter? Journal of World Business, 53(4): 514–528.

31. Feinberg, S., & Gupta, A. K. 2004. Knowledge spillovers and the assignment of R&D responsibilities to foreign subsidiaries. Strategic Management Journal, 25: 823–845.

32. Freese, J., & King, M. M. 2018. Institutionalizing transparency. Socius: Sociological Research for a Dynamic World, 4: 1–7.

33. Honig, B., et al. 2018. Reflections on scientific misconduct in management: Unfortunate incidents or a normative crisis? Academy of Management Perspectives, 32(4): 412–442.

34. Kirkman, B. L., & Chen, G. 2011. Maximizing your data or data slicing? Recommendations for managing multiple submissions from the same dataset. Management and Organization Review, 7(3): 433–446.

35. Laursen, K., & Salter, A. 2006. Open for innovation: The role of openness in explaining innovation performance among UK manufacturing firms. Strategic Management Journal, 27: 131–150.

36. Lee, C. J., Sugimoto, C. R., Zhang, G., & Cronin, B. 2013. Bias in peer review. Journal of the American Society for Information Science and Technology, 64(1): 2–17.

37. Liang, H., Ren, B., & Sun, S. L. 2015. An anatomy of state control in the globalization of state-owned enterprises. Journal of International Business Studies, 46(2): 223–240.

38. Lupia, A., & Alter, G. 2014. Data access and research transparency in the quantitative tradition. Political Science and Politics, 47(1): 54–59.

39. Lupia, A., & Elman, C. 2014. Openness in political science: Data access and research transparency: Introduction. PS: Political Science & Politics, 47(1): 19–42. https://doi.org/10.1017/s1049096513001716.

40. Magnusson, P., Westjohn, S. A., & Sirianni, N. J. 2018. Beyond country image favorability: How brand positioning via country personality stereotypes enhances brand evaluations. Journal of International Business Studies, 50(3): 318–338.

41. Makel, M. C., & Plucker, J. A. (Eds.). 2017. Toward a more perfect psychology: Improving trust, accuracy, and transparency in research. Washington, DC: American Psychological Association.

42. Merton, R. K. 1973. The sociology of science: Theoretical and empirical investigations. Chicago: University of Chicago Press.

43. Meyer, K., van Witteloostuijn, A., & Beugelsdijk, S. 2017. What’s in a p? Reassessing best practices for conducting and reporting hypothesis-testing research. Journal of International Business Studies, 48(5): 535–551.

44. Miguel, E., et al. 2014. Promoting transparency in social science research. Science, 343: 30–31.

45. Monroe, K. R. 2018. The rush to transparency: DA-RT and the potential dangers for qualitative research. Perspectives on Politics, 16(1): 141–148.

46. Moore, F. 2011. Holistic ethnography: Studying the impact of multiple national identities on post-acquisition organizations. Journal of International Business Studies, 42(5): 654–671.

47. New York Times. 2011. Fraud case seen as a red flag for psychology research. November 2. http://www.nytimes.com/2011/11/03/health/research/noted-dutch-psychologist-stapel-accused-of-research-fraud.html?_r=1&ref=research. Accessed 15 January 2017.

48. New York Times. 2018. More evidence that nutrition studies don’t always add up. September 29. https://www.nytimes.com/2018/09/29/sunday-review/cornell-food-scientist-wansink-misconduct.html. Accessed 2 December 2018.

49. Nielsen, B., Eden, L., & Verbeke, A. 2020. Research methods in international business: Challenges and advances. In B. Nielsen, L. Eden, & A. Verbeke (Eds.), Research methods in international business, Vol. 7: 3–41. London: JIBS Special Collections.

50. Nosek, B. A., et al. 2015. Promoting an open research culture. Science, 348(6242): 1422–1425.

51. Nosek, B., Ebersole, C. R., DeHaven, A. C., & Mellor, D. T. 2018. The preregistration revolution. Proceedings of the National Academy of Sciences of the United States of America, 115(11): 2600–2606.

52. Popper, K. 1962. Conjectures and refutations: The growth of scientific knowledge. New York: Basic Books.

53. Pratt, M. G., Kaplan, S., & Whittington, R. 2020. Editorial essay: The tumult over transparency: Decoupling transparency from replication in establishing trustworthy qualitative research. Administrative Science Quarterly, 65(1): 1–19.

54. Rousseau, D. M., Manning, J., & Denyer, D. 2009. Evidence in management and organizational science: Assembling the field’s full weight of scientific knowledge through syntheses. Academy of Management Annals, 2(1): 475–515.

55. Rynes, S. L., Colbert, A. E., & O’Boyle, E. H. 2018. When the ‘best available evidence’ doesn’t win: How doubts about science and scientists threaten the future of evidence-based management. Journal of Management, 44(8): 2995–3010.

    Google Scholar 

  56. Santangelo, G. D., Meyer, K. E., & Jindra, B. 2016. MNE subsidiaries’ outsourcing and insourcing of R&D: The role of local institutions. Global Strategy Journal, 6: 247–268.

    Google Scholar 

  57. Schubert, T., Baier, E., & Rammer, C. 2018. Firm capabilities, technological dynamism and the internationalization of innovation: A behavioral approach. Journal of International Business Studies, 49: 70–95.

    Google Scholar 

  58. Schotter, A., & Beamish, P. W. (2013). The Hassle Factor: An Explanation for Managerial Location Shunning. Journal of International Business Studies, 44(5): 521–544.

    Google Scholar 

  59. Schwartz, S. H. 2006. A theory of cultural value orientations: Explication and applications. Comparative Sociology, 5: 137–182.

    Google Scholar 

  60. Stallkamp, M., Pinkham, B. C., Schotter, A. P. J., & Buchel, O. 2018. Core or periphery? The effects of country-of-origin agglomerations on the within country expansion of MNEs. Journal of International Business Studies, 49: 942–966.

    Google Scholar 

  61. Tsang, E. W., & Kwan, K. (1999). Replication and theory development in organizational science: A critical realist perspective. Academy of Management Review, 24(4): 759–780.

    Google Scholar 

  62. Tenopir, C., et al. 2011. Data sharing by scientists: practices and perceptions. PlosOne,6(6): e21101. https://doi.org/10.1371/journal.pone.0021101.

    Article  Google Scholar 

  63. Tsui, A. S., & Lewin, A. Y. 2014. Retraction statement for “Ethics and Integrity of the Publishing process: Myths, facts and a Roadmap” by Marshall Schminke and Maureen L. Ambrose. Management and Organization Review, 10(1): 157–162.

    Google Scholar 

  64. Useche, D., Miguelez, E., & Lissoni, F. 2019. Highly skilled and well connected: Migrant inventors in cross-border M&As. Journal of International Business Studies. https://doi.org/10.1057/s41267-018-0203-3.

    Article  Google Scholar 

  65. van Witteloostuijn, A. 2016. What happened to Popperian falsification? Publishing neutral and negative findings: Moving away from biased publication practices. Cross-Cultural and Strategic Management, 23: 481–508. (formerly known as Cross-Cultural Management).

    Google Scholar 

  66. Verbeke, A., & Calma, A. 2017. Footnotes on JIBS 1970-2016. Journal of International Business Studies, 48(9): 1037–1044.

    Google Scholar 

  67. Walker, R. M., Brewer, G. A., Lee, M. J., Petrovsky, N., & van Witteloostuijn, A. 2019. Best practice recommendations for replicating experiments in Public Administration. Journal of Public Administration Research and Theory, 29(4): 609–626.

    Google Scholar 

  68. Welch, C., Piekkari, R., Plakoyiannaki, E., & Paavilainen-Mäntymäki, E. 2011. Theorising from case studies: Towards a pluralist future for international business research. Journal of International Business Studies, 42(5): 740–761.

    Google Scholar 

  69. Westney, E., & Van Maanen, J. 2011. The casual ethnography of the executive suite. Journal of International Business Studies, 42(5): 602–607.

    Google Scholar 

  70. Wiklund, J., Wright, M., & Zahra, S. A. 2018. Conquering relevance: Entrepreneurship research’s grand challenge. Entrepreneurship Theory and Practice, 43(3): 419–436.

    Google Scholar 

Download references

ACKNOWLEDGEMENTS

We thank Christina Elsinga and Joeri van Hugten for their support, ten of our fellow JIBS Editors and the AIB Executive Board for their constructive comments, and Alain Verbeke for his attention to detail.

Author information


Corresponding author

Correspondence to Sjoerd Beugelsdijk.


APPENDIX: OVERVIEW OF SOCIAL SCIENCE JOURNAL POLICIES ON DATA TRANSPARENCY AND REPLICABILITY


American Economic Review (AER)

Policy: “It is the policy of the American Economic Association to publish papers only if the data used in the analysis are clearly and precisely documented and are readily available to any researcher for purposes of replication.”

“Authors of accepted papers that contain empirical work, simulations, or experimental work must provide, prior to publication, the data, programs, and other details of the computations sufficient to permit replication. These will be posted on the AEA website. The Editor should be notified at the time of submission if the data used in a paper are proprietary or if, for some other reason, the requirements above cannot be met.”

“For experimental papers, we have a more detailed policy, including requirements for submitted papers as well as accepted papers. We normally expect authors of experimental articles to supply the following supplementary materials (any exceptions to this policy should be requested at the time of submission): the original instructions [..] information about subject eligibility or selection [..] any computer programs, configuration files, or scripts used [..] the raw data from the experiment.”

Nature of the policy: Compulsory and full access

Source: https://www.aeaweb.org/journals/aer/about-aer/editorial-policy and https://www.aeaweb.org/journals/policies/data-availability-policy

American Journal of Political Science (AJPS); see also Quarterly Journal of Political Science for a similar policy

Policy: “The corresponding author of a manuscript that is accepted for publication in the American Journal of Political Science must provide replication materials that are sufficient to enable interested researchers to reproduce all of the analytic results that are reported in the text and supporting materials.”

“If a manuscript is tentatively accepted for publication, the replication materials will be verified to make sure that they do, in fact, reproduce all results that appear in the article and immediate supporting materials before final acceptance and publication. [..] If there are limitations or restrictions on data access or if an exception to the general replication policy will be requested for any reason, then the author should contact the AJPS Editor to explain the situation before submitting the manuscript. Exceptions to the AJPS replication policy will be granted at the discretion of the Editor.”

Nature of the policy: Compulsory and full access

Source: https://ajps.org/guidelines-for-manuscripts and https://ajps.org/ajps-replication-policy/

American Political Science Review (APSR)

Policy: “Researchers have an ethical responsibility to facilitate the validation of their evidence-based claims so that their work can be fully evaluated, including through reproduction and replication when appropriate, or by providing sufficient evidence and material to permit others to develop their own interpretation. This involves providing access to the data or evidence underlying their analysis, and achieving transparency in both the production and analysis of evidence. All relevant materials must be made available.”

Nature of the policy: Compulsory and full access

Source: http://www.apsanet.org/APSR-Submission-Guidelines

American Sociological Review (ASR)

Policy: “All persons who publish in ASA journals are required to abide by ASA guidelines and ethics codes regarding plagiarism and other ethical issues. This requirement includes adhering to ASA’s stated policy on data-sharing: Sociologists make their data available after completion of the project or its major publications, except where proprietary agreements with employers, contractors, or clients preclude such accessibility or when it is impossible to share data and protect the confidentiality of the data or the anonymity of research participants (e.g., raw field notes or detailed information from ethnographic interviews) (ASA Code of Ethics, 1997).”

Nature of the policy: Compulsory and full access

Source: https://uk.sagepub.com/en-gb/eur/american-sociological-review/journal201969#submission-guidelines

International Organization (IO); see also International Studies Quarterly for a similar policy

Policy: “At the time of final submission of accepted papers, IO requires authors of papers using quantitative data to provide the data set and accompanying command files to reproduce tables presented in the paper and any other specifications referenced in it (for example, results verbally described in the main text or in footnotes). Quantitative results will be replicated by IO staff and the paper will not begin the publication process until all results are confirmed. You do not need to send quantitative data until your article is accepted.”

Nature of the policy: Compulsory and full access

Source: https://www.cambridge.org/core/journals/international-organization/information/instructions-contributors

Journal of Applied Psychology (JAP)

Policy: “Authors are encouraged to make their data, materials, and/or preregistration plans and analyses publicly available, if possible, by providing a link to a third-party repository, such as APA’s own repository, in the author note and including the data citation in your reference list. Making your data and materials publicly available can increase the impact of your research, enabling future researchers to incorporate your work in model testing, replication projects, and meta-analyses, in addition to increasing the transparency of your research. APA’s data sharing policy does not require public posting, so it is at your discretion to decide what is best for your project in terms of public data, materials, and conditions on their use. Please note that APA policy does require authors to make their data available to other researchers upon request, per the APA Ethical Principles of Psychologists and Code of Conduct, as detailed in the section on Sharing Research Data for Verification.”

Nature of the policy: Upon request and full access

Source: https://www.apa.org/pubs/journals/apl/?tab=4

Journal of Finance (JF)

Policy: “Authors of accepted papers that contain empirical analysis, simulations, numerical computations, or experimental work must provide the programs needed for replication to the Journal of Finance. Authors are also encouraged to include the data along with the source code if public posting of the data does not violate copyright or confidentiality agreements. If the authors choose not to provide the data, they must include a pseudo-data set that illustrates the format of the files read by the code so that users can understand and check the functionality of the code. [..] Absent an exception granted by the Editor, accepted papers will be published only after the programs are received by the Journal, and code will be made available on the Journal website. [..] If an exemption is granted, it will be noted on the published paper that the authors have been granted an exemption from the code sharing policy.”

Nature of the policy: Compulsory and partial access

Source: https://c.ymcdn.com/sites/afajof.site-ym.com/resource/resmgr/files/Submission_docs/CodePolicy.pdf

Journal of International Economics (JIE)

Policy: “This journal requires and enables you to share data that supports your research publication where appropriate, and enables you to interlink the data with your published articles. Research data refers to the results of observations or experimentation that validate research findings. To facilitate reproducibility and data reuse, this journal also encourages you to share your software, code, models, algorithms, protocols, methods and other useful materials related to the project.”

Nature of the policy: Voluntary

Source: https://www.elsevier.com/journals/journal-of-international-economics/0022-1996?generatepdf=true

Journal of Politics (JOP)

Policy: “Authors of quantitative papers must submit their data and all associated replication files to the JOP Dataverse. [..] A final decision of “Accept” will not be provided until all relevant materials are submitted to Dataverse.”

Nature of the policy: Compulsory and full access

Source: http://www.thejournalofpolitics.org/conditional-accepts.html

Note: This table was developed based on an assessment of these journals in November 2018. Journals are listed in alphabetical order.


Cite this article

Beugelsdijk, S., van Witteloostuijn, A. & Meyer, K.E. A new approach to data access and research transparency (DART). J Int Bus Stud 51, 887–905 (2020). https://doi.org/10.1057/s41267-020-00323-z


Keywords

  • transparency
  • data access
  • knowledge accumulation
  • open science
  • research methods