In the field of evidence-based management, the academic-practice gap is well-known and undermines the transfer of scientific findings to evidence-based decision-making. In this paper, we introduce the practical approach of community-augmented meta-analysis (CAMA), which serves to ease cumulative evidence formation and dissemination in psychology. CAMA rests on the conviction that better and faster knowledge transfer requires an increase in the efficiency and quality of evidence integration, timely publication of results, and broad and easy accessibility. As a potential solution, we describe the platform PsychOpen CAMA, which enables the publication of and access to meta-analytic data. We continue with an empirical example that describes the implementation and presentation of a meta-analysis on gender differences in the intention to start a business using PsychOpen CAMA. Finally, we discuss the benefits and potential of publishing meta-analytic data on the platform, as well as current methodological and technical limitations.
1 Evidence-based management and meta-analyses
Since its conception, evidence-based management (EBMgt) has focused on using the best available evidence to inform decision-making in management practice (Rousseau 2006), as there is still a certain reluctance to base management decisions on scientific knowledge (Briner and Rousseau 2011). While meta-analysis has always been valued as a cumulative approach, the reproducibility of findings in management research has recently come under discussion (Bergh et al. 2017). In this regard, scholars have noted that a first step towards more valid and generalizable evidence in organizational psychology and management is the systematic use of research syntheses (Briner and Rousseau 2011). The availability of systematic reviews and meta-analyses also facilitates the collection and assessment of relevant information by practitioners and can thus play a crucial role in bridging the gap between research and organizational practice (Le et al. 2007).
While meta-analyses have long received strong attention in management research and the organizational sciences, and despite some well-received guidelines (Geyskens et al. 2009), descriptions of methodological procedures are often inadequate and reporting is often non-transparent (Aytug et al. 2012), which may limit validity and replicability. We therefore argue that the collaborative effort of cumulatively collecting, transparently publishing, assessing, and updating evidence in management science and organizational psychology can be improved to foster evidence-based decision-making. In the field of psychology, the platform PsychOpen CAMA (https://cama.psychopen.eu/) has been created as a tool that enables the research community to curate and update meta-analyses. CAMA is a tool for psychology and related fields and can also be of interest for management researchers whose research focuses on psychological concepts.
From this perspective, CAMA should be of interest for fields of management with a focus on human behavior in social institutions and organizations (Nicholson 1998). An obvious example is human resource management, which aims at diagnosing and developing competencies (Schaper 2004) and at effectively managing the human resources available within an organization (Combs et al. 2006). A popular topic in personnel psychology is the personality traits of entrepreneurs, such as the Big Five (Zhao and Seibert 2006) and risk-taking behavior (Stewart and Roth 2001), and their relevance for entrepreneurial success. For the long-term performance and development of organizations, reactions to changing environments as well as learning and adaptation processes also have to be considered (Chang et al. 2014).
In the present article, we provide an overview of the principles and functionalities of PsychOpen CAMA. In doing so, we hope to raise awareness of its potential use in management. In the following, we discuss requirements for open, reproducible, and current evidence. We start with the concept of FAIR (Findable, Accessible, Interoperable, and Re-usable) data, the necessity of evidence-aggregating infrastructures, and the rapid accumulation of research findings in recent years. We then introduce community-augmented meta-analysis (CAMA) as a concept that accounts for open data policies and for systematic and efficient updating at the same time, and present the platform PsychOpen CAMA. To illustrate its functionalities, we demonstrate its procedures using the empirical example of a recently published meta-analysis on gender differences in the intention to start a business (Steinmetz et al. 2021). The steps necessary to publish a meta-analysis on the platform are explained, and the outputs available on the user interface are demonstrated. Finally, benefits as well as technical and methodological limitations are discussed.
2 Open, reproducible and extendable meta-analyses
2.1 FAIR meta-analytic data and infrastructures
Although meta-analytic data are typically extracted from published primary studies and consist of study characteristics and summarized outcomes that are not subject to data protection concerns, meta-analyses in the organizational sciences often fail to meet common standards for transparent reporting (Schalken and Rietbergen 2017). In response, Lakens et al. (2016) argue for open meta-analytic data to make meta-analyses dynamic and reproducible. However, open data alone are not sufficient. Haddaway (2018) calls for open synthesis, that is, the application of open science principles to evidence synthesis. In their practical guide on how to conduct meta-analyses, Hansen et al. (2022) explicitly recommend open science reporting practices to allow the validation of results and the re-use of already coded data in subsequent meta-analyses, thereby contributing to cumulative science. To this end, they mention various templates, open science repositories, and even dynamic systems, including PsychOpen CAMA.
According to the principles of the Open Science Movement (Kraker et al. 2011), open syntheses should provide, in addition to coded data, information on the methodology in sufficient detail to allow verification and replication, open programming code and tools, as well as open access to all relevant information. This would allow the research community to replicate, re-use, and update meta-analyses more efficiently and prevent ambiguities and questionable research practices, as literature selection and data collection could rely on sufficient information on previous work. Research infrastructures are needed to facilitate the accumulation of evidence by fostering FAIR data sharing. FAIR data are readable both by machines and by humans (Schultes and Wittenburg 2019).
According to the FAIR principles, evidence syntheses require findability and accessibility of the data to optimize decision-making. To provide information in practical contexts, the comprehensibility of results is highly relevant. A graphical user interface (GUI) offering visualizations of meta-analytic results, including interpretation aids, can enable users without methodological expertise to get an overview of the evidence on a research question (Bosco et al. 2015). Plain Language Summaries (PLS) summarizing the existing evidence, in the tradition of Cochrane reviews (Langendam et al. 2013), can complement the GUI to make scientific knowledge accessible for decision-makers and the public.
The aim of interoperability is to enable machines and technical tools to understand and process new data automatically (Nilsson 2010). This can be achieved through common standards and consistency in data and metadata structures. Ideally, data are represented in a simple and reusable structure, and metadata describe the characteristics of a dataset (González Morales and Orrell 2018). Interoperability is especially relevant for evidence syntheses, as it accelerates the accumulation of evidence and the timely integration of new research findings into synthesized evidence. Therefore, a basic template for meta-analytic data and a set of metadata describing the characteristics of a meta-analytic dataset should serve as the foundation for interoperability.
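To make the idea of such a template concrete, the following sketch defines a minimal standardized record for one effect size together with dataset-level metadata. All field names here are hypothetical illustrations of a common, machine-readable structure, not the actual template used by PsychOpen CAMA.

```python
from dataclasses import dataclass, asdict

@dataclass
class EffectSizeRecord:
    """One row of a hypothetical standardized meta-analytic dataset."""
    report_id: str      # publication the effect size was extracted from
    study_id: str       # sample within the report
    outcome_id: str     # outcome within the study
    effect_size: float  # e.g., a correlation
    variance: float     # sampling variance of the effect size
    n: int              # sample size

# Dataset-level metadata describing the records (illustrative fields only)
metadata = {
    "title": "Gender differences in entrepreneurial intentions",
    "effect_size_metric": "correlation",
    "moderators": ["publication_year", "mean_age"],
}

record = EffectSizeRecord("R001", "S001", "O001", -0.12, 0.004, 250)
row = asdict(record)  # machine-readable dict, e.g., for export to CSV/JSON
```

Because every dataset follows the same structure and the metadata declare the effect size metric and available moderators, generic analysis functions can process any submitted dataset without dataset-specific code.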
For researchers interested in replicating meta-analyses or re-using data from a published meta-analysis, data access and a thorough documentation of the underlying methodology are crucial (Aguinis et al. 2011). Available analysis scripts and technical tools facilitate the replication of published results. Additionally, existing data resources can be used for novel purposes, such as subgroup analyses or the modification of methodological decisions, for example the estimation method or the selection of moderator variables in the model (Lakens et al. 2016). Data infrastructures adhering to the FAIR data principles thus have the potential to improve the efficiency of collaborative evidence collection and, at the same time, increase the usability and accessibility of information for decision-makers and the public.
2.2 Cumulative evidence collection and updating
Beyond the requirements of FAIR data and enabling infrastructures, it has to be considered that meta-analyses are only valid for a specific period of time (Créquit et al. 2016). Without additional electronic material, a meta-analysis represents the cumulative evidence on a research question up to a certain point in time and may quickly become outdated as soon as new findings from primary studies are published or new methodological or statistical procedures are developed (Shojania et al. 2007). An example that demonstrates the importance of open meta-analytic data and frequent updates, both for efficient evidence accumulation and for the validity and timeliness of meta-analytic results, is the replication and extension of a meta-analysis on family firm innovation (Block et al. 2022). As the meta-analytic data of the prior meta-analysis were not available, only 104 of the original 108 studies could be retrieved and the data had to be coded from scratch. At the same time, the replication served as the basis for a thorough extension: the study sample was updated and a new methodological approach was used. This had a significant impact on the results and changed the overall conclusion of the meta-analysis, underlining the relevance of the update.
For systematic reviews, an update is defined as a new edition of a published review. It can include new data, new methods, or new analyses. An update is recommended if the topic is still relevant and new methods or new studies have emerged that could potentially change the findings of the original review (Garner et al. 2016). For example, Cochrane reviews are required to be updated every two years (Shojania et al. 2007) and Campbell reviews within five years (Lakens et al. 2016). Shojania et al. (2007) assessed the survival time of 100 reviews and concluded that within two years, almost one-fourth of the reviews were already outdated. Créquit et al. (2016) examined the proportion of available evidence on lung cancer not covered by systematic reviews between 2009 and 2015, finding that, in all cases, at least 40% of treatments were missing. As the number of publications is continuously growing (Bastian et al. 2010), we can expect the survival times of reviews to become even shorter.
The ongoing accumulation of evidence informs researchers about the latest findings in a specific research area, for example, whether results are robust enough to no longer justify further research investment, at least without taking existing results and specific research gaps into account. A systematic review of cumulative meta-analyses (Clarke et al. 2014) reports many illustrative examples, underscoring the high relevance of cumulative research for more informed decisions and, at the same time, a more efficient allocation of research funds and efforts.
Beyond the ongoing accumulation and synthesis of evidence, the next goal is to publish cumulative meta-analytic evidence effectively, so as to facilitate cumulative research synthesis and provide the best evidence for practical decision-making. The key challenge for the publication of meta-analyses is therefore to make preexisting research reproducible and to allow meta-analyses to be updated by reusing the information collected up to the most recent meta-analysis, thereby keeping pace with the continuous publication of research findings. To transform meta-analyses into transparent and dynamic resources, data and methods have to be reported in a standardized and open manner. In addition, programming code, interactive tools on a GUI, and PLS can improve the accessibility and comprehensibility of the accumulated evidence. In the following, a concept for a publication format that enables reproducible and dynamic meta-analyses is presented.
2.3 The concept of community-augmented meta-analyses
A concept for a publication format for comprehensive, dynamic, and up-to-date evidence synthesis already exists. Community-augmented meta-analysis (CAMA; Tsuji et al. 2014) is a combination of an open repository for meta-analytic data and an interface offering meta-analytic analysis tools. Depending on their focus, the conceptualization and labeling of systems based on similar ideas differ across scholars. For example, Créquit et al. (2016) call for living systematic reviews, that is, high-quality online summaries that are continuously updated. Similarly, Haddaway (2018) proposes open synthesis. Some authors also speak of dynamic (Bergmann et al. 2018) or cloud-based meta-analysis (Bosco et al. 2015). Braver et al. (2014) describe an approach called continuously cumulating meta-analysis (CCMA) that incorporates and evaluates new replication attempts within existing meta-analyses.
The basis of a CAMA system, as shown in Fig. 1, is the data repository, where meta-analytic data contributions from researchers in specific research areas are stored. It serves as a dynamic resource and can be used and augmented by the research community to keep the state of research updated and accumulate knowledge continuously. Tools to replicate and modify analyses with these data are accessible via an open web-based platform, usually encompassing a graphical user interface. For example, users may examine moderator effects beyond the analyses presented in the original meta-analysis. The available evidence from the meta-analyses archived in a CAMA can also be used to improve study planning: estimates of the expected size of an effect can serve as input for power analyses, and the examination of potentially relevant moderators can help to identify research gaps and guide the design of new studies (Tsuji et al. 2014).
The meta-analytic data in a CAMA system have to follow certain standards to ensure interoperability with the functionalities of the GUI. This means that analysis outputs can be requested by users and are automatically available on the GUI for each dataset, as the underlying analysis functions understand the standardized data. The platform thus serves as a dynamic resource that enables the research community to keep the state of research updated and to accumulate knowledge continuously by providing a common language for the data. The role of the infrastructure provider is to set the standards for the data submitted to the repository and to store the data according to these standards. In sum, a CAMA adheres to the requirements of FAIR data by making research results findable, complete datasets accessible, and data and analysis scripts interoperable, and thus by making data reusable (Wu et al. 2019).
3 A platform for meta-analyses in organizational psychology
3.1 PsychOpen CAMA as platform for meta-analyses in psychology
PsychOpen CAMA is a tool that serves the psychological research community by encompassing meta-analyses of different study types and from different areas of psychology. The service renders meta-analytic findings easily accessible, re-usable, and expandable for the research community (Burgard et al. 2022). It is provided by the Leibniz Institute for Psychology (ZPID), a Public Open Science Institute for psychology.
PsychOpen CAMA serves as an open repository for meta-analytic data and provides basic analysis tools (Tsuji et al. 2014), such as typical graphical devices and multilevel meta-regressions. The basic system has already been tested by the first data providers of the current datasets (e.g., Bucher et al. 2020) and is freely available to the research community. Currently available features include a data overview with summary statistics, graphs for data exploration, and basic meta-analytic outputs such as forest plots, funnel plots, and meta-regression. A responsive interface allows users to account for dependencies in the data by using multilevel models (van den Noortgate et al. 2013), as well as to examine potentially relevant moderator variables. Furthermore, advanced meta-analytic tools such as p-curve analyses and power estimates are available to draw conclusions for further study planning or about the reliability of the meta-analytic evidence.
The basic system of PsychOpen CAMA is depicted in Fig. 2. Meta-analytic data are standardized according to a template and stored in a self-managed R package including generic meta-analytic functions. These functions can understand and analyze the datasets using metadata. The functions for the meta-analytic calculations and visualizations in the self-managed R package are mainly based on the R package metafor (Viechtbauer 2010). The self-managed package is accessible and documented in a Git repository under a GPL-3.0 license: https://github.com/leibniz-psychology/PsychOpen-CAMA-R-package.
In the web application, users can choose a dataset and request an analysis output. The requests are forwarded to an OpenCPU server (https://www.opencpu.org/), where the analyses are executed using the data and functions from the R package. Executing the analyses on the OpenCPU server ensures the scalability of the application, which is of particular relevance for a research infrastructure that covers a wide range of possible research areas and potentially reaches many users. The resulting outputs are embedded in the web application and thus displayed to the user.
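The request flow can be illustrated with a minimal sketch. OpenCPU exposes the R functions of an installed package over HTTP under `/ocpu/library/{package}/R/{function}`; a POST request with the function arguments as form fields executes the call. The package and function names below are assumptions for illustration, not the actual names used by PsychOpen CAMA.

```python
def opencpu_endpoint(base_url: str, package: str, function: str) -> str:
    """Build the OpenCPU HTTP endpoint for calling an R function.

    OpenCPU maps R functions of an installed package to URLs of the
    form /ocpu/library/{package}/R/{function}.
    """
    return f"{base_url.rstrip('/')}/ocpu/library/{package}/R/{function}"

# Hypothetical example: server, package, and function names are
# illustrative assumptions.
url = opencpu_endpoint("https://example.org", "PsychOpenCAMA", "forest_plot")
payload = {"dataset": "CAMA_Business", "moderator": "construct"}
# A client would now send: POST <url> with <payload> as form data and
# retrieve the rendered output (e.g., a plot) from the session URL
# returned by OpenCPU.
```

Because the R functions are stateless and the server handles each request independently, additional OpenCPU instances can be added to scale with user demand.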
PsychOpen CAMA allows researchers to use the data available on its website, either by replicating meta-analytic results in the application or by downloading the data for further analyses. For this purpose, the standardized datasets for PsychOpen CAMA are available under a CC-BY 4.0 license in PsychArchives (https://www.psycharchives.org/). To enable tracking of the use of PsychOpen CAMA and the corresponding data, users have to adhere to the citation policy: researchers publishing or presenting work using data from PsychOpen CAMA must cite the original publications of the corresponding datasets, as well as a publication on PsychOpen CAMA (Burgard et al. 2022).
There is a similar project in the field of management and applied psychology called metaBUS (www.metaBUS.org); the main differences between the two systems are briefly outlined in the following. MetaBUS is an open search engine based on a hierarchical taxonomy of the field and provides a database consisting of correlations between clearly defined concepts within this taxonomy (Bosco et al. 2020). This approach differs substantially from PsychOpen CAMA, which is based on single meta-analyses. The effect sizes of interest in metaBUS are correlations, collected via a semi-automated matrix extraction protocol with trained coders supervising the process. For PsychOpen CAMA, it is planned to make use of crowdsourcing and of synergies with other ZPID services. MetaBUS, in contrast, relies exclusively on trained and paid coders, as crowdsourcing efforts have not yet paid off due to the difficulty of motivating and training potential collaborators (Bosco et al. 2020). Finally, in contrast to the architecture of PsychOpen CAMA with a server and a web application, metaBUS relies on R Shiny for the graphical user interface (Bosco et al. 2015).
3.2 Practical example with data from organizational psychology
PsychOpen CAMA currently includes 21 datasets (March 2022). In the following, we demonstrate the platform in detail using one of these datasets, originally analyzed and published in a meta-analysis on gender differences in the intention to start a business (Steinmetz et al. 2021) based on the theory of planned behavior (TPB; Ajzen 1991). The codebook and data are available in PsychArchives: https://doi.org/10.23668/psycharchives.5264. The outputs presented in the following are easily reproducible in the user interface of PsychOpen CAMA: https://cama.psychopen.eu/inspection/CAMA_Business.
For the original meta-analysis, correlational data from 119 reports including 129 unique samples were collected. The effect sizes of interest were correlations between gender and TPB variables or among TPB variables. In addition, secondary data on cultural dimensions and economic data for the respective time and place of each study were matched to the correlational data. Multilevel random-effects meta-analyses were conducted for all bivariate correlations between gender and the four TPB variables. Furthermore, a meta-analytic structural equation model (MASEM) was specified and computed. The relevance of the cultural and economic context for gender differences in the intention to start a business was assessed by regressing the respective correlations on cultural and economic variables.
The functionalities in PsychOpen CAMA do not yet support MASEM. Therefore, the data were restricted to correlations of TPB constructs with gender, resulting in 70 studies with 205 effect sizes. The data were standardized according to the general template for PsychOpen CAMA datasets. The resulting CAMA dataset is described in Table 1. For illustrative reasons, it is restricted to a sparse set of variables. IDs for the report, the study, and the outcome represent the hierarchical structure of the data. The 205 effect sizes in the PsychOpen CAMA dataset stem from 68 reports including 70 unique studies. The reports were published between 1996 and 2019, and only 13 effect sizes are derived from reports that were not published in a peer-reviewed journal. The effect sizes of interest are correlations. For the meta-analytic calculations, the sample size and the variance corresponding to each effect size are also given.
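The sampling variance attached to each correlation can be computed from the correlation and the sample size alone. As an illustration only, the following sketch shows two standard large-sample approximations commonly used for correlational effect sizes (and available, for instance, in the metafor package): the variance of a raw correlation and the Fisher z transformation with its variance.

```python
import math

def variance_of_r(r: float, n: int) -> float:
    """Large-sample sampling variance of a raw correlation r."""
    return (1 - r**2) ** 2 / (n - 1)

def fisher_z(r: float, n: int):
    """Fisher z transform of r and its sampling variance 1/(n - 3)."""
    return math.atanh(r), 1 / (n - 3)

# Illustrative values (not taken from the dataset described above)
v = variance_of_r(0.5, 103)   # (1 - 0.25)^2 / 102 ≈ 0.0055
z, vz = fisher_z(0.5, 103)    # variance of z is exactly 1/100 = 0.01
```

The Fisher z scale is often preferred for pooling because its variance depends only on the sample size, not on the (estimated) correlation itself.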
The dataset contains correlations between gender and four TPB constructs, namely attitude (47 correlations), intention (65 correlations), perceived behavioral control (61 correlations), and subjective norm (32 correlations). The distribution of the correlations for each of these constructs can be displayed in PsychOpen CAMA under “Data exploration”, where grouped violin plots (Fig. 3) can be requested. For each subgroup defined by the categorical variable on the x-axis, the distribution of the effect sizes is depicted. The horizontal lines within each violin plot divide the outcomes into quintiles. For example, for the correlations between gender and intentions, only the highest quintile of the correlations is positive, and about 20% of the correlations are below -0.2. The same type of output is produced when selecting the country in which a study was conducted, the cultural cluster, or the type of sample.
Choosing continuous moderator variables results in scatterplots with the correlation on the y-axis and the moderator of interest on the x-axis. If two continuous moderators are selected, a scatterplot matrix is plotted. Figure 4 displays an example of a scatterplot matrix including the correlation between gender and a TPB construct, publication year, and the mean age of the sample. The association between the correlation and publication year is positive, suggesting that more recent studies report more gender equality concerning the TPB constructs. The mean age of the sample and the correlations are negatively related, meaning that older samples yield less egalitarian outcomes.
In PsychOpen CAMA, multilevel random-effects meta-analyses of the correlations can be conducted. The interface also allows users to select up to two moderators for a meta-regression model. Figure 5 depicts the results of a meta-regression model in which the correlations were regressed on the TPB construct they concern as well as on publication year. In the multilevel model, the variation attributed to each analysis level is estimated. Thus, the output reports the estimated variance and the corresponding standard error between the 70 studies and within the studies. The test for residual heterogeneity tests the null hypothesis that the underlying true effect size parameters are the same in all included studies, meaning that the variation between the effect sizes is due to sampling variance alone. A statistically significant test statistic Q means that the null hypothesis is rejected and statistical heterogeneity is present.
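The logic of such a heterogeneity test can be illustrated with a deliberately simplified, single-level sketch (the platform itself fits multilevel models): an inverse-variance weighted mean, Cochran's Q, and the DerSimonian-Laird estimate of the between-study variance τ². This is a textbook approximation, not the REML-based multilevel estimation used by metafor.

```python
def q_and_tau2(effects, variances):
    """Cochran's Q and DerSimonian-Laird tau^2 for a single-level model.

    A simplified sketch of heterogeneity estimation; multilevel models
    additionally partition the heterogeneity across analysis levels.
    """
    w = [1 / v for v in variances]              # inverse-variance weights
    mu = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - mu) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)               # truncated at zero
    return mu, q, tau2

# Toy data: three effects with equal sampling variances
mu, q, tau2 = q_and_tau2([0.1, 0.3, 0.5], [0.01, 0.01, 0.01])
# mu = 0.3; Q = 8.0 on 2 df, exceeding the chi-square critical value
# of 5.99, so the homogeneity hypothesis would be rejected; tau2 = 0.03
```

Under the null hypothesis of homogeneity, Q follows a chi-square distribution with k − 1 degrees of freedom, which is exactly the test reported in the PsychOpen CAMA output.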
The model results provide the meta-analytic estimates. For the intercept, this is the estimated weighted mean correlation between attitudes and gender at the mean publication year. As gender was coded 0 for males and 1 for females, the results indicate a lower attitude towards starting a business among females. The estimates for the other three TPB constructs differ significantly from the estimate for attitudes. Furthermore, a more recent publication year is associated with larger correlations, implying that females have become more inclined to start a business over time.
In addition to the model results, basic meta-analytic graphical displays are provided in PsychOpen CAMA. One of these, the contour-enhanced funnel plot, is depicted in Fig. 6. It takes the statistical significance of the outcomes into account when evaluating potential publication bias. The contour-enhanced funnel plot is centered at 0, and the differently colored regions indicate levels of statistical significance. Findings within the white region are not significant; an asymmetry in this region would therefore indicate potential publication bias, as small studies with non-significant results are expected to remain unpublished (Peters et al. 2008). The funnel plot in Fig. 6 does not provide evidence of publication bias.
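The significance contours of such a plot follow directly from the ratio of each effect size to its standard error. As a sketch, the following hypothetical helper assigns an outcome to the region it would fall into, using the familiar two-sided thresholds 1.96 (p < .05) and 2.576 (p < .01).

```python
def funnel_region(effect: float, se: float) -> str:
    """Assign an effect size to a contour-enhanced funnel plot region.

    Regions correspond to two-sided significance levels; the white
    region of the plot holds results with p > .05.
    """
    z = abs(effect / se)
    if z < 1.96:
        return "p > .05 (white region)"
    elif z < 2.576:
        return ".01 < p < .05"
    else:
        return "p < .01"

funnel_region(0.10, 0.10)  # z = 1.0 -> "p > .05 (white region)"
funnel_region(0.30, 0.10)  # z = 3.0 -> "p < .01"
```

If small studies cluster just inside the significance contours while the non-significant white region is sparse, selective publication is a plausible explanation for the asymmetry.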
Whereas in the original study a meta-analytic structural equation model (MASEM; Cheung 2015) was used to examine the specific effects of attitudes, subjective norms, and perceived behavioral control on the intention to start a business, this analysis cannot be reproduced in PsychOpen CAMA. The data structure and analysis tools of PsychOpen CAMA do not yet enable the replication of a MASEM in the user interface. A web application for one-stage MASEM already exists: webMASEM (Jak et al. 2021). To implement MASEM functionalities in PsychOpen CAMA, specific data templates and the inclusion of specific analysis functions in the R package of PsychOpen CAMA would be needed.
Despite these methodological limitations, PsychOpen CAMA provides features and opportunities that go beyond a printed article. A study planning tool provides a power plot for a hypothetical further study, presuming the meta-analytic estimate as the true underlying effect size. It allows users to estimate the sample size necessary for a desired level of statistical power. The implementation and publication of the meta-analytic data also facilitate further extension of the dataset: it can be downloaded from PsychArchives and, using the corresponding codebook, new data can be added to the existing dataset and resubmitted to PsychArchives.
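The logic behind such a power calculation can be sketched for a correlation via the Fisher z transformation. This is a standard textbook approximation, not necessarily the exact procedure implemented in the PsychOpen CAMA study planning tool.

```python
import math
from statistics import NormalDist

def n_for_correlation(r: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate sample size to detect correlation r (two-sided test).

    Uses the Fisher z approximation:
    n = ((z_{1-alpha/2} + z_{power}) / z_r)^2 + 3.
    """
    z_r = math.atanh(r)                        # Fisher z of the assumed effect
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, e.g. 1.96
    z_b = NormalDist().inv_cdf(power)          # power quantile, e.g. 0.84
    return math.ceil(((z_a + z_b) / z_r) ** 2 + 3)

n_for_correlation(0.3)  # 85 participants for 80% power at alpha = .05
```

Taking the meta-analytic estimate as the assumed true effect, as the study planning tool does, makes new studies less likely to be underpowered than studies planned around a single, possibly inflated, primary-study estimate.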
4 Benefits and limitations of PsychOpen CAMA
To conclude, PsychOpen CAMA provides a platform for psychological meta-analyses that makes data and analyses easily accessible, re-usable, and expandable. It adheres to the FAIR data principles, improving the potential for collaborative evidence collection as well as the usability and accessibility of information for decision-makers and the public. As such, it can also serve as a resource to foster the use of meta-analytic evidence in organizational psychology.
The practical example from organizational psychology demonstrates the usefulness of publishing meta-analytic data in PsychOpen CAMA to make it accessible and usable. However, PsychOpen CAMA is not yet suitable for advanced meta-analytic methods; therefore, the MASEM analyses from the original study could not be replicated with PsychOpen CAMA. Further methodological extensions, such as network meta-analyses (Nikolakopoulou et al. 2018) or the use of available individual-level data within meta-analyses (Pigott et al. 2012), would be desirable in the future.
Another limitation of PsychOpen CAMA is the limited automation of data collection and extraction for extending meta-analytic evidence. The continuous maintenance of the data repository is labor-intensive. Crowdsourcing could be a solution (McCarthy and Chartier 2017); yet it depends on the willingness of the research community to provide relevant data in the desired format. The goal is to support users in the submission of data and to automate repetitive processes as far as possible. However, at least for the monitoring of these processes, plausibility checks, and corrections of erroneous entries, manual effort cannot fully be replaced. The long-term goal of PsychOpen CAMA is to keep pace with the publication of scientific results, at least in some domains and for hot topics in psychology.
Availability of data and material
Data are available via PsychArchives (https://doi.org/10.23668/psycharchives.5264) and analyses are replicable in PsychOpen CAMA (https://cama.psychopen.eu/inspection/CAMA_Business).
Aguinis H, Pierce CA, Bosco FA, Dalton DR, Dalton CM (2011) Debunking myths and urban legends about meta-analysis. Organ Res Methods 14(2):306–331. https://doi.org/10.1177/1094428110375720
Ajzen I (1991) The theory of planned behavior. Organ Behav Hum Decis Process 50(2):179–211. https://doi.org/10.1016/0749-5978(91)90020-T
Aytug ZG, Rothstein HR, Zhou W, Kern MC (2012) Revealed or concealed? Transparency of procedures, decisions, and judgment calls in meta-analyses. Organ Res Methods 15(1):103–133. https://doi.org/10.1177/1094428111403495
Bastian H, Glasziou P, Chalmers I (2010) Seventy-five trials and eleven systematic reviews a day: how will we ever keep up? PLoS Med 7(9):e1000326. https://doi.org/10.1371/journal.pmed.1000326
Bergh DD, Sharp BM, Aguinis H, Li M (2017) Is there a credibility crisis in strategic management research? Evidence on the reproducibility of study findings. Strateg Organ 15(3):423–436. https://doi.org/10.1177/1476127017701076
Bergmann C, Tsuji S, Piccinini PE, Lewis ML, Braginsky M, Frank MC, Cristia A (2018) Promoting replicability in developmental research through meta-analyses: insights from language acquisition research. Child Dev 89(6):1996–2009. https://doi.org/10.1111/cdev.13079
Block J, Hansen C, Steinmetz H (2022) Are family firms doing more innovation output with less innovation input? A replication and extension. Entrepren Theory Pract.
Bosco F, Steel P, Oswald F, Uggerslev K, Field J (2015) Cloud-based meta-analysis to bridge science and practice: welcome to metaBUS. Person Assess Decis. https://doi.org/10.25035/pad.2015.002
Bosco FA, Field JG, Larsen KR, Chang Y, Uggerslev KL (2020) Advancing meta-analysis with knowledge-management platforms: using metaBUS in psychology. Adv Methods Pract Psychol Sci 3(1):124–137. https://doi.org/10.1177/2515245919882693
Braver SL, Thoemmes FJ, Rosenthal R (2014) Continuously cumulating meta-analysis and replicability. Perspect Psychol Sci 9(3):333–342. https://doi.org/10.1177/1745691614529796
Briner RB, Rousseau DM (2011) Evidence-based I-O psychology: not there yet. Ind Organ Psychol Perspect Sci Pract 4(1):3–22. https://doi.org/10.1111/j.1754-9434.2010.01287.x
Bucher L, Tran US, Prinz GM, Burgard T, Bosnjak M, Voracek M (2020) Keeping meta-analyses alive and well: using PsychOpenCAMA to implement a community-augmented meta-analysis on the Dark Triad of personality. Leibniz Institut für Psychologische Information und Dokumentation (ZPID). https://doi.org/10.23668/PSYCHARCHIVES.2752
Burgard T, Bosnjak M, Studtrucker R (2022) PsychOpen CAMA: Publication of community-augmented meta-analyses in psychology. Res Synth Methods 13(1):134–143. https://doi.org/10.1002/jrsm.1536
Chang W, Franke G, Butler T, Musgrove C, Ellinger A (2014) Differential mediating effects of radical and incremental innovation on market orientation-performance relationship: a meta-analysis. J Market Theory Pract 22(3):235–250. https://doi.org/10.2753/MTP1069-6679220301
Cheung M (2015) Meta-analysis. A structural equation modeling approach. John Wiley & Sons, The Atrium
Clarke M, Brice A, Chalmers I (2014) Accumulating research: a systematic account of how cumulative meta-analyses would have provided knowledge, improved health, reduced harm and saved resources. PLoS One 9(7):e102670. https://doi.org/10.1371/journal.pone.0102670
Combs J, Hall A, Ketchen D (2006) How much do high-performance work practices matter? A meta-analysis of their effects on organizational performance. Pers Psychol 59:501–528. https://doi.org/10.1111/j.1744-6570.2006.00045.x
Créquit P, Trinquart L, Yavchitz A, Ravaud P (2016) Wasted research when systematic reviews fail to provide a complete and up-to-date evidence synthesis: the example of lung cancer. BMC Med. https://doi.org/10.1186/s12916-016-0555-0
Garner P, Hopewell S, Chandler J, MacLehose H, Akl EA, Beyene J, Chang S et al (2016) When and how to update systematic reviews: consensus and checklist. BMJ 354:i3507. https://doi.org/10.1136/bmj.i3507
Geyskens I, Krishnan R, Steenkamp J-BEM, Cunha PV (2009) A review and evaluation of meta-analysis practices in management research. J Manag 35(2):393–419. https://doi.org/10.1177/0149206308328501
González Morales L, Orrell T (2018) Data interoperability: a practitioner’s guide to joining up data in the development sector. Retrieved from: http://docplayer.net/100317533-Data-interoperability-a-practitioner-s-guide-to-joining-up-data-in-the-development-sector.html
Haddaway NR (2018) Open synthesis: On the need for evidence synthesis to embrace open science. Environmental Evidence 7(1):4–8. https://doi.org/10.1186/s13750-018-0140-4
Hansen C, Steinmetz H, Block J (2022) How to conduct a meta-analysis in eight steps: a practical guide. Manag Rev Quart 72:1–19. https://doi.org/10.1007/s11301-021-00247-4
Jak S, Li H, Kolbe L, de Jonge H, Cheung MW (2021) Meta-analytic structural equation modeling made easy: a tutorial and web application for one-stage MASEM. Res Synth Methods. https://doi.org/10.1002/jrsm.1498
Kraker P, Leony D, Reinhardt W, Beham G (2011) The case for an open science in technology enhanced learning. Int J Technol Enhan Learn 3(6):643–654. https://doi.org/10.1504/IJTEL.2011.045454
Lakens D, Hilgard J, Staaks J (2016) On the reproducibility of meta-analyses: six practical recommendations. BMC Psychol 4(1):1–10. https://doi.org/10.1186/s40359-016-0126-3
Langendam MW, Akl EA, Dahm P, Glasziou P, Guyatt G, Schünemann HJ (2013) Assessing and presenting summaries of evidence in cochrane reviews. Syst Rev 2:81. https://doi.org/10.1186/2046-4053-2-81
Le H, Oh I, Shaffer J, Schmidt F (2007) Implications of methodological advances for the practice of personnel selection: how practitioners benefit from meta-analysis. Acad Manag Perspect 21(3):6–15. http://www.jstor.org/stable/27747386
McCarthy RJ, Chartier CR (2017) Collections: using “crowdsourcing” within psychological research. Collabra Psychol 3(1):26
Nicholson N (1998) Seven deadly syndromes of management and organization: the view from evolutionary psychology. Manag Decis Econ 19:411–426. https://www.jstor.org/stable/3108122
Nikolakopoulou A, Mavridis D, Egger M, Salanti G (2018) Continuously updated network meta-analysis and statistical monitoring for timely decision-making. Stat Methods Med Res 27(5):1312–1330. https://doi.org/10.1177/0962280216659896
Nilsson M (2010) From interoperability to harmonization in metadata standardization: designing an evolvable framework for metadata harmonization. Doctoral Thesis, Stockholm, Sweden. Retrieved from: https://www.diva-portal.org/smash/get/diva2:369527/FULLTEXT02.pdf
Peters JL, Sutton AJ, Jones DR, Abrams KR, Rushton L (2008) Contour-enhanced meta-analysis funnel plots help distinguish publication bias from other causes of asymmetry. J Clin Epidemiol 61(10):991–996. https://doi.org/10.1016/j.jclinepi.2007.11.010
Pigott T, Williams R, Polanin J (2012) Combining individual participant and aggregated data in a meta-analysis with correlational studies. Res Syn Methods 3:257–268. https://doi.org/10.1002/jrsm.1051
Rousseau DM (2006) Is there such a thing as “evidence based management”? Acad Manag Rev 31:256–269. https://doi.org/10.5465/amr.2006.20208679
Schalken N, Rietbergen C (2017) The reporting quality of systematic reviews and meta-analyses in industrial and organizational psychology: a systematic review. Front Psychol. https://doi.org/10.3389/fpsyg.2017.01395
Schaper N (2004) Theoretical substantiation of human resource management from the perspectives of work and organisational psychology. Manag Revue 15(2), Special Issue: Theoretical Perspectives for Human Resource Management: The German Discussion, 192–200. https://www.jstor.org/stable/41783464
Schultes E, Wittenburg P (2019) FAIR principles and digital objects: accelerating convergence on a data infrastructure. In: Manolopoulos Y, Stupnikov S (eds) Data analytics and management in data intensive domains. DAMDID/RCDL 2018. Communications in Computer and Information Science, 1003. Springer, Cham. https://doi.org/10.1007/978-3-030-23584-0_1
Shojania KG, Sampson M, Ansari MT, Ji J, Doucette S, Moher D (2007) How quickly do systematic reviews go out of date? A survival analysis. Ann Intern Med 147(4):224–233. https://doi.org/10.7326/0003-4819-147-4-200708210-00179
Steinmetz H, Isidor R, Bauer C (2021) Gender differences in the intention to start a business: an updated and extended meta-analysis. Zeitschrift Für Psychologie 229(1):70–84. https://doi.org/10.1027/2151-2604/a000435
Stewart W, Roth P (2001) Risk propensity differences between entrepreneurs and managers: a meta-analytic review. J Appl Psychol 86(1):145–153. https://doi.org/10.1037/0021-9010.86.1.145
Tsuji S, Bergmann C, Cristia A (2014) Community-augmented meta-analyses: toward cumulative data assessment. Perspect Psychol Sci 9(6):661–665. https://doi.org/10.1177/1745691614552498
Van den Noortgate W, López-López JA, Marín-Martínez F, Sánchez-Meca J (2013) Three-level meta-analysis of dependent effect sizes. Behav Res Methods 45(2):576–594. https://doi.org/10.3758/s13428-012-0261-6
Viechtbauer W (2010) Conducting meta-analyses in R with the metafor package. J Stat Softw 36(3):1–48. https://doi.org/10.18637/jss.v036.i03
Wu M, Psomopoulos F, Khalsa SJ, de Waard A (2019) Data discovery paradigms: user requirements and recommendations for data repositories. Data Sci J 18(3):1–13. https://doi.org/10.5334/dsj-2019-003
Zhao H, Seibert S (2006) The big five personality dimensions and entrepreneurial status: a meta-analytic review. J Appl Psychol 91:259–271. https://doi.org/10.1037/0021-9010.91.2.259
Funding
Open Access funding enabled and organized by Projekt DEAL.
Conflict of interest
The authors report no conflict of interest.
Cite this article
Burgard, T., Steinmetz, H. Evidence in management science related to psychology: benefits, tools, and an example of a community-augmented meta-analysis. Manag Rev Q (2022). https://doi.org/10.1007/s11301-022-00270-z
Keywords
- Cumulative evidence
- Organizational psychology
- Open science
- Evidence-based management