The joint meeting of the International Primatological Society and the American Society of Primatologists in Chicago in 2016 provided an opportunity to discuss and update the policies of the International Journal of Primatology, the official journal of the International Primatological Society. As a result, we have made several changes and clarifications to journal policy. Most of these are to improve transparency. Scientific progress requires transparency and open communication among scientists. However, an emphasis on innovation, combined with insufficient and selective reporting of methods and results, impedes progress. In this editorial we clarify our policies on replication, reproducibility, null results, statistical reporting, and methods validation. We have updated the Instructions for Authors and introduced badges to acknowledge open science. We also take this opportunity to summarize other changes to the International Journal of Primatology.

Replication

It is the policy of the International Journal of Primatology to encourage sound research that addresses new questions and ideas in primatology. We also encourage studies designed to assess the validity and generality of previously reported empirical studies. In other words, we encourage replication studies (new tests of existing ideas: Endler 2015) when new, robust data sets become available for a species or question for which existing data are limited.

Replication of findings is an essential component of scientific progress. Replication and quantitative synthesis (e.g., meta-analysis) allow researchers to assess the validity of findings from individual studies and to probe their generality. Replication can be exact, partial, or conceptual (Kelly 2006). Exact replication aims to duplicate an earlier study completely, and can never be attained perfectly. This is particularly the case for primatology, where many studies concern individual study subjects, groups, or populations living in a particular location during a particular period. Close replication, however, is possible, e.g., by using the same methods at the same site but at a different time, or by partitioning long-term data sets (Nakagawa and Parker 2015). Partial replications lie on a continuum from close replication to studies with some methodological differences. Conceptual replication involves a distinctly different study, with very different methods, evaluating the same hypothesis. Partial and conceptual replications may also take the form of quasi-replications, which expand the scope of study to a new species or system (Nakagawa and Parker 2015; Palmer 2000) and allow us to understand the generalizability of findings across species, and to examine factors underlying species differences. The different levels of replication involve a trade-off between testing the validity of findings and the scope of generality (Nakagawa and Parker 2015). We encourage all forms of replication, as well as meta-analyses to synthesize findings.

A policy of encouraging replication raises the question of when a finding becomes common knowledge. In our view, this occurs when observations are basic descriptions of the natural history of a well-studied species and have previously been made by dozens of researchers, for example, confirmation of the diet or activity budget of a well-described species. Descriptive papers reporting such characteristics may still be valuable for little-known species, but may not be sufficiently novel for publication in the International Journal of Primatology if reported for well-described species.

Reproducibility

It is the policy of the International Journal of Primatology to strongly encourage public archiving of all data required to repeat the analyses presented. These are the data required to support the claims made in the publication, not the entire data set, although there are also good reasons to archive additional data (see Caetano and Aisenberg 2014).

It is the policy of the International Journal of Primatology to strongly encourage public archiving of analysis code if statistics were not run in widely available statistics programs.

It is the policy of the International Journal of Primatology to require that users of archived data cite both the data and the original publication. In addition, users must ensure that they read the data correctly, know the strengths and weaknesses of the data set, and have details of all methods and data management required for a meaningful interpretation of new analyses or reanalyses. While the ideal of data archiving is that other researchers should be able to understand the data as they are archived, users should contact the original authors about their planned analyses and ask for their advice.

Sharing the evidence for claims, including data, details of methods, and computer code is good scientific practice and facilitates evaluation, interpretation, critique, extension, synthesis, and application (Borries et al. 2016). We aim to promote a data-sharing culture and to move our field toward mandatory data archiving. Data archiving in a public repository is required by many journals in ecology and evolutionary biology, among other disciplines (see the Joint Data Archiving Policy: http://datadryad.org/pages/jdap). It is not yet a requirement of any primatological journal. Data not stored in a repository are lost rapidly (Vines et al. 2013). This is particularly problematic for primatology, given the difficulties involved in replication, and the threats to our study species, which make some data truly irreplaceable. Archiving data in a repository is also more reliable than providing them as supplementary information.

Sharing data brings benefits to the individual researcher, as well as to the scientific community (Caetano and Aisenberg 2014). For example, data archiving promotes careful and efficient data organization, and provides a stable backup. Sharing data also facilitates collaboration and promotes discoverability of research. Researchers are often concerned that other researchers may perform and publish an analysis before the original authors. However, embargo periods alleviate such concerns. Moreover, reviewers and editors can detect the use of data without permission or attribution.

Examples of public, open-access repositories include the Open Science Framework (https://osf.io/) and the various Dataverse networks. http://re3data.org/ and https://biosharing.org/ list numerous other qualifying data and materials repositories.

We expect authors to attend to the FAIR principles when archiving data:

  • Data should be Findable.

  • Data should be Accessible.

  • Data should be Interoperable.

  • Data should be Reusable.

See https://www.force11.org/group/fairgroup/fairprinciples for details.

Authors submitting a manuscript to the International Journal of Primatology must indicate whether they will make their data available to other researchers in the “Data Availability” section of the manuscript. There are circumstances in which it is not possible or advisable to share any or all data and materials publicly, including human participants’ data or the location of Endangered species. Authors may include an explanation of such circumstances in their manuscript.

Our data availability policies align with Research Data Policy Type 3 (for life sciences) of our publisher Springer Nature’s standardized research data policies. Type 3 journals encourage data sharing and require statements of data availability (see go.nature.com/2by6l6x).

Null Results

It is the policy of the International Journal of Primatology to publish scientifically rigorous research, including research that does not reject the null hypothesis (often termed “negative” results), provided the analyses include adequate reporting of effect sizes and an a priori power analysis (Johnson et al. 2015).

The absence of an effect, or the existence of only a weak effect, is as important to our understanding of a phenomenon as a strong or statistically significant effect. Publishing a biased subset of results obscures our view of the true effect. Scientists have a responsibility to publish their results to avoid the "file drawer effect," the tendency to publish statistically significant results while leaving null results unpublished, and the resultant publication bias (Møller and Jennions 2001; Rosenberg 2005).
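To illustrate the kind of a priori power analysis we expect, the sketch below uses Python and the statsmodels library. This is an illustration only, not a prescription of particular software, and the effect size and sample size shown are hypothetical.

```python
# A minimal sketch of an a priori power analysis for a two-sample t-test.
# The statsmodels library is an assumption; any equivalent tool will do.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Sample size per group needed to detect a medium effect (Cohen's d = 0.5)
# with alpha = 0.05 and 80% power.
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8,
                                    alternative='two-sided')
print(f"Required sample size per group: {n_per_group:.1f}")

# Conversely, the power actually achieved with the sample available,
# e.g., 15 individuals per group (a hypothetical figure).
achieved_power = analysis.solve_power(effect_size=0.5, nobs1=15, alpha=0.05,
                                      alternative='two-sided')
print(f"Achieved power with n = 15 per group: {achieved_power:.2f}")
```

Reporting both figures makes explicit what effects a study could, and could not, realistically detect.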

Comprehensive Statistical Reporting

It is the policy of the International Journal of Primatology to require comprehensive details of data selection, data manipulation, and all data analyses conducted as part of a study, such that analyses can be reproduced, replicated, and fully understood. Authors should report full outcomes from all statistical analyses in the results, including alternative tests of the same hypothesis and all covariates tested.

It is the policy of the International Journal of Primatology to require numerical or graphical summaries of data; to show the full distribution of the data, rather than summary statistics for small sample sizes; and to report effect sizes (means, slopes of regressions, correlation coefficients, Cohen’s d, odds ratios, etc.) in addition to the statistical significance of analyses.
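As one illustration of such reporting, the following Python sketch computes Cohen's d with a percentile bootstrap confidence interval. The group labels and values are hypothetical, and NumPy is used only as an example; the journal does not require any particular software.

```python
# A minimal sketch of reporting an effect size (Cohen's d) with a
# bootstrap confidence interval, rather than a P value alone.
import numpy as np

rng = np.random.default_rng(1)
group_a = np.array([4.1, 5.3, 4.8, 6.0, 5.5, 4.9, 5.1])  # hypothetical values
group_b = np.array([3.2, 4.0, 3.8, 4.5, 3.6, 4.2])

def cohens_d(x, y):
    """Cohen's d using a pooled standard deviation."""
    nx, ny = len(x), len(y)
    pooled_sd = np.sqrt(((nx - 1) * np.var(x, ddof=1) +
                         (ny - 1) * np.var(y, ddof=1)) / (nx + ny - 2))
    return (np.mean(x) - np.mean(y)) / pooled_sd

d_obs = cohens_d(group_a, group_b)

# Percentile bootstrap CI for d (resampling within each group).
boot = [cohens_d(rng.choice(group_a, len(group_a), replace=True),
                 rng.choice(group_b, len(group_b), replace=True))
        for _ in range(5000)]
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
print(f"Cohen's d = {d_obs:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")
```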

Results reported in scientific articles are often a biased subset of the results of the statistical analysis conducted for a study (Parker et al. 2016). The selected results may be those that are statistically significant in null hypothesis testing, consistent with a favored hypothesis, or surprising. Selective analysis and reporting conceals the number of comparisons made, and therefore the likelihood of a false positive (type I error), a practice that has been termed “P-hacking” (Head et al. 2015; Simonsohn et al. 2014). Linked to this is the practice of first exploring the data, then building an article around the strongest results, leading to hidden “researcher degrees of freedom” (Simmons et al. 2011) and “hypothesizing after the results are known,” or HARKing (Kerr 1998).

Full transparency requires thorough reporting of how data were treated and analyzed (archiving analysis code addresses this) and full reporting of results. The relevant information differs by analysis, but for most analyses, this includes, but is not limited to, basic parameter estimates of central tendency (e.g., means) or other basic estimates (regression coefficients, correlation) and variability (e.g., standard deviation) or associated estimates of uncertainty (e.g., confidence/credible intervals). In the case of model building, model selection, and multimodel inference, authors should report the results of all models. Authors employing model selection should not interpret their results as support for or evidence against a hypothesis, but as generating predictions to be tested in future studies. Where hypotheses were formulated after data analysis, this should be acknowledged. The line between preplanned and post hoc analyses can be blurred in long-term studies, when existing data are analyzed to test new predictions in a planned fashion. We consider this preplanned analysis. However, if that analysis leads to further hypotheses, which are then tested with the same data, this would be post hoc analysis.

Primatology and related fields are often characterized by small sample sizes (Garamszegi 2015; Taborsky 2010) because of limited availability of study subjects, observation conditions, and field logistics, or the conservation status of a study species. This limits our ability to detect patterns and estimate parameters reliably (Garamszegi 2015). Small sample sizes yield low statistical power and increase the rate of false negatives (type II error) in a null hypothesis testing framework. In other words, biologically important effects may not be statistically significant when sample sizes are small. Reporting effect sizes separates the strength of the biological effect from whether the findings are likely to be due to chance. Primatologists should move toward the use of effect sizes rather than just null hypothesis testing, and report parameters with the associated confidence intervals, rather than binary decisions based on null hypothesis testing.

In addition to increasing the risk of false negatives, underpowered studies also increase the risk of false positives (type I error) (Parker et al. 2016), make it impossible to control for confounding variables, and mean that single data points can be highly influential. We encourage authors to conduct randomization and simulation-based analyses to examine the stability of results and the influence of single data points (Garamszegi 2015). We also encourage the use of Bayesian inference in addition to, and as an alternative to, traditional hypothesis testing (Congdon 2016).
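The following sketch, in Python with NumPy and SciPy, illustrates two such checks: a leave-one-out assessment of the influence of single data points and a simple permutation test. The variables and values are hypothetical.

```python
# A minimal sketch of (i) leave-one-out sensitivity analysis and
# (ii) a permutation test for a correlation. Data are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
rank = np.array([1, 2, 3, 4, 5, 6, 7, 8])                      # e.g., dominance rank
cort = np.array([12.1, 10.5, 11.0, 9.2, 8.8, 9.5, 7.9, 8.1])   # e.g., cortisol measure

r_obs, _ = stats.pearsonr(rank, cort)

# Leave-one-out: how much does the correlation change if one subject is dropped?
loo = [stats.pearsonr(np.delete(rank, i), np.delete(cort, i))[0]
       for i in range(len(rank))]
print(f"r = {r_obs:.2f}; leave-one-out range: {min(loo):.2f} to {max(loo):.2f}")

# Permutation test: distribution of r under random re-pairing of the data.
perm = [stats.pearsonr(rank, rng.permutation(cort))[0] for _ in range(5000)]
p_perm = np.mean(np.abs(perm) >= abs(r_obs))
print(f"Permutation P = {p_perm:.3f}")
```

A wide leave-one-out range signals that conclusions rest on one or two influential observations, which should be reported.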

Many data sets in primatology are pseudo-replicated; for example, they may include multiple observations of the same individual, violating the assumption of independence that underlies many statistical approaches. Traditionally, researchers addressed this by calculating an average for each individual, discarding all intraindividual variation and losing a great deal of information about individual plasticity. However, various approaches allow researchers to use the entire data set. These include randomization procedures, spatial or temporal autocorrelation models, and mixed models that allow parallel investigation of intra- and interindividual levels of variation and hierarchical levels of organization (Janson 2012). The latter, in particular, have revolutionized the analysis of animal behavior. Nevertheless, such models come with assumptions, and authors must explore their data, ensure that these assumptions are not violated, and include this information in their description of methods.
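As a minimal sketch of the mixed-model approach, assuming long-format data with one row per observation and many rows per individual, a random-intercept model can be fitted in Python with statsmodels. The file name, column names, and predictors below are hypothetical illustrations, not a recommended model.

```python
# A minimal sketch of a mixed model with a random intercept for individual,
# so that repeated observations of the same animal can be retained rather
# than averaged. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Long-format data: one row per observation, many rows per individual.
data = pd.read_csv("grooming_observations.csv")

model = smf.mixedlm("grooming_rate ~ rank + group_size",    # fixed effects
                    data,
                    groups=data["individual_id"])            # random intercept
result = model.fit()
print(result.summary())   # report coefficients, SEs, and variance components
```

Whatever software is used, authors should report the full model structure, the checks of its assumptions, and all variance components, not only the fixed effects of interest.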

Methods Validation

It is the policy of the International Journal of Primatology to require authors to report validation for field and laboratory methods, or to indicate the location of such published information.

Noninvasive methods, in particular, require rigorous validation to ensure that a proxy variable predicts the target variable reliably (i.e., that the two are strongly correlated) and that measures are repeatable (i.e., that measurement error is acceptable). Measurement error itself does not introduce bias, but can lead to type II statistical error (failing to reject a null hypothesis) and to inaccurate parameter estimates (see Garamszegi 2015 for further details).
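As an illustration of such a validation report, authors might present the correlation between proxy and target in paired validation samples, together with the repeatability of duplicate measurements. The sketch below uses Python; all values are hypothetical.

```python
# A minimal sketch of a validation check for a noninvasive proxy measure:
# (i) how strongly the proxy predicts the target variable in paired
# validation samples, and (ii) the intra-assay coefficient of variation
# of duplicate measurements. All values are hypothetical.
import numpy as np
from scipy import stats

# Paired validation samples: target (e.g., serum hormone) vs. proxy (e.g., fecal metabolite).
target = np.array([10.2, 14.5, 8.3, 12.1, 15.0, 9.7])
proxy = np.array([3.1, 4.4, 2.6, 3.8, 4.7, 2.9])
r, p = stats.pearsonr(target, proxy)
print(f"Validation: r = {r:.2f}, P = {p:.3f}")

# Duplicate measurements of the same samples: intra-assay CV.
dup1 = np.array([3.0, 4.5, 2.5, 3.9, 4.6, 3.0])
dup2 = np.array([3.2, 4.3, 2.7, 3.7, 4.8, 2.8])
pair_sd = np.abs(dup1 - dup2) / np.sqrt(2)      # SD of each duplicate pair
pair_mean = (dup1 + dup2) / 2
cv = np.mean(pair_sd / pair_mean) * 100
print(f"Mean intra-assay CV: {cv:.1f}%")
```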

Updates to the Instructions for Authors

To further promote transparency and reproducibility, and to ensure that authors comply with journal policy, including the policies reported in this editorial, we have added a checklist to the Instructions for Authors (Table I), and ask authors to confirm that they comply with the checklist when submitting a manuscript. The checklist is adapted from the Tools for Transparency in Ecology and Evolution (https://osf.io/g65cb/wiki/home/). The Tools facilitate the promotion of transparency by academic journals in ecology, evolutionary biology, and other fields. The questions rest primarily within the Transparency and Openness Promotion (TOP) framework (https://cos.io/top/), designed for use across empirical disciplines.

Table I Author checklist for transparency in empirical studies

Many of the questions in Table I are standard components of a manuscript, but some require information that does not always appear in a manuscript (e.g., justifications for the sample size, selection of subjects, and study duration in a field study). We have also added this checklist to the reviewer guidelines and we (the editors) will check these points to assess author compliance.

Badges to Acknowledge Open Science

To further encourage transparency, the International Journal of Primatology provides an incentive for researchers to share the data and materials underlying their articles by acknowledging open practices with Open Science Framework badges in publications (https://osf.io/tvyxz/wiki/home/). Badges indicate that the paper conforms to specific transparency standards and are displayed on the first page of a paper. Badges are effective in promoting data archiving (Kidwell et al. 2016). For complete details, consult https://osf.io/tvyxz/wiki/1.%20View%20the%20Badges/.

The International Journal of Primatology awards badges for

  1. Open Data. The Open Data badge is earned for making the digitally shareable data necessary to reproduce the reported results publicly available.

  2. Open Materials. The Open Materials badge is earned by making publicly available the components of the research methodology needed to reproduce the reported procedure and analysis.

Authors may apply for one or both badges (Fig. 1) when submitting the final version of their manuscript. For each badge selected, authors complete the disclosure items; these are checked by the editor, but accountability remains with the author. If authors cannot meet badge criteria, they may provide text to appear in the manuscript such as "We will grant all reasonable data requests from qualified researchers."

Fig. 1

Badges awarded for Open Data and Open Materials in the International Journal of Primatology. Badges indicate that the paper conforms to specific transparency standards and are displayed on the first page of a paper. For more details, see Open Science Framework badges in publications (https://osf.io/tvyxz/wiki/home/).

If the application for a badge or badges is accepted, the disclosures and badges are printed in the journal article.

Other Changes to the Journal

Submission Categories

In addition to original research articles, which can include short articles, the International Journal of Primatology considers review articles and book reviews for publication. To these, we have added the category of News & Views pieces. News & Views pieces are either short communications reporting brief new observations or results, or critical commentaries on recently published papers in the International Journal of Primatology or other journals. They are limited to 1000 words and 5 references, with a maximum of one figure or table and no abstract. Short communications should have important implications for our understanding of primates and have theoretical significance beyond the species involved.

We continue to welcome proposals for Special Issues or Special Sections on a particular theme. A Special Issue is one whole issue of the journal and should include approximately 12–14 articles. A Special Section is a smaller collection of articles. Articles in a Special Issue or Section can include original research articles, reviews, commentaries, and guest editorials. The Editor-in-Chief provides full support to Guest Editors, and advice as needed. To propose a Special Issue, please send the following information to the Editor-in-Chief, Jo Setchell (joanna.setchell@durham.ac.uk):

  1. A proposed title

  2. Proposed Guest Editors

  3. A 250-word abstract that explains why the topic is important

  4. A list of the intended contributions

  5. An estimated timeline for submissions

Reviewers

Conscientious peer review is a time-consuming task, but is essential to ensure the quality of scientific research. The International Journal of Primatology is very grateful to reviewers for the time and effort they invest in the review process. We have initiated a new process to offer our thanks more explicitly. We will publish a list of reviewer names with our thanks in the last issue of the journal published before each IPS congress (a 2-year cycle). We will also announce in that issue, and at the following IPS congress, an award for the best reviewer, which will be presented at the congress. This award will be determined on the basis of confidential editor (Editor-in-Chief, Associate Editor, and Guest Editor) scores of the usefulness of all provided reviews.

Associate Editors and the Editorial Board

We are in the process of adding a new Associate Editor, as well as revising the Editorial Board. The Editorial Board serves to broaden the scope and range of expertise beyond that of the editors. It also reflects the international nature of the journal. Members of the Editorial Board are listed on the journal homepage and in the printed issues. The members act as ambassadors for the journal, support and promote the journal, seek out the best work and actively encourage submissions, and review submissions on a more regular basis than other reviewers.

Authorship

It is the policy of the International Journal of Primatology to encourage authors to give details of their contributions in the acknowledgments. The Instructions for Authors refer to the Committee on Publication Ethics (COPE: http://publicationethics.org/) guidelines on authorship.

Conclusion

The policies we outline here are intended to facilitate the scientific process by promoting transparency, openness, and reproducibility in primatology. In implementing these policies, we join editors of other journals in promoting a more open research culture (Nosek et al. 2015). The International Journal of Primatology is the official journal of the International Primatological Society, and we encourage submissions from all areas of primatology, including, but not limited to, anthropology, anatomy, ethology, paleontology, psychology, sociology, and zoology. If you are interested in submitting your work and have questions, please send an e-mail inquiry to any of the editors.