Commentary

At least 50% of published studies, even those in the most prestigious journals, cannot be reproduced [1–8]. Designing, writing, reviewing, publishing, and citing data obtained under misleading experimental conditions is clearly an expensive and unproductive process for all the actors involved in the scientific system. Moreover, the accumulated errors are amplified by each secondary publication based on non-reproducible data.

Minimal Information (MI) checklists have been proposed to standardize experimental descriptions, but a general drawback of these platforms is that they were primarily conceived to simplify the bioinformatic (re)use of experimental data. This effort is meaningful because metadata analysis of standardized datasets represents a valuable source of information and maximizes the use of already existing results [9–11]. Nevertheless, annotation following mandatory guidelines is often cumbersome, and conflicting MI checklists have been proposed despite the simplification efforts made by the community [12–14]. Moreover, only a few MI checklists focus on methodologies for recombinant protein production and quality evaluation [15].

Protein production at lab scale is a straightforward procedure. Nevertheless, each step involves making choices, providing controls, and dealing with both the evident and the unappreciated pitfalls of the technology, such as changes in protein expression, physical and chemical alterations of protein structure, aggregation, and proteolysis. Since protein production is very often not the aim of a research project but simply the means of obtaining the intermediate reagents needed to start it, poor protein quality will undermine the robustness of complex multidisciplinary efforts. At the same time, general (cell) biologists are less aware of protein quality issues than specialists such as crystallographers, enzymologists, protein chemists, and biotechnologists. Therefore, we wish to propose to researchers who are not “protein production specialists” a methodology for improving the qualitative evaluation of their proteins.

Based on many years of experience in the protein production field, we would like to propose a practice that should simplify the assessment of the experimental setup: a flowchart for the initial evaluation of the experimental steps in protein production, together with the corresponding data, to be appended as Additional file 1 according to the guidelines of established initiatives such as Biosharing/MIBBI Foundry.

The protein production flowchart (Additional file 1) should help to follow the design of the protein production protocol, outline its critical points, and make the results easier to standardize and reproduce in other laboratories [16, 17]. We suggest that editors and reviewers encourage (not compel) researchers to fill in as many of the listed requests as possible (following the already available standards) in order to provide the information necessary for a reliable evaluation of the proposed work. Clearly, the set of relevant data will change according to the final use of the protein; therefore, there is no reason for mandatory universal guidelines (Table 1).

Table 1 The most basic requirements for evaluating protein quality

Some editors might even consider attaching a dedicated comment section to the electronic version of the paper, allowing peers to grade the quality of the described protein production procedure, much as users do on sites such as TripAdvisor.