Abstract
Open Science is a phenomenon that can be traced back to the Middle Ages. At the end of the twentieth century and the beginning of the twenty-first century, Open Science has grown strongly due to the worldwide internet and related new technologies, tools and communication channels. Two core objectives (reliability and trust) and three main characteristics (transparency, openness and reproducibility) of Open Science can be identified, but it is still too early for a broad definition of this growing movement. Its growth is happening in many disciplines and in diverse facets. This article presents an overview of how Open Science is introduced and established in all three science dimensions of research design, processes and publications. Looking to the future, the benefits that Open Science offers are analysed, as well as the challenges it faces. It can be concluded that it is desirable that all researchers collaborate in Open Science. Open Science can improve the different science disciplines, research practices and science in general. In that way, Open Science can contribute to overcoming the post-truth age by increasing the objective and subjective credibility of science and research. And in the long-term perspective, Open Science can improve research and education as a whole, as well as our society.
2.1 Introduction
Open Science is claimed to be a radical social innovation (David, 2004a) and a disruptive movement for Higher Education (Vicente-Saez & Martinez-Fuentes, 2018). However, what is Open Science and how can it be used to radically change and improve research, education and our society? This overview article provides an introduction to the history of Open Science, followed by an analysis of the current state of the art of Open Science and its characteristics.
2.2 History of Open Science
Open Science is a combination of objective and subjective goals to improve research and science in general. David underlines that Open Science is “a fragile cultural legacy of Western Europe’s history” (David, 2004a, 571) when he traces the history of Open Science back to its appearance during the late sixteenth and early seventeenth centuries and the court patronage system of late Renaissance Europe (David, 1998, 2004a, 2007). In the seventeenth century, the shift towards Open Science became visible in public life and the first science periodicals were published (Kronick, 1976). The first science association, the Royal Society of London for Improving Natural Knowledge, was founded in 1660 with a strong focus on openness, indicating this shift, too (Willinsky, 2005). The basis for this appearance of Open Science was a change in the opportunities, needs and demands created by western European feudalism and its fragmented and competing noble patrons (David, 1998, 2004a, 2007). The increasing importance of mathematics for many disciplines and the spread of printing were main factors in the Renaissance and the scientific revolution. Scientists were no longer interested in keeping the secrets of nature within small circles (as alchemy continued to do) but in publishing their scientific news: According to the analysis of David (1998, 2004a, 2007), the “common agency contracting in substitutes” (David 2004a, 582) by the noble patrons as the political authorities led to competition among scientists. Innovations in technologies remained hidden to preserve economic or military advantages, so that only new scientific results could be published and presented. Noble patrons engaged scientists to gain reputation, and scientists were interested in presenting their results in public to achieve better contracts, as most of them could not live from a single contract. This also constituted the major advance that fragmented Western Europe gained in comparison with larger monolithic political systems such as the Celestial Empire of China, which faced similar conditions but failed to introduce this successful concept of Open Science (David, 1998, 2004a, 2007).
Science in general was evolving and growing, also in the number of disciplines, leading to specialized research, scientific communities and theories in the twentieth century. How changes in science and scientific theories take place is under controversial debate with three main representatives: Popper (1959) believes in the progressive and cumulative increase of scientific knowledge, Kuhn (1962) postulates potential paradigm shifts in addition, whereas Feyerabend (1978) rejects both the progressive and cumulative increase of scientific knowledge and the existence of universal methodological rules. In consequence, many science studies were undertaken under different labels and schools of thought to analyse the past and future of science. That includes the attempt to create a new discipline called the sociology of science (David, 1998, 2004a, 2007). It is often based on the sociology-of-science norms from “the Republic of Science” by Merton (1973, 1996), which can be summarized by four core aspects: communism, universalism, disinterestedness and organized scepticism (abbreviated as CUDO). Other scientists have revised them: Macfarlane and Cheng (2008) added originality and shortened organized scepticism to scepticism, leading to the abbreviation CUDOS, whereas Ziman (1994, 2000) additionally changed communism to communalism. In general, the ambiguity between science as a subject of research and science as a methodology used in research is causing confusion, at least in English, leading to a variety of competing classification systems (e.g., science vs. technology studies or formal vs. natural vs. social sciences).
In the modern science of the twentieth century, Open Science is closely connected with the emergence of open source and open access according to Willinsky (2005). In the 1960s and 1970s, freedom dominated the academic world and science, and any software developed by scientists and engineers in academic as well as corporate laboratories was freely shared, modified and re-used (von Hippel, 2005; Markoff, 2005). But only a few years later, the commercialization of Higher Education began, with a large impact on, among other things, the publication of and access to research results (Bok, 2003).
The open source movement started when Stallman resigned from his position at MIT due to its decision to license all newly developed computer code, leading to restrictions (Stallman, 2005). He founded the Free Software Foundation and created the GNU General Public License (GPL), which quickly became, and still is, broadly used to release software as a free product whose code can be re-used by others. The confusion about what free software means (as the GNU GPL still allows charging for the distribution or support of free software) led to the definition of the new term open source, coined by Peterson in the year 1998 (according to the Oxford English Dictionary, cited by Willinsky, 2005).
Alongside the commercialization of Higher Education (Czarnitzki, Grimpe & Pellens, 2015; Shibayama, 2015), new general copyright rules and laws were developed and approved inside and outside of the academic world and science (David, 2004b). They are mainly intended to protect intellectual property rights (IPR) and, in particular, the economic interests of businesses and corporations. A major impact of these laws is the complete reversal of the default: Before their approval, everybody could share and re-use any publication without a copyright statement. After their approval, everybody can share and re-use only publications with an explicit open license that allows sharing and re-usage. That led to confusion among researchers, as well as educators and all citizens, and several manifestos were initiated and published in response (Budapest Open Access Initiative, 2002; Berlin Declaration, 2003; Bethesda Statement, 2003). Furthermore, Creative Commons was established as an organization to develop global open licenses for different purposes. Currently, six licenses are defined based upon four conditions: 1. Attribution (“by”), 2. ShareAlike (“sa”), 3. NonCommercial (“nc”) and 4. NoDerivatives (“nd”). The four conditions yield exactly six licenses because Attribution is part of every license and ShareAlike and NoDerivatives are mutually exclusive, as sketched below.
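To illustrate this combinatorial logic, the following minimal Python sketch (an illustration added here, not part of the Creative Commons specification) enumerates the six standard licenses from the four conditions:

```python
# Minimal sketch: derive the six Creative Commons licenses from the four
# conditions. Attribution ("by") is part of every license; ShareAlike ("sa")
# and NoDerivatives ("nd") are mutually exclusive, since a work that may not
# be adapted cannot impose a license on adaptations.
from itertools import product

licenses = []
for non_commercial, modification in product([False, True], ["", "sa", "nd"]):
    parts = ["by"]
    if non_commercial:
        parts.append("nc")
    if modification:
        parts.append(modification)
    licenses.append("CC " + "-".join(parts).upper())

print(licenses)
# ['CC BY', 'CC BY-SA', 'CC BY-ND', 'CC BY-NC', 'CC BY-NC-SA', 'CC BY-NC-ND']
```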
A narrower understanding of Open Science locates its origin in the emergence of the term “Science 2.0” during the first years of this century, more precisely in the late 2000s (Mirowski, 2018). In this perspective, Open Science is a re-branding of Science 2.0 by The New York Times (Lin, 2012) and the British Royal Society (2012) in the year 2012. As a consequence, many popular publications, white papers and policy documents appeared, as well as several institutions and initiatives promoting Open Science, mainly in North America, including a television series (“The Crowd and the Cloud”, broadcast by the channel PBS in the year 2017 and funded by the American National Science Foundation) dedicated to the “Open Science Prize” (Mirowski, 2018).
2.3 Current State of the Art of Open Science
This section provides an overview of the current state of the art of Open Science. The Digital Age fosters new ways of communication and knowledge sharing that are changing social processes and societies, including science disciplines and institutions (Peters & Roberts, 2012; Stracke, 2018a, b, 2017a, b, c).
Open Science is considered a paradigm change that challenges traditional research to improve accuracy, trust and transparency through openness standards facilitating replications (Makel & Plucker, 2017). It leads to a change of behaviours in publications, as well as in research itself, which Vazire (2018) considers a credibility revolution.
2.3.1 Definition of Open Science
Open Science is a broad field with many divergent perspectives from different stakeholders, and thus several definitions of Open Science exist (Vicente-Saez & Martinez-Fuentes, 2018). Several stakeholder groups are not aware of this situation, and the lack of a common understanding and of a formal definition has been identified (Arabito & Pitrelli, 2015; European Commission, 2015; Kraker, Leony, Reinhardt & Beham, 2011; OECD, 2015). Many movements of Open Science appeared in the last decades and can be differentiated in several ways (Borgman, 2007). Fecher and Friesike (2014) tried to distinguish five schools of thought (democratic, pragmatic, infrastructure, public and measurement), but these schools overlap and cannot be differentiated clearly. Thus, I agree with the summary by Fecher and Friesike: “The assumed coherence in regard to Open Science still lacks empirical research” (Fecher & Friesike, 2014, 36) and add that the classification of Open Science requires deeper analysis, too.
Based on their literature review, Vicente-Saez and Martinez-Fuentes (2018) have clustered the collected studies into four categories to characterize Open Science: transparent, accessible, shared and collaboratively developed. The integration of the four characteristics leads Vicente-Saez and Martinez-Fuentes to the definition: “Open Science is transparent and accessible knowledge that is shared and developed through collaborative networks” (Vicente-Saez & Martinez-Fuentes, 2018, 434). However, it remains questionable whether the sole focus on knowledge covers the broad range of perspectives. Therefore, it may be more promising to address the multiple objectives of Open Science and to cover them through a broad definition enabling a common and clear understanding shared by all stakeholder groups. I believe that it is still too early for such a global definition of Open Science in this “turbulent yet exciting time of transition” (Arabito & Pitrelli, 2015, 2), and it may even be impossible given the diverse perspectives (as is the case with the term Open Education; see Stracke (2019) and below).
In addition, the English language in particular, with its terms and common understandings, causes problems for Open Science and its definition based on knowledge. First, the term knowledge has specific connotations and at least two understandings in the English language, as pointed out by Fecher and Friesike (2014): There is a distinction “between knowledge creation that is concerned with the rules of the natural world (science) and knowledge creation that is concerned with the human condition (humanities)” (Fecher & Friesike, 2014, 4). Second, the term science does not cover all scientific subjects and disciplines: It was contrasted with the humanities for a long time until Kagan coined the new term three cultures, adding the social sciences next to the sciences and humanities (Kagan, 2009; Sidler, 2014). This new distinction into three scientific sections makes it even more challenging to use the term Open Science. Furthermore, I have already discussed above the problems the term science causes for the classification of disciplines. Thus, I will continue using Open Science as an umbrella term embracing and referring to all scientific subjects and disciplines, as well as to different objectives and purposes that can be objective (such as better formal reliability) and subjective (such as more trust in research). A broad, and therefore vague, working definition could be (and will be used in the following): Open Science is a combination of objective and subjective goals and means to improve science in its diverse subjects and disciplines and as a whole.
2.3.2 Objectives and Characteristics of Open Science
There are two major objectives of Open Science: first, higher reliability of research findings and second, greater trust in scientific research, both with the intention of overcoming the fake news of the post-truth age (Cook, Lloyd, Mellor, Nosek & Therrien, 2018; Higgins, 2016). These two objectives are complementary and directly interrelated: Reliability provides objective credibility due to formal correctness and reproducibility, and trust leads to subjective credibility due to individual confidence building (Fig. 2.1).
Consequently, two main characteristics of Open Science are transparency and openness, which have to cover the whole research process from the design to the critical review after publication (Miguel et al., 2014; Nosek & Bar-Anan, 2012; Nosek et al., 2015). Transparency is a core value of science for credibility and is required to maximize insights, peer evaluation and evidence (Cook et al., 2018). Openness includes the sharing of research materials and data to facilitate their better understanding, verification, improvement and re-usage (Miguel et al., 2014; Molloy, 2011).
A third main characteristic of Open Science is reproducibility: Reproducibility indicates how robust and repeatable research findings are and includes their potential replicability (Goodman, Fanelli & Ioannidis, 2016; Nosek & Errington, 2017). Next to the research methods, reproducibility also addresses the influence of decisions during the data analysis (e.g., on outliers and covariates) on the consistency of findings by different researchers (LeBel, McCarthy, Earp, Elson & Vanpaemel, 2018; Silberzahn et al., 2017). However, I agree with Goodman et al. (2016) that it is still unclear how reproducibility relates and leads to the development of cumulative evidence and commonly accepted truth in the research community. Overall, the research evidence from Open Science is considered more reliable and valid due to the increase of transparency, openness and reproducibility (Cook et al., 2018; Nosek et al., 2015).
Another core argument in favour of Open Science is that it enhances the trustworthiness of research findings: Trustworthiness means the extent to which research methods and data and the resulting findings can be called reliable and valid representations of reality (Carnine, 1997; Odom et al., 2005). In this perspective, trustworthiness contributes directly to the first objective of Open Science (i.e., higher reliability and validity). And in the long-term view, it should support the second objective of Open Science (i.e., greater trust and confidence in research) by convincing researchers as well as citizens and society as a whole. This way, Open Science could play an important role in overcoming fake news and building a societal consensus and knowledge community.
2.3.3 Open Science in Scientific Research and Dimensions
Open Science is currently growing strongly, and the term is used to describe many different concepts, means and practices across the whole of science. Next to the commercialization of (higher) education, more problems have appeared in scientific practices and publications during the last decades (Chambers, Feredoes, Muthukumaraswamy & Etchells, 2014; Cook et al., 2018). There are also general concerns about whether science is self-correcting and about the uneven progress of research (Shavelson & Towne, 2002).
Interests of researchers running contrary to Open Science are secrecy, particularism, self-interestedness and organized dogmatism (Anderson, Ronning, DeVries & Martinson, 2007). They were first discovered by Mitroff (1974) through interviews with elite scientists from the Apollo lunar missions who conducted research in direct contradiction to Merton’s norms. The connected pressures of publication and funding acquisition are demanding for researchers and under broad discussion (Casadevall & Fang, 2012; Giner-Sorolla, 2012; Gunsalus & Robinson, 2018; Nosek, Spies & Motyl, 2012).
In addition, it has been shown that researchers have great freedom to manipulate research analyses and findings to achieve the most attractive and interesting results for easy publication and best recognition (Simmons, Nelson & Simonsohn, 2011; Wicherts et al., 2016). Normally, researchers do not falsify data, as that would constitute scientific misconduct, but several manipulations can easily be conducted and are reported as common research practices, such as data fishing and p-hacking (see https://projects.fivethirtyeight.com/phacking for an interactive demonstration), hypothesizing after the results are known (called HARKing), and selectively reporting analyses and publishing studies with positive results, labelled as reporting and publication bias (John, Loewenstein & Prelec, 2012; Simmons et al., 2011; Cook et al., 2018).
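To make p-hacking concrete, the following minimal Python simulation (a sketch added here for illustration, not taken from the cited studies) shows how measuring several outcomes under a true null effect and reporting whichever turns out significant inflates the false-positive rate far above the nominal 5%:

```python
# Minimal p-hacking simulation: two groups with NO true difference are
# compared on five outcome measures; the "researcher" reports a finding
# whenever ANY outcome reaches p < .05.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_experiments, n_outcomes, n_per_group = 10_000, 5, 30

false_positives = 0
for _ in range(n_experiments):
    group_a = rng.normal(size=(n_outcomes, n_per_group))
    group_b = rng.normal(size=(n_outcomes, n_per_group))
    p_values = [stats.ttest_ind(a, b).pvalue for a, b in zip(group_a, group_b)]
    if min(p_values) < 0.05:  # fish for any significant outcome
        false_positives += 1

# With five independent outcomes, roughly 1 - 0.95**5 (about 23%) of null
# experiments yield a "significant" result instead of the nominal 5%.
print(f"False-positive rate: {false_positives / n_experiments:.2f}")
```

This is why preregistration of outcomes and analysis plans, discussed below, is such a central Open Science practice.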
Replication studies are not often conducted and, when they are, often fail to validate the original findings (Camerer et al., 2016; Ebersole et al., 2016; Klein et al., 2014; Open Science Collaboration, 2015). A first major replication study tried to repeat 100 studies in psychology with 97 significant findings and could validate only 36 of them (Open Science Collaboration, 2015). That does not mean that the conclusions of the other studies were false positives, but that reproducibility is more difficult than normally estimated and may lead to more cautious statements on research results (Randall & Welser, 2018).
The general idea behind Open Science, including its concept of replication, is the common sharing, analysis, peer-reviewing and evaluation of research and its results. Therefore, Open Access to the research, its design, its data, its results and its publications is very important for Open Science. There are many types of Open Access (OA) that can be differentiated according to their availability and costs (Piwowar et al., 2018): They range from Libre OA (reading and re-usage of articles), Gratis OA (only reading of articles), Gold OA (journals with direct OA), Green OA (journals with permission for self-archiving), Hybrid OA (OA after paying an article processing charge) and Delayed OA (OA after an embargo time) to Academic Social Networks (online communities) and Black OA (illegal pirate sites). To give researchers (as well as any other interested parties, such as educators and learning providers) a better overview of what they can do with OA publications, licenses such as Creative Commons (see above) were developed for different purposes.
Furthermore, open badges can support the introduction of Open Science and Open Access, as reported by Kidwell et al. (2016): Psychological Science was the first journal using open badges to mark articles following principles of Open Science, and the share of articles with open data increased from 3% (in the two years before adopting open badges) up to 39% (1.5 years after adopting open badges). However, it remains questionable whether this increase was caused by the badges or by the general increase of open access publications.
More directly and evidently, Open Science and Open Access can benefit from public authorities and policy developers. Taxpayers, or rather the politicians acting on their behalf, and funding donors, such as the European Commission and national Ministries of Education like the Dutch one, are increasingly demanding Open Access to the research results they support and fund. In that way, research councils can play an important role in the establishment of future policies and practices of Open Science (Lasthiotakis, Kretz & Sá, 2015).
In addition, Open Science focuses on collaborative research in different approaches: First examples of collaborative research in Open Science are the Human Genome Project, which was open to expert organizations; the Polymath Project, which asked for contributions from experts and senior researchers; and Galaxy Zoo, which all citizens could join (Fecher & Friesike, 2014). Furthermore, the technological progress in distributed computing led to Open Science examples such as the Open Science Grid (Fecher & Friesike, 2014).
Open Science is already discussed and introduced in many different disciplines, as identified and highlighted by van der Zee and Reich (2018): There are first examples of disciplines in the social sciences, such as criminology (Pridemore, Makel & Plucker, 2017) and sex research (Sakaluk & Graham, 2018), whereas the discussion is just starting in the humanities, see, e.g., open science education (van der Zee & Reich, 2018; Stracke, 2019), while Open Science is practiced in many disciplines of the (natural and formal) sciences, such as animal welfare (Wicherts, 2017), biomedicine (Page et al., 2018), climate research (Muster, 2018), energy efficiency (Huebner et al., 2017), hardware development (Dosemagen, Liboiron & Molloy, 2017), high-energy physics (Hecker, 2017), information science (Sandy et al., 2017), mass spectrometry (Schymanski & Williams, 2017), neuroscience (Poupon, Seyller & Rouleau, 2017) and robotics (Mondada, 2017).
Finally, Open Science covers all dimensions and processes of scientific research, from the first idea development to the final revision and improvement of published results. To ensure such a broad focus and to constantly increase scientific knowledge, Open Science can and should adapt and follow the philosophy of Total Quality Management (TQM) and its continuous improvement cycle (Juran, 1951, 1992; Deming, 1982, 1986; Stracke, 2011). The TQM philosophy distinguishes between the potential, the processes and the results as the three dimensions to be continuously evaluated and improved in iterative cycles (Stracke, 2006, 2014).
These three dimensions of TQM were transferred first to the health care sector by Donabedian (1980) and afterward to the education sector by Stracke (2006, 2015). They can also be adapted to science by differentiating the three dimensions of the research design, the research processes and the research publications as shown in Fig. 2.2. In the following, a short overview of the current practices of Open Science will be provided to present its variety and main focus today.
2.3.4 Openness in Scientific Design, Research and Publications
Open Science combines and promotes different concepts, means and practices for all three science dimensions, sometimes introducing radical solutions. In the following, only a few examples can be highlighted that are fundamentally changing the way science design, research and publications are realized.
First, Open Science promotes Open Data to share data for potential re-usage: The practices of data sharing differ across research communities and sometimes even within research communities (Borgman, 2012). Data should be open rather than merely free (free as in “free beer”), as open data allows independent re-usage (Murray-Rust, 2008). It has to be noted that the importance of and need for open data depend on the discipline (Fecher & Friesike, 2014). Thus, the question arises whether open data should be stored and made accessible in general or in domain-specific repositories (Cook et al., 2018). That leads to discussions on data sharing that can only be touched upon here (the tip of the iceberg):
The journal Advances in Methods and Practices in Psychological Science presents in its inaugural issue guiding principles for data sharing, primers, as well as a discussion about sharing data that might contain sensitive information (Gilmore, Kennedy & Adolph, 2018; Levenstein & Lyle, 2018; Meyer, 2018). Guidelines (e.g., Inter-University Consortium for Political and Social Research, 2012) are available to assist researchers in formatting a variety of data types, including qualitative and quantitative data, for sharing. There is also an open and lively debate about whether researchers or publishers are responsible (or guilty) for (not) sharing data (Murray-Rust, 2008; Molloy, 2011; Vision, 2010; Boulton, Rawlins, Vallance & Walport, 2011; Fecher & Friesike, 2014). For the sharing of data, Wilkinson et al. (2016) recommend that researchers follow their proposed “FAIR” principles to make data findable, accessible, interoperable and reusable.
Second, Open Science also requests the public announcement and discussion of future research and its results, which includes diverse options such as preregistration of research plans and questions, preprints of interim or final drafts of research results and publications, and registered reports as a combination of both the research design and the discussion of the results. Registered reports split the traditional peer review of articles, undertaken for publication, into two stages: the peer review of the research design before conducting the research and the peer review after the data analysis (Chambers et al., 2014). The Lancet was the first journal to introduce a prototype of registered reports, called “protocol review”, in the year 1997 (Horton, 1997), which was discontinued after a revision in the year 2015 (Chambers, 2019): The editors noted the greater importance of open access to research protocols and encouraged authors to publish them on their own institutional websites for general openness (The Lancet, 2015). Registered reports were first introduced in 2013 by the journal Cortex and in parallel with a related format at Perspectives on Psychological Science (Nosek & Lakens, 2014; Chambers, 2019). The number of journals offering registered reports increased quickly, up to 108 in June 2018 and 207 in October 2019 (see the current list of journals at: https://cos.io/rr). Open questions and concerns about registered reports are answered by Chambers et al. (2014).
2.4 The Future for Open Science
Open Science can be traced back to the Middle Ages and is currently growing and entering the stage in many disciplines, as introduced above. That is happening, in particular, due to the opportunities that the worldwide internet and new technologies, tools and communication channels are offering. In the current second decade of the twenty-first century, science is facing the beginning of a post-truth age (Higgins, 2016): This development causes a lot of issues and concerns, such as fake news and a general denial of the sciences and scientific facts. In the following, an overview is presented of how Open Science can evolve in the future and can contribute to overcoming the post-truth age.
2.4.1 Benefits of Open Science
The future of Open Science depends on the benefits it facilitates and on the problems and needs of researchers that Open Science can solve. Therefore, the following overview of the benefits of Open Science focuses on its support for individual researchers, as well as for science as a whole.
McKiernan et al. (2016) postulate that Open Science supports researchers to succeed by offering several benefits, such as higher citations and recognition, which is seconded by several authors (Dorch, 2012; Henneken & Accomazzi, 2011; Piwowar, Day & Fridsma, 2007; Piwowar & Vision, 2013). Three general benefits of Open Science are reported by Allen and Mehler (2019): first, greater faith in research; second, new helpful science systems; and third, an investment in researchers’ own future. Nosek et al. (2015) propose eight standards with three levels for promoting Open Science and its openness in research and science: two standards for rewarding authors, four standards for the scientific process and its reproducibility and two standards for the values of preregistration.
In particular, Open Science and registered reports (RRs) offer specific benefits (Cook et al., 2018). Registered reports enable more publications of null findings: Allen and Mehler (2019) have compared registered reports against traditional studies and found that they publish more null findings. Research studies with null findings are important for scientific progress and avoid duplications of studies, but they are not often reported and published, or are not even submitted, due to reporting and publication bias, in particular in the social sciences (Fanelli, 2010; Sterling, 1959; Cook & Therrien, 2017; Therrien & Cook, 2018; Franco, Malhotra & Simonovits, 2014; Greenwald, 1975). Furthermore, journals often do not accept null findings, leading to research evidence and a literature base that exaggerate false-positive findings or positive effects (Ferguson & Heene, 2012; Ioannidis, 2012; Munafo et al., 2017).
In addition, Open Access is a necessity and an instrument to overcome the inequities among researchers and institutions in financial position, among countries with different levels of development and in general among all citizens of our global society. Several authors have analysed these inequities from different perspectives (Fecher & Friesike, 2014): Phelps, Fox and Marincola (2012) highlight the role of Open Access for the development of individual researchers, as well as of society, through the broadest possible dissemination. Rufai, Gul and Shah (2012), in their study on library and information science, recommend that low-income countries adopt Open Access. Cribb and Sari (2010) have stated the difference between the creation and sharing of research results: They conclude that while scientific knowledge doubles every five years, access to it remains limited in most cases. I agree with them that Open Access to scientific knowledge is a human right: It has to be tackled by researchers, research institutions, funding bodies, public authorities and the whole society.
2.4.2 Challenges for Open Science
Three general challenges when practicing Open Science are identified by Allen and Mehler (2019): first, restrictions on flexibility; second, the cost of the (additional) time required for Open Science; and third, the lack of an incentive structure. Furthermore, there are many more challenges for Open Science at different levels, from which only two examples are selected to highlight the questions to be addressed and answered by Open Science.
First, and at the general level, the opportunity for replication, as encouraged and requested by Open Science, can also impose constraints on generality (Simons, Shoda & Lindsay, 2017). That is a substantial and generic question that is not easy to answer. The situation in (social) science has slowly improved since the rhetorical questioning of replications by Schmidt (2009). On the other hand, Simmons, Nelson and Simonsohn demonstrated the huge flexibility researchers have when analysing data, often leading to unpredictable and non-replicable results (Simmons et al., 2011). Furthermore, meta-research has shown that most studies still do not fulfil the requirements and standards of Open Science, including the opportunity for replication, even if they claim to do so (Nuijten, Hartgerink, van Assen, Epskamp & Wicherts, 2016).
Second, and at the analysis level, Open Science has to deal with the same challenges as traditional research, which became evident in the discussion in the year 2017 about the right level for statistical significance: Benjamin et al. (2017) proposed p < 0.005 (instead of p < 0.05) as a new level for better reproducibility and more accurate communication. In direct replies, Lakens et al. (2017) recommend avoiding the term statistical significance and the use of standardized thresholds for p-values, whereas McShane, Gal, Gelman, Robert and Tackett (2017) demand abandoning null hypothesis significance testing (NHST) and statistical significance and its levels in general. On the other hand, this discussion demonstrates a big advantage of Open Science: Researchers can quickly read preprints and answer them in the same way, leading to a strong and lively community. Thus, Open Science offers new communication channels that avoid the waiting for review processes and facilitate more and direct responses.
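What the proposed stricter threshold changes can be made concrete with a short simulation. The following minimal Python sketch (an illustration added here under the assumption of independent two-group tests with no true effect, not taken from the cited papers) compares the long-run false-positive rates at the two thresholds:

```python
# Minimal sketch: long-run false-positive rates under a true null at the
# conventional threshold (p < .05) and the proposed stricter one (p < .005).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_tests, n_per_group = 20_000, 50

# Many two-group comparisons with no true effect between the groups.
p_values = np.array([
    stats.ttest_ind(rng.normal(size=n_per_group),
                    rng.normal(size=n_per_group)).pvalue
    for _ in range(n_tests)
])

print(f"Share below 0.05:  {np.mean(p_values < 0.05):.3f}")   # ~ 0.05
print(f"Share below 0.005: {np.mean(p_values < 0.005):.4f}")  # ~ 0.005
```

The stricter threshold reduces false positives roughly tenfold, at the well-known price of requiring larger samples to retain statistical power, which is part of the trade-off debated in the cited replies.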
2.5 Open Science and Openness in Education
Our society is entering a post-truth age, causing a lot of issues and concerns, as introduced above. That is happening in science as well as in education and demonstrates the scale of the problems. Open Science education is considered to enable a change (van der Zee & Reich, 2018; Stracke, 2019). Open Science education can mean two things: first, education for introducing Open Science as a subject (i.e., theories and practices of Open Science as an educational topic) and second, Open Education adapting and following Open Science principles (i.e., learning processes as innovative (pedagogical) methodologies). For the adaptation of Open Science, nine different dimensions of openness were defined, clustered into three categories (visionary, operational and legal) and applied to Open Education by Stracke (2017a, b, c). In special education, Cook et al. (2018) have called for applying Open Science and related practices to advance the quality of research, as well as future policies and practices, and McBee, Makel, Peters and Matthews (2018) called for the same action in gifted education.
In his meta-analysis of meta-analysis studies, Hattie (2008), however, has shown that most studies claiming scientific and evidence-based results could not be verified and validated. The debate on the relevance and importance of these findings has just started in the educational research community. During the last four years, many educators and community organizations have collaborated to open up education and its research. Under the leadership of the Slovenian government, the first draft of a UNESCO Recommendation on Open Education and Open Educational Resources (2019a) was developed and discussed in open consultations. Only three weeks ago (on the 25th of November 2019), the 40th General Conference of UNESCO adopted in global consensus the UNESCO Recommendation on Open Education and Open Educational Resources (2019b). It is a milestone, as it is UNESCO’s very first binding recommendation in the field of Open Education, and it requires regular progress reporting by all 193 UNESCO member states. I hope that it will be a glorious landmark and guide for the future of Open Education, leading to a fundamental change in global learning and education for all.
2.6 Conclusions and Outlook
Open Science is a current phenomenon that can be traced back to the Middle Ages. Its growth is happening in many disciplines and in diverse facets, in particular due to the worldwide internet and related new technologies, tools and communication channels. In the future, it is desirable that all researchers collaborate in Open Science to realize its benefits. The three main characteristics of Open Science (transparency, openness and reproducibility) strengthen science, leading to the two core objectives of Open Science: more objective credibility due to increased (formal) reliability and more subjective credibility due to increased (personal) trust. In this way, Open Science can contribute to overcoming the post-truth age.
Therefore, all researchers should ensure that Open Science is introduced and established in all three science dimensions of research design, processes and publications. In consequence, Open Science can improve the further development, recognition, reputation and progress of the different science disciplines, research practices and science in general. And in the long-term perspective, Open Science can and hopefully will improve research and education as a whole, as well as our society.
References
Allen, C., & Mehler, D. M. A. (2019). Open science challenges, benefits and tips in early career and beyond. PLoS Biology, 17(5), e3000246. https://doi.org/10.1371/journal.pbio.3000246.
Anderson, M. S., Ronning, E. A., De Vries, R., & Martinson, B. C. (2007). The perverse effects of competition on scientists’ work and relationships. Science and Engineering Ethics, 13, 437–461. https://doi.org/10.1007/s11948-007-9042-5.
Arabito, S., & Pitrelli, N. (2015). Open science training and education: Challenges and difficulties on the researchers’ side and in public engagement. Journal of Science Communication, 14(4), C01_en, 1–4. https://doi.org/10.22323/2.14040301.
Benjamin, D. J., Berger, J., Johannesson, M., Nosek, B. A., Wagenmakers, E., Berk, R., … Johnson, V. (2017, July 22). Redefine statistical significance. PsyArXiv [pre-print]. https://doi.org/10.31234/osf.io/mky9j.
Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities. (2003). Retrieved March 1, 2019, from http://oa.mpg.de/lang/en-uk/berlin-prozess/berliner-erklarung/.
Bethesda Statement on Open Access Publishing. (2003). Retrieved March 1, 2019, from http://dash.harvard.edu/handle/1/4725199.
Bok, D. (2003). Universities in the marketplace: The commercialization of higher education. Princeton, N.J.: Princeton University Press.
Borgman, C. L. (2007). Scholarship in the digital age: Information, infrastructure, and the internet. Cambridge, MA: The MIT Press.
Borgman, C. L. (2012). The conundrum of sharing research data. Journal of the American Society for Information Science and Technology, 63(6), 1059–1078.
Boulton, G., Rawlins, M., Vallance, P., & Walport, M. (2011). Science as a public enterprise: The case for open data. The Lancet, 377(9778), 1633–1635. https://doi.org/10.1016/S0140-6736(11)60647-8.
Budapest Open Access Initiative. (2002). Retrieved March 1, 2019, from http://www.opensocietyfoundations.org/openaccess/read.
Camerer, C. F., Dreber, A., Forsell, E., Ho, T.-H., Huber, J., Johannesson, M., et al. (2016). Evaluating replicability of laboratory experiments in economics. Science, 351, 1433–1436. https://doi.org/10.1126/science.aaf0918.
Carnine, D. (1997). Bridging the research-to-practice gap. Exceptional Children, 63, 513–521. https://doi.org/10.1177/001440299706300406.
Casadevall, A., & Fang, F. C. (2012). Reforming science: Methodological and cultural reforms. Infection and Immunity, 80, 891–896. https://doi.org/10.1128/IAI.06183-11.
Chambers, C. (2019). What’s next for registered reports? Nature, 573, 187–189. https://doi.org/10.1038/d41586-019-02674-6.
Chambers, C. D., Feredoes, E., Muthukumaraswamy, S. D., & Etchells, P. J. (2014). Instead of “playing the game” it is time to change the rules: Registered Reports at AIMS Neuroscience and beyond. AIMS Neuroscience, 1, 4–17. https://doi.org/10.3934/neuroscience.2014.1.4.
Cook, B. G., & Therrien, W. J. (2017). Null effects and publication bias in special education research. Behavioral Disorders, 42, 149–158. https://doi.org/10.1177/0198742917709473.
Cook, B. G., Lloyd, J. W., Mellor, D., Nosek, B. A., & Therrien, W. J. (2018). Promoting open science to increase the trustworthiness of evidence in special education. Exceptional Children, 85(1), 104–118. https://doi.org/10.1177/0014402918793138.
Cribb, J., & Sari, T. (2010). Open science: Sharing knowledge in the global century. Collingwood: CSIRO Publishing.
Czarnitzki, D., Grimpe, C., & Pellens, M. (2015). Access to research inputs: Open science versus the entrepreneurial university. Journal of Technology Transfer, 40(6), 1050–1063.
David, P. A. (1998). Common agency contracting and the emergence of ‘open science’ institutions. American Economic Review, 88(2), 15–21.
David, P. A. (2004a). Understanding the emergence of ‘open science’ institutions: Functionalist economics in historical context. Industrial and Corporate Change, 13(4), 571–589. https://doi.org/10.1093/icc/dth023.
David, P. A. (2004b). Can “Open Science” be protected from the evolving regime of IPR protections? Journal of Institutional and Theoretical Economics, 160(1), 9–34.
David, P. A. (2007). The Historical Origins of ‘Open Science’. An essay on patronage, reputation, and common agency contracting in the scientific revolution. Stanford, CA: Stanford University.
Deming, W. E. (1982). Quality, productivity and competitive position. Cambridge, MA: MIT.
Deming, W. E. (1986). Out of the Crisis. Cambridge, MA: MIT.
Donabedian, A. (1980). The definition of quality and approaches to its assessment [Explorations in Quality Assessment and Monitoring, vol. 1]. Ann Arbor: Health Administration Press.
Dorch, B. (2012). On the citation advantage of linking to data. Retrieved from https://hal-hprints.archives-ouvertes.fr/hprints-00714715/document.
Dosemagen, S., Liboiron, M., & Molloy, J. (2017). Gathering for open science hardware 2016. Journal of Open Hardware, 1(1), 4. https://doi.org/10.5334/joh.5.
Ebersole, C. R., Atherton, O. E., Belanger, A. L., Skulborstad, H. M., Allen, J. M., Banks, J. B., et al. (2016). Many labs 3: Evaluating participant pool quality across the academic semester via replication. Journal of Experimental Social Psychology, 67, 68–82. https://doi.org/10.1016/j.jesp.2015.10.012.
European Commission. (2015). Study on open science. Impact, implications and policy options. Brussels: European Commission. Retrieved from https://ec.europa.eu/research/innovation-union/pdf/expert-groups/rise/study_on_open_science-impact_implications_and_policy_options-salmi_072015.pdf.
Fanelli, D. (2010). “Positive” results increase down the hierarchy of the sciences. PLoS ONE, 5(4), e10068. https://doi.org/10.1371/journal.pone.0010068.
Fecher, B., & Friesike, S. (2014). Open science: One term, five schools of thought. In S. Bartling, & S. Friesike (Eds.), Opening science (pp. 17–47). Cham: Springer. https://doi.org/10.1007/978-3-319-00026-8_2.
Ferguson, C. J., & Heene, M. (2012). A vast graveyard of undead theories: Publication bias and psychological science’s aversion to the null. Perspectives on Psychological Science, 7, 555–561. https://doi.org/10.1177/1745691612459059.
Feyerabend, P. (1978). Science in a free society. London: New Left Books.
Franco, A., Malhotra, N., & Simonovits, G. (2014). Publication bias in the social sciences: Unlocking the file drawer. Science, 345, 1502–1505. https://doi.org/10.1126/science.1255484.
Gilmore, R. O., Kennedy, J. L., & Adolph, K. E. (2018). Practical solutions for sharing data and materials from psychological research. Advances in Methods and Practices in Psychological Science, 1, 121–130. https://doi.org/10.1177/2515245917746500.
Giner-Sorolla, R. (2012). Science or art? How aesthetic standards grease the way through the publication bottleneck but undermine science. Perspectives on Psychological Science, 7, 562–571. https://doi.org/10.1177/1745691612457576.
Goodman, S. N., Fanelli, D., & Ioannidis, J. P. A. (2016). What does research reproducibility mean? Science Translational Medicine, 8(341), 341ps12. https://doi.org/10.1126/scitranslmed.aaf5027.
Greenwald, A. G. (1975). Consequences of prejudice against the null hypothesis. Psychological Bulletin, 82, 1–20. https://doi.org/10.1037/h0076157.
Gunsalus, C. K., & Robinson, A. D. (2018). Nine pitfalls of research misconduct. Nature, 557, 297–299. https://doi.org/10.1038/d41586-018-05145-6.
Hattie, J. A. C. (2008). Visible learning. A synthesis of over 800 meta-analyses relating to achievement. London & New York: Routledge.
Hecker, B. L. (2017). Four decades of open science. Nature Physics, 13(6), 523–525. https://doi.org/10.1038/nphys4160.
Henneken, E. A., & Accomazzi, A. (2011). Linking to data: Effect on citation rates in astronomy. Retrieved from http://arxiv.org/abs/1111.3618.
Higgins, K. (2016). Post-truth: A guide for the perplexed. Nature News, 540(7631). https://doi.org/10.1038/540009a.
Horton, R. (1997). Pardonable revisions and protocol reviews. The Lancet, 349, 6.
Inter-University Consortium for Political and Social Research. (2012). Guide to social science data preparation and archiving: Best practice throughout the data life cycle (5th ed). Ann Arbor, MI: Author. Retrieved from https://www.icpsr.umich.edu/files/deposit/dataprep.pdf.
Ioannidis, J. P. (2012). Why science is not necessarily self-correcting. Perspectives on Psychological Science, 7, 645–654. https://doi.org/10.1177/1745691612464056.
John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23, 524–532. https://doi.org/10.1177/0956797611430953.
Juran, J. M. (Ed.). (1951). Quality control handbook. New York: McGraw-Hill.
Juran, J. M. (1992). Juran on quality by design. The new steps for planning quality into goods and services. New York: Free Press.
Kagan, J. (2009). The three cultures: Natural sciences, social sciences, and the humanities in the 21st century. New York: Cambridge University Press.
Kidwell, M. C., Lazarević, L. B., Baranski, E., Hardwicke, T. E., Piechowski, S., Falkenberg, L. S., et al. (2016). Badges to acknowledge open practices: A simple, low-cost, effective method for increasing transparency. PLoS Biology, 14(5), e1002456. https://doi.org/10.1371/journal.pbio.1002456.
Klein, R. A., Ratliff, K. A., Vianello, M., Adams, R. B., Bahnik, Š., Bernstein, M. J., et al. (2014). Investigating variation in replicability. Social Psychology, 45, 142–152. https://doi.org/10.1027/1864-9335/a000178.
Kraker, P., Leony, D., Reinhardt, W., & Beham, G. (2011). The case for an open science in technology enhanced learning. International Journal of Technology Enhanced Learning, 3(6), 643–654. https://doi.org/10.1504/IJTEL.2011.045454.
Kronick, D. A. (1976). A history of scientific & technical periodicals: The origins and development of the scientific and technical press, 1665–1790. Metuchen, NJ: Scarecrow Press.
Kuhn, T. S. (1962). The structure of scientific revolutions. Chicago: University of Chicago Press.
Lakens, D., Adolfi, F. G., Albers, C. J., Anvari, F., Apps, M. A. J., Argamon, S. E., … Zwaan, R. A. (2017, September 18). Justify your alpha. PsyArXiv [pre-print]. https://doi.org/10.31234/osf.io/9s3y6.
The Lancet. (2015). Protocol review at The Lancet: 1997–2015. The Lancet, 386, 2456–2457.
Lasthiotakis, H., Kretz, A., & Sá, C. (2015). Open science strategies in research policies: A comparative exploration of Canada, the US and the UK. Policy Futures in Education, 13(8), 968–989. https://doi.org/10.1177/1478210315579983.
LeBel, E. P., McCarthy, R., Earp, B. D., Elson, M., & Vanpaemel, W. (2018). A unified framework to quantify the credibility of scientific findings. Advances in Methods and Practices in Psychological Science, 1(3), 389–402. https://doi.org/10.1177/2515245918787489.
Levenstein, M. C., & Lyle, J. A. (2018). Data: Sharing is caring. Advances in Methods and Practices in Psychological Science, 1, 95–103. https://doi.org/10.1177/2515245918758319.
Lin, T. (2012). Cracking open the scientific process. The New York Times, 16 January. Retrieved November 8, 2019, from https://www.nytimes.com/2012/01/17/science/open-science-challenges-journal-tradition-with-web-collaboration.html.
Macfarlane, B., & Cheng, M. (2008). Communism, universalism and disinterestedness: Re-examining contemporary support among academics for Merton’s scientific norms. Journal of Academic Ethics, 6(1), 67–78. https://doi.org/10.1007/s10805-008-9055-y.
Makel, M. C., & Plucker, J. A. (Eds.). (2017). Toward a more perfect psychology: Improving trust, accuracy, and transparency in research. Washington, DC: American Psychological Association.
Markoff, J. (2005). What the dormouse said: How the sixties counterculture shaped the personal computer industry. New York: Viking.
McBee, M. T., Makel, M. C., Peters, S. J., & Matthews, M. S. (2018). A call for open science in giftedness research. Gifted Child Quarterly, 62, 374–388. https://doi.org/10.1177/0016986218784178.
McKiernan, E. C., Bourne, P. E., Brown, C. T., Buck, S., Kenall, A., Lin, J., … Yarkoni, T. (2016). Point of view: How open science helps researchers succeed. Elife, 5, e16800. https://doi.org/10.7554/elife.16800.
McShane, B. B., Gal, D., Gelman, A., Robert, C., & Tackett, J. L. (2017). Abandon statistical significance. The American Statistician, 73(sup1), 235–245. https://doi.org/10.1080/00031305.2018.1527253.
Merton, R. K. (1973). The sociology of science: Theoretical and empirical investigations. Chicago: University of Chicago Press.
Merton, R. K. (1996). On social structure and science. Chicago: University of Chicago Press.
Meyer, M. N. (2018). Practical tips for ethical data sharing. Advances in Methods and Practices in Psychological Science, 1, 131–144. https://doi.org/10.1177/2515245917747656.
Miguel, E., Camerer, C., Casey, K., Cohen, J., Esterling, K. M., Gerber, A., et al. (2014). Promoting transparency in social science research. Science, 343, 30–31. https://doi.org/10.1126/science.1245317.
Mirowski, P. (2018). The future(s) of open science. Social Studies of Science, 48(2), 171–203. https://doi.org/10.1177/0306312718772086.
Mitroff, I. I. (1974). Norms and counter-norms in a select group of the Apollo moon scientists: A case study of the ambivalence of scientists. American Sociological Review, 39, 579–595. https://doi.org/10.2307/2094423.
Molloy, J. C. (2011). The open knowledge foundation: Open data means better science. PLoS Biology, 9(12), e1001195. https://doi.org/10.1371/journal.pbio.1001195.
Mondada, F. (2017). Can robotics help move researchers toward open science? [From the Field]. IEEE Robotics and Automation Magazine, 24(1), 111–112. https://doi.org/10.1109/MRA.2016.2646118.
Munafo, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N., … & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1), 0021. https://doi.org/10.1038/s41562-016-0021.
Murray-Rust, P. (2008). Open data in science. Serials Review, 34(1), 52–64. https://doi.org/10.1016/j.serrev.2008.01.001.
Muster, S. (2018). Arctic freshwater—A commons requires open science. In Arctic summer college yearbook (pp. 107–120). New York, NY: Springer.
Nosek, B. A., & Bar-Anan, Y. (2012). Scientific utopia: I. Opening scientific communication. Psychological Inquiry: An International Journal for the Advancement of Psychological Theory, 23, 217–243. https://doi.org/10.1080/1047840X.2012.692215.
Nosek, B. A., & Errington, T. M. (2017). Making sense of replications. ELife, 6, e23383. https://doi.org/10.7554/eLife.23383.
Nosek, B. A., & Lakens, D. (2014). Registered reports: A method to increase the credibility of published results. Social Psychology, 45(3), 137–141. https://doi.org/10.1027/1864-9335/a000192.
Nosek, B. A., Spies, J. R., & Motyl, M. (2012). Scientific utopia: II. Restructuring incentives and practices to promote truth over publishability. Perspectives on Psychological Science, 7, 615–631. https://doi.org/10.1177/1745691612459058.
Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., et al. (2015). Promoting an open research culture. Science, 348(6242), 1422–1425. https://doi.org/10.1126/science.aab2374.
Nuijten, M. B., Hartgerink, C. H., van Assen, M. A., Epskamp, S., & Wicherts, J. M. (2016). The prevalence of statistical reporting errors in psychology (1985–2013). Behavior Research Methods, 48, 1205–1226. https://doi.org/10.3758/s13428-015-0664-2.
Odom, S. L., Brantlinger, E., Gersten, R., Horner, R. H., Thompson, B., & Harris, K. R. (2005). Research in special education: Scientific methods and evidence-based practices. Exceptional Children, 71, 137–148. https://doi.org/10.1177/001440290507100201.
OECD. (2015). Making open science a reality, OECD science, technology and industry policy papers. Paris: OECD Publishing (No. 25). https://doi.org/10.1787/5jrs2f963zs1-en.
Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. https://doi.org/10.1126/science.aac4716.
Page, M. J., Altman, D. G., Shamseer, L., McKenzie, J. E., Ahmadzai, N., Wolfe, D., et al. (2018). Reproducible research practices are underused in systematic reviews of biomedical interventions. Journal of Clinical Epidemiology, 94, 8–18. https://doi.org/10.1016/j.jclinepi.2017.10.017.
Peters, M. A., & Roberts, P. (2012). The virtues of openness: Education, science, and scholarship in the digital age. Boulder, CO: Paradigm Publishers.
Phelps, L., Fox, B. A., & Marincola, F. M. (2012). Supporting the advancement of science: Open Access publishing and the role of mandates. Journal of Translational Medicine, 10, 13. https://doi.org/10.1186/1479-5876-10-13.
Piwowar, H. A., & Vision, T. J. (2013). Data reuse and the open data citation advantage. PeerJ, 1, e175. https://doi.org/10.7717/peerj.175.
Piwowar, H. A., Day, R. S., & Fridsma, D. B. (2007). Sharing detailed research data is associated with increased citation rate. PLoS ONE, 2(3), e308. https://doi.org/10.1371/journal.pone.0000308.
Piwowar, H., Priem, J., Larivière, V., Alperin, J. P., Matthias, L., Norlander, B., et al. (2018). The state of OA: A large-scale analysis of the prevalence and impact of Open Access articles. PeerJ, 6, e4375. https://doi.org/10.7717/peerj.4375.
Popper, K. (1959). The logic of scientific discovery. Abingdon-on-Thames: Routledge.
Poupon, V., Seyller, A., & Rouleau, G. A. (2017). The Tanenbaum Open Science Institute: Leading a paradigm shift at the Montreal Neurological Institute. Neuron, 95(5), 1002–1006. https://doi.org/10.1016/j.neuron.2017.07.026.
Pridemore, W. A., Makel, M. C., & Plucker, J. A. (2017). Replication in criminology and the social sciences. Annual Review of Criminology, 1, 19–38. https://doi.org/10.1146/annurev-criminol-032317-091849.
Randall, D., & Welser, C. (2018). The irreproducibility crisis of modern science: Causes, consequences, and the road to reform. New York, NY: National Association of Scholars. Retrieved from www.nas.org/images/documents/irreproducibility_report/NAS_irreproducibilityReport.pdf.
Royal Society. (2012). Science as an open enterprise. London: Royal Society. Retrieved November 8, 2019, from https://royalsociety.org/~/media/Royal_Society_Content/policy/projects/sape/2012-06-20-SAOE.pdf.
Rufai, R., Gul, S., & Shah, T. A. (2012). Open access journals in library and information science: The story so far. Trends in Information Management, 7(2), 218–228.
Sakaluk, J. K., & Graham, C. A. (2018). Promoting transparent reporting of conflicts of interests and statistical analyses at the Journal of Sex Research. Journal of Sex Research, 55, 1–6. https://doi.org/10.1080/00224499.2017.1395387.
Sandy, H. M., Mitchell, E., Corrado, E. M., Budd, J., West, J. D., Bossaller, J., et al. (2017). Making a case for open research: Implications for reproducibility and transparency. Proceedings of the Association for Information Science and Technology, 54(1), 583–586. https://doi.org/10.1002/pra2.2017.14505401079.
Schmidt, S. (2009). Shall we really do it again? The powerful concept of replication is neglected in the social sciences. Review of General Psychology, 13, 90–100. https://doi.org/10.1037/a0015108.
Schymanski, E. L., & Williams, A. J. (2017). Open Science for identifying “known unknown” chemicals. Environmental Science and Technology, 51(10), 5357–5359. https://doi.org/10.1021/acs.est.7b01908.
Shavelson, R. J., & Towne, L. (Eds.). (2002). Scientific research in education. Washington, DC: National Academy Press.
Shibayama, S. (2015). Academic commercialization and changing nature of academic cooperation. Journal of Evolutionary Economics, 25(2), 513–532.
Sidler, M. (2014). Open science and the three cultures: Expanding open science to all domains of knowledge creation. In S. Bartling, & S. Friesike (Eds.), Opening science (pp. 81–85). Cham: Springer. https://doi.org/10.1007/978-3-319-00026-8_5.
Silberzahn, R., Uhlmann, E. L., Martin, D. P., Anselmi, P., Aust, F., Awtrey, E. C., … Nosek, B. A. (2017). Many analysts, one dataset: Making transparent how variations in analytical choices affect results. Advances in Methods and Practices in Psychological Science, 1(3), 337–356. https://doi.org/10.1177/2515245917747646.
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22, 1359–1366. https://doi.org/10.1177/0956797611417632.
Simons, D. J., Shoda, Y., & Lindsay, D. S. (2017). Constraints on generality (COG): A proposed addition to all empirical papers. Perspectives on Psychological Science, 12, 1123–1128. https://doi.org/10.1177/1745691617708630.
Stallman, R. (2005). Copyright and globalization in the age of computer networks. In R. A. Ghosh (Ed.), CODE: Collaborative ownership and the digital economy (pp. 317–335). Cambridge, MA: MIT Press.
Sterling, T. D. (1959). Publication decisions and their possible effects on inferences drawn from tests of significance—or vice versa. Journal of the American Statistical Association, 54(285), 30. https://doi.org/10.2307/2282137.
Stracke, C. M. (2006). Process-oriented quality management. In U.-D. Ehlers & J. M. Pawlowski (Eds.), Handbook on quality and standardisation in e-learning (pp. 79–96). Berlin: Springer. https://doi.org/10.1007/3-540-32788-6_6. Retrieved from http://opening-up.education/publications/stracke-c-m-2006-process-oriented-quality-management.
Stracke, C. M. (2011). Competences and skills in the digital age: Competence development, modelling, and standards for human resources development. In E. García-Barriocanal et al. (Eds.), Communications in computer and information science (Vol. 240, pp. 34–46). Berlin: Springer. https://doi.org/10.1007/978-3-642-24731-6_4.
Stracke, C. M. (2014). How innovations and competence development support quality in lifelong learning. The International Journal for Innovation and Quality in Learning (INNOQUAL), 2(3), 35–44. https://doi.org/10.5281/zenodo.3608669.
Stracke, C. M. (2017a). The quality of MOOCs: How to improve the design of open education and online courses for learners? In P. Zaphiris & A. Ioannou (Eds.), Learning and collaboration technologies. Novel learning ecosystems. Lecture Notes in Computer Science (Vol. 10295, pp. 285–293). https://doi.org/10.1007/978-3-319-58509-3_23.
Stracke, C. M. (2017b). The quality of open online education and learning: A quality reference framework for MOOCs. In C. M. Stracke, M. Shanks, & O. Tveiten (Eds.), Smart universities: Education’s digital future. Official Proceedings of the International WLS and LINQ Conference 2017 (pp. 97–105). https://doi.org/10.6084/m9.figshare.9272657.
Stracke, C. M. (2017c). Open education and learning quality: The need for changing strategies and learning experiences. In Proceedings of 2017 IEEE Global Engineering Education Conference (EDUCON) (pp. 1044–1048). https://doi.org/10.1109/educon.2017.7942977.
Stracke, C. M. (2018a). 开放教育的学习质量和设计: OpenEd 框架 [The Learning Quality and Design of Open Education. The OpenEd Framework (translated by Junhong Xiao)]. Distance Education in China, 11, 5–18 + 78. http://cnki.net/kcms/doi/10.13541/j.cnki.chinade.20181108.005.html.
Stracke, C. M. (2018b). Como a Educação Aberta pode melhorar a qualidade de aprendizagem e produzir impacto em alunos, organizações e na sociedade? [How can Open Education improve learning quality and achieve impact for learners, organizations, and society?] In M. Duran, T. Amiel, & C. Costa (Eds.), Utopias e Distopias da Tecnologia na Educação a Distância e Aberta (pp. 499–545). Campinas & Niterói: UNICAMP & UFF. Retrieved from http://opening-up.education/wp-content/uploads/2018/06/Stracke_2018_Educacao_Aberta_Qualidade_Impacto.pdf.
Stracke, C. M. (2019). Quality frameworks and learning design for open education. The International Review of Research in Open and Distributed Learning, 20(2), 180–203. https://doi.org/10.19173/irrodl.v20i2.4213.
Therrien, W. J., & Cook, B. G. (2018). Null effects and publication bias in learning disabilities research. Learning Disabilities Research & Practice, 33, 5–10. https://doi.org/10.1111/ldrp.12163.
UNESCO. (2019a). Draft recommendation on open educational resources. Retrieved from https://unesdoc.unesco.org/ark:/48223/pf0000370936?posInSet=22&queryId=304ed6aa-5635-4d73-aefd-92ed93ae3c48.
UNESCO. (2019b). UNESCO recommendation on open educational resources. 40 C/32 Annex. Paris: UNESCO. Retrieved from http://opening-up.education/wp-content/uploads/2019/12/RECOMMENDATION-CONCERNING-OPEN-EDUCATIONAL-RESOURCES.pdf.
van der Zee, T., & Reich, J. (2018). Open education science. AERA Open, 4(3), 1–15. https://doi.org/10.1177/2332858418787466.
Vazire, S. (2018). Implications of the credibility revolution for productivity, creativity, and progress. Perspectives on Psychological Science, 13, 411–417. https://doi.org/10.1177/1745691617751884.
Vicente-Saez, R., & Martinez-Fuentes, C. (2018). Open science now: A systematic literature review for an integrated definition. Journal of Business Research, 88, 428–436. https://doi.org/10.1016/j.jbusres.2017.12.043.
Vision, T. J. (2010). Open data and the social contract of scientific publishing. BioScience, 60(5), 330–331. https://doi.org/10.1525/bio.2010.60.5.2.
von Hippel, E. (2005). Democratizing innovation: The evolving phenomenon of user innovation. Journal für Betriebswirtschaft, 55(1), 63–78.
Wicherts, J. M. (2017). The weak spots in contemporary science (and how to fix them). Animals, 7(12), 90. https://doi.org/10.3390/ani7120090.
Wicherts, J. M., Veldkamp, C. L., Augusteijn, H. E., Bakker, M., Van Aert, R., & Van Assen, M. A. (2016). Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking. Frontiers in Psychology, 7, 1832. https://doi.org/10.3389/fpsyg.2016.01832.
Wilkinson, M. D., Dumontier, M., Aalbersberg, I. J., Appleton, G., Axton, M., Baak, A., et al. (2016). The FAIR guiding principles for scientific data management and stewardship. Scientific Data, 3, 160018. https://doi.org/10.1038/sdata.2016.18.
Willinsky, J. (2005, August 1). The unacknowledged convergence of open source, open access, and open science. First Monday, 10(8). Retrieved from https://journals.uic.edu/ojs/index.php/fm/article/view/1265/1185.
Huebner, G. M., Nicolson, M. L., Fell, M. J., Kennard, H., Elam, S., Hanmer, C., … Shipworth, D. (2017). Are we heading towards a replicability crisis in energy efficiency research? A toolkit for improving the quality, transparency and replicability of energy efficiency impact evaluations. Retrieved from https://pdfs.semanticscholar.org/71e4/fde85949cf5f2d803657d6becfb080be1a57.pdf.
Ziman, J. (1994). Prometheus bound: Science in a dynamic steady state. Cambridge: Cambridge University Press.
Ziman, J. (2000). Real science: What it is, and what it means. Cambridge: Cambridge University Press.
Rights and permissions
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
Copyright information
© 2020 The Author(s)
About this chapter
Cite this chapter
Stracke, C.M. (2020). Open Science and Radical Solutions for Diversity, Equity and Quality in Research: A Literature Review of Different Research Schools, Philosophies and Frameworks and Their Potential Impact on Science and Education. In: Burgos, D. (eds) Radical Solutions and Open Science. Lecture Notes in Educational Technology. Springer, Singapore. https://doi.org/10.1007/978-981-15-4276-3_2
DOI: https://doi.org/10.1007/978-981-15-4276-3_2
Publisher Name: Springer, Singapore
Print ISBN: 978-981-15-4275-6
Online ISBN: 978-981-15-4276-3