Introduction

Curriculum development in HE, despite variations across institutions, is an area where academics still maintain their professional autonomy (Louvel 2013), deciding both what to teach and how, so as to meet their aims. This article considers what might be taught to meet a widely held curriculum aim of students being able to understand research in a discipline. While the focus is on an understanding of research in science, the ideas may be pertinent to other disciplines, such as in applied and social sciences, where research gives weight to data as evidence.

Students’ understanding of research is incorporated in notions of ‘graduateness’ (Steur et al. 2012) and generic graduate attributes (Barrie 2006), as well as being important for critical thinking (Moore 2011) and for postgraduate research (Cumming 2010). The varied research–teaching links (Willison 2013) widely established in HE in many countries, and extensively reviewed by Malcolm (2014), also attest to the widespread engagement of undergraduates with research, which requires that they have an understanding of evidence within their discipline. Understanding research is a key feature of the characteristic ways of thinking and practising in a discipline (WTPs; McCune and Hounsell 2005) and of disciplinary expertise, both of which are now central notions in curriculum development in HE (Baillie et al. 2013; Entwistle 2009; Kinchin and Cabot 2010; Meyer and Land 2003, 2005) and can, in Hay’s (2011) view, be described as ‘researcher-led’ teaching.

Kinchin and Cabot (2010:153) argue that ‘The development of university pedagogy needs to consider the nature of expertise and how this can be modelled for students in such a way that teaching reflects the values of each academic discipline’. This article contributes to this aim with regard to research expertise in scientific disciplines.

Understanding expertise

If expertise is to inform curriculum development, then understanding expertise is important. Expertise has been extensively researched (Ericsson et al. 2006) while Amirault and Branson (2006) have identified particular aspects of importance to education. Only a few points can be addressed here.

Expertise appears intuitive (Kinchin and Cabot 2010), develops through extensive experience (Shopkow 2010) and is considered to be hard to articulate since it draws on tacit knowledge (Polanyi 1966; Sternberg 1999). These characteristics would seemingly challenge attempts to teach students explicitly to develop expertise. Does this mean that experts’ ‘route to expertise is blocked for the student’ (Shopkow 2010:320)? Several authors (for instance, Bradley et al. 2006; Kinchin et al. 2008) suggest that tacit knowledge can be elicited.

WTP research, with its focus on disciplinary expertise and ‘thinking like an expert’, is closely associated with research into threshold concepts (TCs; Meyer and Land 2003, 2005)—concepts associated with expertise, which are integrative and transformative and may be troublesome to learn. TC research has been influential (see, for instance, Flanagan n.d.), focusing disciplinary academics’ recent attention on the concepts required for expertise and how they might be made explicit to students.

Expertise can take many forms. Bradley et al. (2006) focus on the tacit knowledge specifically associated with experts’ decision-making which underpins action. Articulation of the knowledge base used in decision-making is an important means of demystifying the thinking underpinning expert practice, a key aspect of curricula with a focus on deep understanding of disciplinary WTPs (Entwistle 2009). A validated and clearly integrated knowledge base should go some way to helping the curriculum developer.

Kinchin, Cabot and Hay, with other colleagues (see reference list), have used concept mapping (based on Novak and Gowin 1984) to elicit the understandings that underpin the practice of experts (both teachers and researchers), a technique that explicitly shows the propositional links between concepts and makes clear the integrative nature of expert thinking (Davies and Mangan 2007). Kinchin and colleagues have worked mainly in clinical contexts but argue that their fundamental ideas apply equally to non-clinical university disciplines (Kinchin and Cabot 2010).

Their findings about the nature of expertise, drawn from the analysis of thousands of concept maps, are represented in Fig. 1. Experts’ work is often seen to involve relatively quick routes to a solution, which can be represented as ‘chains of practice’, but their analysis shows that such seemingly linear chains are underpinned by a wider, integrated but often intuitive knowledge (Patel et al. 1999) that can be called on to inform practice, represented as a network of understanding. Kinchin et al. (2008) distinguish this underpinning expert knowledge from its application in practice (‘expertise’); the latter depends on the former.

Fig. 1 A concept map of the dual-processing knowledge structures perspective on the nature of expertise (from Kinchin and Cabot 2010:161)

Making such networks of ideas, the articulation of understanding on the right of Fig. 1, explicit is important both for curriculum developers and for learners (Entwistle 2009; Hay 2007; Kinchin and Cabot 2010; Novak and Cañas 2007; Shopkow 2010; Cañas et al. 2015). The ideas can arguably form ‘the “know that” part of the curriculum’ such that students understand the ‘know how’ (Muller and Young 2014:137). Kinchin et al. (2011) have used concept maps to inform curriculum development. Since expertise includes a conceptual basis, it should be amenable to specification, teaching and learning.

In summary, research has suggested that despite the apparently tacit nature of much of what constitutes expert practice, it is possible to analyse and articulate the ideas important for expert decision-making; and that underpinning the observable chains of practice—which are more evident in some disciplines than others (Kinchin et al. 2008)—there is an integrated network of concepts.

So now we need to consider specifically how expertise in scientific research might be framed so that we can help students develop this understanding.

Framing research expertise in scientific disciplines

Researchers in scientific disciplines—those disciplines that ‘allow direct empirical investigation of an important question’ (National Research Council 2002: 6), a description that focuses on the means of research rather than specifically what they are researching—carry out a multitude of diverse practices as they enquire into phenomena. These practices characterise the WTPs of the disciplines.

However, research practices vary across disciplines and also within them (McCune and Hounsell 2005), and not only because of the specific substantive knowledge required. How research is designed and conducted to solve different problems varies according to disciplinary circumstances and conventions (for instance, tightly controlled laboratory-based research; the observation of phenomena; surveys which suggest links between factors; or randomised controlled trials [RCTs]), with concomitant effects on the validity of the data and the strength of the claims made. Research may also differ in, inter alia, the equipment employed, the manual skills required and the specific techniques selected. This presents a challenge to curriculum developers. How can such diversity of research practice be conceptualised to aid teaching and learning?

The model in Fig. 1 will be developed with respect to research expertise in scientific disciplines to aid such conceptualisation, using examples from the research literature as well as drawing on 20 years of research and teaching conducted with colleagues, to whom I am indebted.

Chains of practice?

It is easy to get the impression that scientific practice consists of chains. Research practice in science may well, after all, draw on ‘routines’: standardised techniques and protocols which act as shortcuts to a solution and minimise ‘thinking’, other than in their selection. Routines have become established to make practice efficient and to ensure quality, whether that is how an instrument is deployed, a protocol for a procedure or the ‘rules’ for a specific approach, such as an RCT. These ‘routine chains’ and their appropriate selection are clearly important components of expertise—they represent part of the ‘practice’ in a discipline’s WTPs.

However, expert research practice involves much more than competence in routine chains. Nonetheless, experts’ familiarity with the context of their research, and their quick thinking as they make necessary modifications to their practice, can give the superficial appearance of chain-like procedures as they seek solutions to their research problems; the solution being to establish a valid pattern in their data that will be ‘good enough’ for the claims to be made.

The impression that practice consists of the chains on the left-hand side of Fig. 1 is strongly reinforced when formal written accounts of research represent research practice as linear.

It is unsurprising, therefore, that a widespread traditional teaching approach in science disciplines in HE (Kinchin et al. 2010) has been to provide students with opportunities to familiarise themselves with, and enact, chains of practice through structured, pre-specified practical activities, often designed to be illustrative of substantive ideas. Students are directed, by means of ‘methods’ written by experts and using disciplinary conventions, through the (more or less) detailed linear stages of the practical activity. However, with so many potential chains, the selection of practicals becomes an issue, as do the constraints of time.

Another approach to helping students has been to describe commonalities amongst the chains. Willison and O’Regan (2007:399) argue that, despite the observed variation in chains, ‘the fundamental facets of inquiry are, however, identical, with common processes being acted out across all research endeavours’. Such an approach underpins descriptions of research practice—in effect, generic descriptions of the steps in Fig. 1’s chains—which can be couched in terms that Muller and Young (2014) describe as ‘know how’ processes or skills (Footnote 1). These are typified by terms like research skills or processes, such as hypothesising, planning, designing, analysing, interpreting and evaluating, and they may be taught in research methods or statistics courses (Sotos et al. 2007). However, such descriptions of chains may provide a limited basis for teaching: the concepts required to think through these processes are seldom specified, and in the absence of such guidance there is a danger that students revert to ‘copying’ practices they have seen before (Roberts et al. 2010). Descriptions of chains without an explicit specification of the conceptual basis to guide the teacher and learner are a poor basis for a curriculum specification that aims to develop expertise (Footnote 2) (Kinchin et al. 2008).

However, research with students into how open-ended science investigations are conducted (where the solution and the routes to it are unknown to them) suggests that, in science, the appearance of practice as a chain may be deceptive (Roberts et al. 2010). Those with more sophisticated approaches to the research problem—working alone or in a group—employed an iterative rather than linear approach, with trials and decisions being made to refine the practice until eventually a valid pattern in the data was deemed ‘good enough’ for a claim to be made (Fig. 2). Only once the iterative decisions have been made is something more akin to a ‘protocol’ established, and it is usually this that is then written up as the ‘method’.

Fig. 2 A flow diagram showing an iterative approach to scientific research (from Gott and Murphy 1987:24)

For expert researchers, many of the solutions to the decisions outlined in Fig. 2 may already be familiar, and others may be more or less quickly resolved (Kinchin et al. 2008), possibly giving the appearance of practice as a chain (Fig. 1). It may well take very detailed observation (such as can be found in Roth 2009, 2013) to reveal that individual experts’ chains of practice may be a misleading model of expert practice in scientific research.

Expertise can develop through extensive practice and with time (Feltovich et al. 2006; Shopkow 2010) but, for the most part, undergraduate students do not have this luxury. The traditional hope that, through exposure to many different chains over time, students would develop expertise and pick up the tacit network of understanding is not supported by evidence. The literature shows (see, for instance, Taylor and Meyer 2010; Wilson et al. 2010) that aspects of scientific research expertise indicative of network thinking—such as the ability to design research from scratch, to understand how particular chains of practice have developed, to understand how and why to amend research plans, or to evaluate others’ research and the quality of the claims made—are poorly developed in students following such traditional HE practice and are also unlikely to have been developed during school science (Roberts et al. 2010).

Nets of ideas

Figure 1 represents expertise as being underpinned by a network of ideas that can be constructed to form an understanding, and Fig. 2 suggests that decision-making is an important element in scientific practice; but what is it that experts in scientific research understand that enables them to make decisions like these?

Identification of a conceptual underpinning for research expertise is the premise of recent TC-inspired research in different scientific disciplines. In any subject, the specific substantive knowledge is of central importance to any research endeavour. In addition, concepts associated with conducting or evaluating research in scientific disciplines have been identified—referred to by some (see, for instance, Perkins 2006; Mead and Gray 2010; Ross et al. 2010) as the subjects’ epistemes.

For instance, research in biology has identified ‘hypothesis construction and testing’ as a TC (Taylor et al. 2011). Taylor and Meyer (2010:179) explain what is signified by this term: ‘Inherent in this role of hypothesis testing is the broader consideration of the conceptualisation of experimental design and the role that variation in this design may play in terms of verifying and extending previous findings. The concept of hypothesis testing thus provides a framework for experimentation and investigations’. Wilson et al. (2010:100) identified ‘measurement uncertainty’ as a potential TC for understanding in physics—which they explained as ‘an understanding of how to identify different sources of uncertainty, quantify their effects, take those effects into account in planning experiments, analysing data and making logical inferences from those data, and an appreciation of the consequences of uncertainty’. Despite these TCs having different names, the conceptual basis of what is being described has a lot of common ground.

That their TCs transcend disciplinary boundaries has been recognised by the teams researching them (see, for instance, Ross et al. 2010; Wilson et al. 2010). Hall (2010), studying understanding in the interdisciplinary area of climate change, found that uncertainty in the data, and its weight as evidence, was a key component. Other science researchers have identified aspects of research expertise that are critical and often problematic for students; for instance, Ryan (2014), working in Earth and environmental science, recognised the determination of patterns in data, and the issues that result from researching complexity and uncertainty, as TCs. Understanding of some cross-disciplinary aspects of scientific research, such as statistics (Gordon and Nicholas 2009) and quantitative numeracy (Frith and Lloyd 2013), has also been shown to depend on understanding how the data have been derived.

Whether or not these understandings related to research expertise are TCs per se is not the issue here; instead, the focus in this article is on the network of concepts, alluded to but not fully articulated in all these works, that enables research expertise to be better understood—the net of ideas shown in Fig. 1.

In the UK, the Royal Society’s motto ‘Nullius in verba’, which roughly translates as ‘take nobody’s word for it’ (Royal Society n.d.), emphasises the central importance in science of evidence over assertion, as scientists make claims following investigations into the real world. All scientific research is judged on the quality of its data, regardless of how that scientific practice is enacted. Ultimately, such understanding is important both for the conduct of an investigation and for other aspects of academic expertise, such as the review of others’ work. These are the WTPs of scientific disciplines, but with the emphasis now on the ‘thinking’ about evidence.

That the ‘doing’ of science includes, inter alia, an understanding of evidence was recognised by Millar et al. (1994). So what is this knowledge base? Gott et al. (n.d.) offered a tentative specification of the concepts of evidence—the ‘thinking behind the doing’—which constitutes a knowledge base for such an understanding; this was further exemplified in Gott and Duggan (2003). Research with experts in HE and science-based industries has indicated that an understanding of these ideas validly represents the thinking they use in their scientific practice (Gott et al. 1999; Roberts and Gott 1999), while Tytler et al. (2001a, b) and Duggan and Gott (2002) have also shown them to be important in detailed case studies of public engagement with science, the importance of which is argued clearly by Goldacre (2011).

If the ‘thinking behind the doing’ is a knowledge base of concepts to be understood (in effect, no different to other concepts with which we are familiar in ‘substantive’ science), it ought to be possible to represent that understanding with a concept map. Roberts and Johnson (2015) have recently presented the interrelationships between some of the key constituent ideas required for decision-making as a concept map (Fig. 3) and have shown how evidence is inherently related to the more traditional substantive knowledge and theories of science (Footnote 3). This network of interconnected ideas about the quality of data, originally mapped as a conceptual basis for school science curriculum purposes, is, I suggest, also a tentative articulation of the net in Fig. 1 underpinning scientific research expertise.

Fig. 3 A concept map with the focus question ‘What is the “thinking behind the doing” for determining the validity of data?’ (from Roberts and Johnson 2015:348). [Concepts directly informed by substantive knowledge are highlighted with a shadow on the box.]

Through illustration with a range of scientific practices from different disciplines, Roberts and Johnson (2015) have argued that all require interrelated and nuanced decisions using these ideas, regardless of the context of the work or the approach adopted, decisions that are essentially all directed towards establishing the validity of data and its subsequent weight as evidence for a claim. Readers are referred to the original paper for a full explanation, in which contexts ranging from tightly controlled laboratory-based science to field trials, RCTs and ecological surveys are used to illustrate how the understanding of the validity of data, represented in Fig. 3, informs the decisions made in such approaches.

The emphasis of our work is on such an understanding. I will expand on this briefly next and consider its relationship with the observed chains of practice. The curriculum implications of the map will then be explored.

A concept map of evidence

Any specification necessarily requires the use of a language. Different approaches to research have, historically, developed specific terminologies, and the ‘territorial’ nature of research (Jones and Kinchin 2009), with its specialised vocabularies, is a challenge to the articulation of the concepts of evidence which, we believe, apply across disciplines. Green et al. (2014) refer to the notion of ‘signification’ (the relationship between a word and its meaning) and suggest that work on signification ‘may be especially useful when examining interdisciplinary fields’. In our original specification (Gott et al. n.d.) and in the explanation of the concept map (Roberts and Johnson 2015), we have illustrated the concepts—the signified—with examples from different fields of science to try to reduce any ambiguity associated with the terminology (the signifier). Our use of terms is intended to be ‘neutral’, with weak ‘semantic gravity’ (Maton 2009), and does not privilege any one method or approach.

The vocabulary associated with measurement, on the right-hand side of the map, is a case in point. Categoric (qualitative) or continuous (quantitative) values all require ‘measurement’ (although this is often referred to as ‘observation’; see, for instance, Gray 2014). All measurements require an ‘instrument’, a term usually associated with the measurement of continuous variables. For categoric variables, where measurement entails the recognition of the defining features of the variable—identifying specimens of a species, for instance—the substantively informed discernment of the observer acts like an instrument. The reliability of any measurement depends on its degree of uncertainty: vocabulary familiar to users of conventional instruments but taken, in the map, also to include uncertainties in the identification and classification of categoric variables.
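As a minimal illustration of this vocabulary (a sketch with invented readings, not drawn from the article or its sources), the uncertainty of a continuous measurement can be estimated from the spread of repeated readings, while for a categoric variable an analogous check is how often independently repeated identifications disagree:

```python
# A minimal, invented illustration of measurement uncertainty.
import math
import statistics

# Continuous variable: repeated readings of the same quantity (hypothetical values).
readings = [24.1, 24.4, 23.9, 24.2, 24.3]
mean = statistics.mean(readings)
spread = statistics.stdev(readings)              # scatter of individual readings
uncertainty = spread / math.sqrt(len(readings))  # standard uncertainty of the mean
print(f"best estimate = {mean:.2f}, spread = {spread:.2f}, uncertainty of mean = {uncertainty:.2f}")

# Categoric variable: the observer's identification acts like an instrument, so an
# analogous check is how often identifications disagree when independently repeated.
first_pass  = ["species A", "species A", "species B", "species A"]
second_pass = ["species A", "species A", "species A", "species A"]
disagreement = sum(a != b for a, b in zip(first_pass, second_pass)) / len(first_pass)
print(f"identification disagreement rate = {disagreement:.2f}")
```

The point of the sketch is only that both kinds of value carry a quantifiable degree of uncertainty that feeds into judgements about reliability.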

Figure 3 indicates (by means of a shadow on the box) those concepts that are directly informed by substantive knowledge. The map emphasises the intimate integration of substantive knowledge with scientific practice. Neither stands alone; each is only as good as the other. The production of data is conceived within, is guided by, and uses instruments that depend on, existing substantive knowledge. The soundness of substantive knowledge depends on the quality of the originating data as evidence. They are inextricably bound (which has curriculum implications to which we will return later), yet in this article the focus is on the understanding of the quality of data.

Our map focuses on the ideas involved in carrying out a ‘whole’ scientific investigation (Footnote 4), from initial observation to the resultant claim and its position in the broader substantive theory. The map centralises the question of the validity of data, since confidence in the validity of the data in any research practice gives them weight as evidence for a claim. It is this that all investigators are striving for, regardless of what they are researching or what conventions they are adopting, and it is at the forefront of expert researchers’ thinking whether they are working in a laboratory or in the field, doing ‘classical experiments’ or ‘observational study’ (Gray 2014). The many different approaches to research, and the resultant validity of the data and strength of a claim (Hodson and Wong 2014), can be viewed as consequences of differences in the nature of the two interrelated sides of the map: in the variables involved in any investigation and in their measurement (as discussed above). Across scientific disciplines and in different experimental contexts, the degree to which variables can be isolated and their values manipulated, and the amount of variation in a defined variable (all of which need measuring), influence how relationships are sought and the strength of any resultant claims.
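The point about the strength of claims can be made concrete with a small, purely hypothetical simulation (an invented data-generating process, not from the article): where the variable of interest cannot be isolated from a confounding variable, an apparent relationship may appear in survey-style data that disappears when the variable is manipulated directly.

```python
# Purely hypothetical simulation: the same apparent relationship carries different
# weight as evidence depending on whether the variable of interest was isolated
# and manipulated, or merely observed alongside an uncontrolled confounding variable.
import random
import statistics

random.seed(1)

def outcome(treatment, confounder):
    """Invented data-generating process: only the confounder affects the outcome."""
    return 2.0 * confounder + random.gauss(0, 0.5)

def mean_difference(samples):
    """Mean outcome of the 'treated' cases minus mean outcome of the 'untreated'."""
    treated = [y for t, y in samples if t == 1]
    untreated = [y for t, y in samples if t == 0]
    return statistics.mean(treated) - statistics.mean(untreated)

# Survey-style observation: the confounder also drives who is 'treated',
# so the groups differ even though the treatment itself does nothing.
observed = []
for _ in range(1000):
    c = random.random()
    t = 1 if c > 0.5 else 0
    observed.append((t, outcome(t, c)))

# Manipulation (RCT-like): the value of the variable of interest is assigned at
# random, isolating it from the confounding variable.
manipulated = []
for _ in range(1000):
    c = random.random()
    t = random.randint(0, 1)
    manipulated.append((t, outcome(t, c)))

print(f"apparent effect in survey-style data:          {mean_difference(observed):+.2f}")
print(f"apparent effect when the variable is assigned: {mean_difference(manipulated):+.2f}")
```

Running the sketch shows a substantial apparent effect in the observational case and an effect close to zero when the variable is randomly assigned, which is one reason the same pattern of data carries different weight as evidence under different approaches.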

Fundamentally, the map draws attention to the interplay between the main concepts that have to be considered, and on which decisions are made, regardless of the nature of the variables or the approach taken. The nuanced decisions involve trade-offs and contingencies according to the circumstances of the situation. This is explained in Roberts and Johnson (2015) and is exemplified in the technically and conceptually very simple context of Fig. 4. Decisions are made during the investigation, ‘looking forward’ to ensure the quality of the data and ‘looking back’ on data as they are collected and as the basis for further decisions (Gott and Duggan 2003). How this is achieved will differ according to the context of the research and the practical circumstances, but the decisions made will be with respect to this aim. The annotations in Fig. 4 point to the nuanced application of the ideas in the concept map (Fig. 3) as juxtapositions and contingencies are considered according to the context. Other annotated ‘worked examples’ can be found in Gott et al. (n.d.). The substantive simplicity of Fig. 4 is deliberate, as a way of highlighting the thinking that requires the understanding represented on the concept map. The same ideas are there in the more substantively challenging contexts in HE: the shadowed concepts in Fig. 3 are all substantively informed. (An increase in substantive complexity increases the conceptual demand of the investigation. In Fig. 4, the context was simplified by not requiring thought about ideas from the left of the concept map, about variation in the object or thing investigated, since only a single bottle was investigated. The conceptual demand also increases in research contexts where understanding of other areas of the map is drawn on to produce valid data: such as when the repeatability of the DV is low; where the magnitude of the change in values of the DV is small for changes in the value of the IV; where measurement of the DV is less straightforward; and where confounding variables cannot be manipulated but require matching (Johnson and Roberts, in press).)

Fig. 4 Annotated extracts from a school student’s account written to make explicit their ‘thinking behind the doing’ (Gott et al. 1999:71–74). [The substantive demand in this example is very low, as is the practical context, so that the concepts of evidence are the focus of this illustration.]
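To suggest how the ‘looking back’ judgement might be made explicit, the following hypothetical sketch (invented readings and an arbitrary threshold, not taken from Fig. 4 or the authors’ materials) compares the size of the pattern across the independent variable with the repeatability of the dependent variable:

```python
# Hypothetical sketch of a 'looking back' decision: is the pattern in the data,
# across the values of the independent variable (IV), large enough relative to
# the repeatability of the dependent variable (DV) to support a claim?
import statistics

# invented data: repeated DV readings at each value of the IV
data = {
    10: [3.1, 3.3, 2.9],
    20: [5.0, 5.4, 5.1],
    30: [7.2, 6.9, 7.0],
}

means = {iv: statistics.mean(dvs) for iv, dvs in data.items()}
spreads = [statistics.stdev(dvs) for dvs in data.values()]

effect = max(means.values()) - min(means.values())  # size of the pattern across the IV range
noise = max(spreads)                                 # worst-case repeatability of the DV

if effect > 3 * noise:  # an arbitrary illustrative threshold, not a rule from the article
    print("pattern looks strong enough, relative to the scatter, to support a claim")
else:
    print("pattern is weak relative to the scatter: more repeats, a wider IV range,")
    print("or better control of confounding variables may be needed")
```

Whether a pattern is ‘good enough’ is, of course, a nuanced judgement in real investigations; the sketch only illustrates the kind of comparison the concepts in Fig. 3 ask the investigator to make.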

That expert practice involves an understanding of the quality of data—even if ‘doing’ research may appear to involve procedural chains—perhaps becomes more obvious when we consider the understandings experts draw upon when evaluating research. In both informal and formal peer review, others’ ‘routes to a solution’ are judged. Were the decisions made—during every aspect of the research—‘good enough’ for the claims the authors made? Such evaluation is not based on chains of practice of ‘doing’ but requires the ‘thinking behind the doing’ represented by the concept map.

Chains from the net

Figure 2’s model of scientific research as involving trialling and iterative decision-making suggests that the appearance of expert practice as a linear chain (the left of Fig. 1) may not always fit with detailed observations. However, both models point to expert practice being underpinned by an understanding that enables decisions to be made. Figure 3 attempts to articulate this network as an understanding of the quality of data.

The concept map (Fig. 3) represents the conceptual basis of this—the ‘thinking behind the doing’—not a series of procedures, processes or methods. It is not a flow diagram of the sort that is often associated with descriptions of scientific practices nor does it represent any particular pattern of reasoning (Cleland 2002). The focus is on the concepts and emergent understanding we suggest are required to make decisions. The arrow directions in conjunction with the linking terms are there to represent the propositional relationships which give meaning to the concepts and do not imply a procedural or reasoning sequence. The map represents a network of linked ideas, and it is the ‘joined up thinking’ and mental ‘juggling’ of these linked ideas that demonstrates higher-level thinking about research practice which is manifest in the iterative practice of Fig. 2 (even if, as discussed earlier, it may have the superficial appearance of the chain in Fig. 1). The map may go some way to articulating the understandings common to the different disciplinary TCs discussed earlier.

In established chain-like practices, whether ‘routines’ or the ‘methods’ provided in traditional laboratory manuals, all the decisions about the design needed to establish valid data have already been made. The ‘thinking behind the doing’ went into creating the chain so that practice can be standardised and little further thinking is required. For instance, in illustrative practicals designed to demonstrate a substantive concept, in contexts where variables can be manipulated, matters can be contrived to give a very small variation in comparison with the effect of changing the independent variable, so that only a few (if any) repeated measurements of the dependent variable suffice (for example, as in Royal Society of Chemistry n.d.; Nuffield Foundation n.d.). In summary, Fig. 3 shows a map of the ‘thinking behind the doing’, whatever form the ‘doing’ might take. Its main purpose lies in its being a concept map, emphasising that scientific practice is about ideas that can be understood and specified—just like substantive knowledge—and that its teaching should follow accordingly. This, I propose, goes some way to distinguishing Kinchin et al.’s (2008) ‘underpinning expert knowledge’, and it is to the implications for a curriculum that aims to teach this that we now turn.

Implications

Shopkow (2010:324) reminds us that ‘until the practitioner is clear about the schemas he or she employs, it is difficult to help students construct them’. Viewing evidence as having a conceptual knowledge base to be understood, rather than viewing practice as involving skills or processes to be mastered, represents an ontological shift in the characterisation of scientific practice and has implications for the curriculum. Muller and Young (2014:136–137), in a paper critical of the potential loss of expert knowledge in ‘can do’ university curricular outcomes, capture this:

The …various kinds of ‘know how’ supplement and depend upon the ‘know that’ or conceptual knowledge. They do not replace it … Crucially, the ‘know how’ abilities are dependent on the conceptual knowledge of the domain concerned in all but a small number of mechanical skills and techniques. The ‘skills talk’ that most worries us is that form of discourse which pays lip service to the importance of knowledge but then goes on to concentrate almost entirely on the ‘know how’ requirements of the curriculum. This has the effect of shoehorning the ‘know that’ part of the curriculum into a ‘know how’ box, which obscures the curriculum requirements of the conceptual knowledge—its requirements for sequence, pace, progression and level of difficulty.

As an interviewee points out (Shopkow 2010:324):

‘[Students] don’t have a map. Of how to understand the different parts of the puzzle or the complexity you want them to understand and they have this understanding that this should be a straight forward type of story and they don’t have a map to navigate themselves around through this sea of information’.

Kinchin et al. (2010:85) advocate concept maps as a way of granting students ‘epistemological access’ to the discipline. I propose that a map like that of Fig. 3 would go some way to addressing this and would also help students better understand how the net connects to the chains of practice with which they are familiar from teaching (Kinchin and Cabot 2010); it may also be relevant to other disciplines in which research gives weight to data as evidence. The map can act as a pedagogic and curriculum planning tool for the lecturer, as well as being of value in a metalearning activity for students (Meyer et al. 2015).

Since there has been little research from this cognitive perspective, we have little data on how students take to these ideas or how a curriculum might best be structured. In our experience, undergraduates have, prior to teaching, some knowledge of these ideas, at least in isolation (Glaesser et al. 2009a), but they fail to ‘join them up’ to develop an understanding that can be readily applied (Roberts et al. 2010). If explicitly taught the understanding in the map, however, they are more critical (Roberts and Gott 2010) and are better able to investigate (Roberts et al. 2010). The ideas and understanding in the map are not ‘rocket science’, but they are concepts that are missing from traditional science curricula.

We have found that there is a need for activities aimed at explicitly developing students’ understanding of the quality of data, i.e. the ideas in the map contributing to the validity of data. This has involved students carrying out their own investigations, enacting Fig. 2. We start with simple contexts (similar to those described in Roberts and Johnson 2015 and exemplified in Gott et al. n.d.), chosen so that they do not make high demands on specialised substantive understanding and so that the outcome is not part of prescribed disciplinary content, better still if the outcome is genuinely unknown (an approach also advocated by, for instance, Sternberg 1999; Baillie et al. 2013). This allows the focus to be on getting good enough data to make a claim and develops an understanding of the interplay between the ideas on the map.

Students can feel very disconcerted by such work: it feels so different from the linear practice with ‘right answers’ to which they have become accustomed (Roberts et al. 2010; Roberts, in press). An understanding of the interaction amongst the ideas in the map is demonstrated when students carry out trials and work iteratively in response to the data, making nuanced decisions based on the understanding in the map as they work; these are not features common to the usually chain-like illustrative practicals.
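The iterative working described above can be caricatured in a short sketch (hypothetical throughout: the measurement function, the number of iterations and the stopping criterion are all invented for illustration, not taken from the authors’ materials):

```python
# A hypothetical sketch of the iterative approach of Fig. 2: trial readings are
# taken, the data are 'looked back' on, and decisions about further repeats are
# made until the pattern is judged good enough. Everything here is invented.
import random
import statistics

def take_reading(iv_value):
    """Stand-in for a real measurement: a linear trend plus random scatter."""
    return 0.5 * iv_value + random.gauss(0, 0.4)

data = {iv: [take_reading(iv) for _ in range(2)] for iv in (10, 20, 30)}  # initial trial

for _ in range(5):  # a few rounds of review-and-refine
    spreads = [statistics.stdev(readings) for readings in data.values()]
    means = [statistics.mean(readings) for readings in data.values()]
    effect = max(means) - min(means)
    if effect > 3 * max(spreads):  # illustrative 'good enough' criterion
        print("pattern judged good enough; the settled procedure can be written up as the 'method'")
        break
    for iv in data:  # otherwise, decide to collect more repeats at each IV value
        data[iv].append(take_reading(iv))
else:
    print("still not good enough: redesign (wider IV range, better instrument, more control)")
```

Only when the iteration stops does something resembling a fixed ‘method’ exist to be written up, echoing the earlier point about Fig. 2.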

Since the focus is on learning ideas and developing understanding, non-practical teaching activities are also appropriate. In our experience, explicit teaching of ideas from distinct sections of the map, with ample opportunities for students to discuss the effects of potential decisions in relation to real data and the quality of claims (their own or others’; see, for example, Bennett 2014), is valuable (Roberts, in press).

Once students have developed this basic understanding in different contexts, they will, arguably, be in a better position to understand the diverse practices and conventions employed across the sciences (Hodson and Wong 2014). Science research is intimately bound up with disciplinary knowledge, and both an understanding of this and an understanding of the quality of data are necessary conditions for success (Glaesser et al. 2009b). Students, of course, need to understand the conventions associated with research in their discipline, but an understanding of the ‘thinking behind the doing’ will, arguably, demystify some of the conventions they find problematic (Taylor and Meyer 2010).

Conclusions

Entwistle (2009:88) reminds us that it is important to determine the ‘inner logic of the subject and its pedagogy’, since how lecturers ‘understand their subject affects how they explain it to their students’. The broad concern of this article is with the network of concepts of evidence that, our work shows, underpins a deep understanding, vital to expertise, of the quality of data in scientific disciplines where research gives weight to data as evidence. Conceptualising evidence as having a knowledge base represents an ontological shift compared with describing practice as processes, and it has implications for the curriculum: there are ideas underpinning research expertise that can be specified and taught.

The map shows scientific research expertise as drawing on both the specialist disciplinary knowledge base and an understanding of the quality of data, with the concepts of evidence in turn framed by that substantive knowledge. As such, the map can be seen as extending the disciplinary knowledge base of scientific disciplines. Further research, both within and across disciplines, is needed to better understand how the specification and teaching of these ideas can best enhance students’ understanding of research expertise.