Introduction

Table 1 provides an overview of laboratory report nomenclature in three countries:

Table 1 Lab report nomenclature generalized

Reading past the surface variation,Footnote 1 we see that each row in Table 1 has essentially the same flavor all the way across, as abstracted to and summarized in the rightmost column, called ‘Generic List’. The role of Table 1 is only prefatory: It is meant to suggest a way of justifying the generalization to four generic categories—Procedure, Observations, Data, Analysis—in Fig. 1 below. (I.e., the four headings in Fig. 1 correspond to rows 5–8 of the table.) Our discussion proper begins with Fig. 1.

Fig. 1

Generic lab report schema with ‘givens’ brought to the surface. g = a given. (Regarding the distribution of these givens, see discussion in the ensuing text.) Solid arrow An explicit given is copied from Procedure to Data and thence to Analysis (or solely from Data to Analysis, etc.). E.g., 22°C as room temperature. (For examples of various givens in context, see text following Fig. 2.) Dotted arrow An implicit given in the Procedure section becomes an explicit given in the Analysis section. E.g., ‘consult a table of water vapor pressures’ or ‘consult the barometer on the wall’. Curved arrow An ad hoc given appears in the Analysis section. E.g., a value such as ‘M_Sn = 118.7 g/mol’ might be cited ad hoc by the student herself. XXX and YYY text in the Procedure and Observations sections, only a kind of ‘wallpaper’ for our purposes here. DDD and AAA the data, as recorded in the Data section and processed in the Analysis section

Note that the data, represented by the D’s and A’s in Fig. 1, are in effect contaminated by something foreign: the random sprinkling of g’s (representing givens). Strange to say, this silent commingling of givens with data is the norm for chemistry lab reports, not the exception.Footnote 2 We never know where exactly the givens will occur, only that they are sure to appear somewhere, as a free-wheeling ‘fifth flavor’ in a scheme that ostensibly has only four flavors: Procedure, Observations, Data, and Analysis.

Why be concerned about discriminating this fifth flavor and elevating it to such a high profile? First, because it is illogical: the fifth flavor clashes internally with chemistry’s own four-flavor scheme as defined above (and, if viewed from the vantage point of computer science with its crisp dichotomy of inputs versus outputs, such vagueness in a hard science is startling, not to say jarring). Second, and far more importantly, because it is these givens that determine the student’s ‘freedoms’ in the laboratory, one being the flip side of the other. And while the givens are introduced somewhat unpredictably, at least they can—with an effort—be rounded up as a coherent new category, whereas the ‘freedoms’ (as we’ll call them for the moment) are not articulated at all. In short, we need the givens to find the ‘freedoms’. (And while some of the freedoms alluded to here might once have been regarded as a kind of luxury or academic byway, given the Climate of Fear described in “Cautionary notes” below, all of them now appear in a new light, with implications for the future of chemistry.)

Assembled in the following list are various kinds of given, ranked from ‘soft’ to ‘hard’:

  • 22°C as room temperature

  • 17.5 mmHg as the vapor pressure of water at 20°C

  • ‘63 u’ as the relative atomic mass for copper, published by Cannizzaro in 1858; compare the bullet for ‘63.54 g’ below; see also the discussion following Fig. 3

  • an algorithm for determining N_A via electrolysis

  • 6.022 × 10²³ as the modern value for N_A

  • ‘63.54 g’ as the molar mass of copper, from the modern periodic table (which shows a weighted average for copper-63 and copper-65, with masses of 62.929 and 64.927 g/mol and abundances of 69.17 and 30.83%); compare ‘63 u’ above

  • 1.602 × 10⁻¹⁹ C as the elementary charge, q_e

  • The Dulong–Petit Law: C × m ≈ 3R for common metalsFootnote 3

  • The First Law: Conservation of Energy

Thus, at one extreme, something might be treated as a given simply for convenience. In these cases, the exact value is less important than the act of choosing a value and sticking with it, e.g., 20°C for calibrating/using glassware (printed on the item itself), or 22°C as ‘room temperature’, by tribal knowledge. (Sometimes one suspects that the motivation for brevity is both student convenience and convenience for the author or instructor, as when the notion of a freezing point depression constant, K_f, is presented as a simple fact of life, with scant explanation of its subtle mechanism. As examples, here are four textbook references, arranged from least helpful to most helpful in their coverage of freezing point depression: Ebbing (1993), p. 501; the word ‘Therefore’ is tossed in, as pseudo-explanation. Kotz et al. 2003, p. 578; a picture is tossed in, as pseudo-explanation. Moore et al. 2005, p. 743; an explanation is provided, but the language is terse and technical. Zumdahl 1989, p. 495; a full explanation is provided, written in the appropriate style.)

At the other extreme, we have cases such as the First Law of thermodynamics, which is called a Law precisely because it is apparently beyond the reach of human analysis—a given to everyone, not just to students in a hurry.Footnote 4 In-between those two extremes fall the cases motivated by avoidance of something mildly inconvenient (‘Shall we have them build their own barometer?’ ‘No’); or very inconvenient (‘Shall we have them first weigh out an arbitrary quantity of vapor à la Dumas, as explained so nicely in Thompson 2008, p. 264?’ ‘No, we’ll provide them with 0.2 g of gas and let them take it from there’); or highly impractical (‘Shouldn’t we have them do a side-experiment to determine for themselves the molar volume of an ideal gas at STP?’ ‘No’); or forbidden (cooking mercury in a retort à la Lavoisier); or virtually unthinkable (retracing Millikan’s path to the discovery of the elementary charge, q_e, via several years’ worth of oil droplet experimentation).

While our viewpoint will be binary at the outset, it will not remain fixed (as a dichotomy): rather, the idea is to play with the line that separates the givens from the ‘freedoms’ (later called tangibles) and thereby explore different ways of experiencing an experiment, by shifting some of its constituent parts from one side of the line to the other (or by redefining an isolated part such that it may jump the line, as it were, this being a different means to the same end).

Redefining the laboratory experience in terms of givens and tangibles

To establish a point of reference, let’s look first at a procedure so rudimentary that it might be called the minimal chemistry experiment: measuring out 1 mole of H2O. I will present it in both an informal version (Fig. 2) and a formal version.

Fig. 2

The minimal ‘experiment’ in its kitchen chemistry form

It would appear that there are two kinds of activity in this experiment—mental and physical: On the mental side, one is asked to understand and believe that the water in the glass will consist of 6 × 10²³ molecules. On the physical side, one tips 4 teaspoons of tap water into a glass or graduated cylinder. There is nothing else to do. This binary notion of thinking versus doing can be seen in the formal version, too, although it is now somewhat obscured by the format:

PROCEDURE:

Givens

  • N_A ≈ 6 × 10²³ molecules/mol

  • The molar mass of hydrogen is 1 g/mol H

  • The molar mass of oxygen is 16 g/mol O

  • 1 g = 1 mL (reminder: true of H2O only)

  • 5 mL = 1 tsp (per dictionary definition)

Steps

  • Tip 4 teaspoons of tap water into a glass.

  • Determine the approximate number of water molecules in the glass.

DATA: None

OBSERVATIONS: The liquid is colorless, in contrast to large accumulations of water that might appear colored.

ANALYSIS:

The molar mass of H2O is (2 × M_H) + M_O = (2 × 1) + 16 = 18 g/mol H2O

(4 tsp H2O) × (5 mL/1 tsp) × (1 g/1 mL) = 20 g H2O

20 g H2O/(18 g/mol H2O) = 1.11 mol H2O, say 1 mol H2O

(1 mol H2O) × (6 × 10²³ molecules/1 mol) = 6 × 10²³ molecules H2O
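The chain of unit conversions above can be replayed mechanically. Here is a minimal sketch in Python, using only the givens listed in the Procedure; the variable names are my own:

```python
# Replaying the Analysis section: from teaspoons of water to molecules.
# All constants below are the givens from the Procedure section.
TSP_TO_ML = 5.0          # 5 mL = 1 tsp (per dictionary definition)
G_PER_ML = 1.0           # 1 g = 1 mL (true of H2O only)
M_H2O = 2 * 1.0 + 16.0   # g/mol, from the rounded molar masses given
N_A = 6e23               # molecules/mol (given)

grams = 4 * TSP_TO_ML * G_PER_ML   # 20 g of water
moles = grams / M_H2O              # ~1.11 mol, 'say 1 mol'
molecules = moles * N_A            # ~6.7e23 molecules before rounding
```

Note that carrying the unrounded 1.11 mol through gives about 6.7 × 10²³ molecules; the 6 × 10²³ quoted in the Analysis comes from rounding to ‘say 1 mol’ first.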

Based on the above, it might seem reasonable to propose that all activities in the laboratory be labeled according to a mental/physical dichotomy, with a rigid immovable barrier between the two realms: either you are focusing your mind on grams, milliliters, and the like; or, you are doing something physical to a substance (‘Tip 4 teaspoons of tap water…’).

But on reflection we see that two or more of the presumed ‘mental’ items could be shifted easily enough into the ‘physical’ realm, so long as one were not in a hurry to finish. For instance, the student could physically verify that a gram of water really does occupy a volume of 1 mL and that 5 mL equate to 1 teaspoon (as promised by the dictionary). Also s/he could consult a periodic table to verify the two molar masses given, thus discovering that our hypothetical author has rounded 1 from 1.007 and 16 from 15.999 g/mol. Finally, one might even consider determining N_A from scratch. At this point, we would have converted all of the activities originally deemed purely ‘mental’ into something else. But the ‘something else’ does not fit comfortably under the label ‘physical’. The proposed dichotomy is already starting to fail us.

Next we might try ‘passive/active’ instead of ‘mental/physical’. Looking at the informal version (Fig. 2), where the reader is simply told to write down the chain of logic that takes us from teaspoons to molecules, the term ‘passive’ works well enough (in a kitchen chemistry context, let’s say). But in the formal version, the Analysis section is surely not passive. And returning to the question of ‘scattered givens’ for a moment, one can imagine an instructor who considered these five givens trivial, in which case they would not appear under Procedure but under Data or under Analysis, having been supplied on the fly by the student herself/himself from class notes or from the textbook. In this scenario especially we see that ‘passive/active’ fails just as badly as ‘mental/physical’ to describe the dynamics of the situation.

Still, as we enter the lab, some such dichotomy is sensed by all of us, depending on the particular context: mental/physical; cerebral/concrete; under glass/hands-on; black box/white box; passive/active; thinking/workingFootnote 5; the intellect/the handFootnote 6; the logic of research/the psychology of researchFootnote 7; and so on. And eventually one feels the need to adopt a single pair of terms that will embrace (or at least suggest) the whole broad range. The pair of terms that I advocate for this purpose is givens/tangibles. To accommodate the kinds of overlap and dissonance mentioned above, however, these terms will need extended definitions to supplement their intuitive meanings, as follows:

Givens defined

As expected, givens will refer to certain constants or conversion factors that might be provided by the author of the Procedure section (or by the student, in Data or Analysis, as discussed earlier). As used here, the term will also cover any kind of hardware/software black box, whether explicit or implicit, that is part of the laboratory setup.Footnote 8 By this definition, our givens would encompass a barometer mounted on the wall, a thermometer, or a spectrophotometer—these being examples of explicit black boxes. This definition would also encompass the periodic table, as an example of an implicit software black box: Any reference in a Procedure to the molar mass of a certain substance is implicitly a pointer to the periodic table, whose contents are a kind of ‘software’. We tend to regard the periodic table as untouchable—indeed, too much trouble to touch given the tedious arithmetic that stands behind a single molar mass such as ‘118.71 g’, the average for the ten nonradioactive isotopes of tin, weighted for percentage abundance (Serway and Jewett 2004, p. A-8). But one can imagine a perspective whereby it becomes worthwhile opening up this particular black box for inspection nonetheless. (See discussion following Fig. 3.) In short, everything will be grist for the mill, save the First Law.
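What ‘opening up’ such a black box might involve can be made concrete with the copper figures quoted earlier (62.929 and 64.927 g/mol, at 69.17 and 30.83% abundance). The following Python sketch, offered only as an illustration, recovers the familiar periodic-table entry from the isotope data:

```python
# Re-deriving a periodic-table entry from isotope data.
# Copper figures as cited earlier: (mass in g/mol, abundance in percent).
isotopes = [(62.929, 69.17), (64.927, 30.83)]

# Abundance-weighted average, i.e., the arithmetic 'behind' the table entry.
molar_mass = sum(m * pct / 100.0 for m, pct in isotopes)
# molar_mass comes out near 63.55, the familiar '63.54 g' entry
```

The same few lines, fed the ten stable isotopes of tin, would reproduce the ‘118.71 g’ mentioned above; the arithmetic is tedious only when done by hand.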

Fig. 3

Exercising the line between givens and tangibles. a The minimal experiment (here we represent the formal version that follows Fig. 2). b Determination of N_A by electrolysis (a ‘zoom’ in on a). Vertical lines In each panel, ‘the line’ represents an initial division into two seemingly rigid categories called givens and tangibles. A dashed vertical line shows an alternative location for ‘the line’, suggesting that the constituency of the two categories is flexible, contrary to how we tend to think of them at first. Arrows A right-to-left arrow ending at a dashed vertical line redefines the constituency of the givens and the tangibles en masse. In theory, we might even want to push ‘the line’ all the way to the far left for a certain experiment, assuming one had the luxury of looking inside all the givens. A left-to-right arrow indicates a reclassification of an isolated item by ‘jumping over’ the existing vertical line, this being a different means to the same end. Axes The x-axis and y-axis both denote the same thing in these graphs: more freedom in the plus direction, less freedom in the minus direction. Hence, these graphs are in essence one-dimensional. The data happen to be mapped to a diagonal line in a two-dimensional area, but that is only for cosmetic reasons, to avoid piling the data vertically in a plain laundry list. How are a and b related? One picks up where the other leaves off, in the following sense: If we were to push ‘the line’ to the extreme left in a, we would have reclassified N_A as a tangible, a value to be re-‘discovered’ or verified. How could that be done? As it happens, that is the object of the experiment represented by b: the re-‘discovery’ of N_A. Thus, all of b may be read as a ‘zoom’ on the far left corner of a (shaded triangle)

Tangibles defined

As might be expected, tangible will refer to those parts of the lab experience that are sensory, not cognitive. These include audio, visual, tactile, olfactory and kinesthetic clues about substances.Footnote 9 This is the ‘getting our hands dirty’ part. But just as important, in the extended usage adopted here, the term tangible will also be the catchall for classifying any parts of the experimental setup or procedure that are not among the givens or black boxes. This includes parts over which the student has discretionary control (e.g., a range of durations might be permitted for a certain reaction), and likewise the parts where a student has certain freedoms, some of which may be exercised quite unconsciously, as when there is an explosion and broken glassware. (To the student, this was ‘an accident’ but to the laboratory instructor this was part of a conscious decision long ago—either [a] to allow for the possibility of certain kinds of ‘bad events’ as learning opportunities or [b] to write them out of the script.) Thus, a rough synonym for tangibles in this context is freedoms. E.g., the freedom to confirm that 1 mL of water really does equate to one gram of the liquid. Or the ‘freedom’ to stumble by mixing ingredients slowly-but-not-slowly-enough, such that a rubber stopper flies off explosively from the neck of its flask. Rigid though these categories sound at first, we find that the line between givens and tangibles is actually quite movableFootnote 10; and this is why we took some care in defining the two categories in the first place—so that we would be able to show how the line between them can be moved or leapt over. This business of ‘moving the line’ will be discussed in due course, in connection with Fig. 3, but first, a few more words about the role of imagination in chemistry.

Someone perusing the minimal experiment as presented in Fig. 2 could easily feel shortchanged by ‘too many givens, too few tangibles; too much that is passive, not enough that is active,’ and develop a yearning to ‘do a real experiment soon, where I can get my hands dirty, wrestle with some raw data, gather up some numbers to crunch.’ Thus, a chemistry procedure can fail (‘seem like nothing’ to some students) for one of two opposite reasons: either from too much stimulus, which may overwhelm the imagination, leaving it no time to work (an idea to be developed later, in connection with Fig. 4), or from too little stimulus, which may cause the imagination to simply stall out for lack of interest, as suggested by the paucity of the setup in Fig. 2.

Fig. 4

‘Too many’ tangibles in Determination of the Molar Volume of Nitrogen? *Apply manual pressure to stopper while swinging Flask A by its neck to ‘slowly and thoroughly’ rinse interior of vial with sulfamic acid solution. But what is slowly enough to prevent an explosion? I.e., the pressure inside Flask A might easily overwhelm even the full force of one’s thumb on the stopper. Then, if the stopper flies out, one must start over from scratch

Using the givens/tangibles paradigm

Above we described a situation where the student might rebel blindly against ‘too many givens’, without having a clear grasp of the situation, only a sense of frustration. By contrast, when someone has available the givens/tangibles paradigm as a tool, she can push on ‘the line’ with awareness of the broader landscape, as illustrated in Fig. 3. Moreover, the line might, in principle, be moved all the way to the left, as a declaration that one aspires to do everything from scratch in a certain experiment—so long as the prospect is not too outrageous, implying replication of Millikan’s oil droplet results for instance. We say ‘in principle’ not only to acknowledge the impracticality of certain paths but also to acknowledge that a freshman student is likely to possess zero patience/time/resources for any such detour from the assigned experiment. All energy tends to be focused on passing swiftly through general chemistry as a gateway subject, the portal to various glamor fields that promise wealth and/or glory. Thus, the ‘student’ we refer to above is admittedly somewhat rare or theoretical, one who might return to general chemistry to savor it, from a philosophy-of-science perspective and/or the aesthetic angle (in the manner of Root-Bernstein 2003, e.g.).

Conversely, one might move the line all the way to the right (see dashed vertical line in Fig. 4). In this scenario, the instructor says, in effect: ‘Many of our students find the logistics of this experiment tedious and distracting. Let’s make everything in it a given, which, in practical terms, means: Let’s engage a software developer to replace the whole shebang with an applet. That way our students will be better able to concentrate on the content of the experiment.’ (This counterexample is developed later, in “Arguments pro and contra computer simulation”.)

As suggested by one of the arrows in Fig. 3b, if we sense something amiss with it, then the algorithm itself becomes a candidate for reclassification from given to tangible. The algorithm for Determination of Avogadro’s Number by Electrolysis is a case in point: N_A = qM/(n m q_e), where ‘n’ is the number of electrons ionized per atom; ‘m’ is the mass of metal lost; ‘q_e’ is the elementary charge; ‘q’ is the charge passed, in coulombs; and ‘M’ is molar mass.Footnote 11 Thus, the algorithm employs something quintessentially molar (M) to estimate the number, N_A, that defines the mole itself. We’ve been exposed to this semi-legal, verging-on-circular logic so often that we scarcely notice it anymore. Let’s see how/if its acceptance can be rationalized: True, ‘63.54’ in the periodic table may refer to something other than molar mass; it may refer to copper’s relative mass. The trouble is that the chemistry education establishment short-circuits any consideration of this topic and says, in effect, ‘Oh, molar mass, relative mass, who cares? We’ll just write M and leave it ambiguous.’ That rationale would make the algorithm valid in a narrow (but ugly) legalistic sense, except that ‘M’ is immediately robbed of its potential for ambiguity when, taking his/her cue from the textbook, the student replaces ‘M’ by an expression such as ‘63.54 g/mol of Cu’ (thus breaking the symmetry in the wrong direction, so to speak). That’s the damning step which makes a mockery of the whole algorithm.
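For concreteness, here is the algorithm as a student might apply it, sketched in Python. Only the formula itself comes from the discussion above; the bench data (charge passed, mass of copper lost) are hypothetical values invented purely for illustration:

```python
# The electrolysis algorithm discussed above: N_A = q*M / (n * m * q_e).
# Input values below are HYPOTHETICAL bench data, for illustration only.
q = 900.0        # charge passed, coulombs (e.g., 0.5 A for 30 min)
M = 63.54        # 'M' for copper, g/mol -- the contested given
n = 2            # electrons ionized per Cu atom (Cu -> Cu2+)
m = 0.296        # mass of copper lost at the anode, g (hypothetical)
q_e = 1.602e-19  # elementary charge, C (a given)

N_A_estimate = q * M / (n * m * q_e)   # lands near 6.0e23 per mol
```

With plausible numbers the estimate falls close to the modern value, which is exactly why the circularity passes unnoticed: the answer ‘comes out right’ whether one reads M as molar mass or as relative mass.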

How do we free ourselves of the circular logic? The first step is to stop treating the algorithm as sacred, i.e., to be willing to move it to the right side of Fig. 3b and open it up for inspection, as it were. Next, recall that circa 1858 Stanislao Cannizzaro determined (by weighing copper chlorides and other substances in their vaporized or gaseous forms) that the relative atomic mass of copper is ‘63 u’. This in turn might suggest to us that a better term to use in the algorithm would be ‘RM’ for ‘relative mass’. So long as we begin with an honest statement such as ‘the RM for copper is 63 g’ (the Cannizzaro value), there is no harm if we later introduce its modern value, 62.929 g/RM_Cu, or, from the periodic table, the multi-isotope weighted average, 63.54 g/RM_Cu, for the sake of bringing both isotopes into the picture. The important thing is to begin on a solid footing, and only then introduce refinements (motivated by a desire to see one’s estimate of N_A move a notch closer to its modern value). As remarked earlier, everything is grist for the givens/tangibles mill: even the periodic table, whose numeric annotations are convenient and precise but conducive to conceptual vagueness or inaccuracy in this context, thus serving us ill.

Arguments pro and contra computer simulation

The ‘church of nitrogen’ or ‘my glassware catastrophe’?

Bauer (1990) describes the situation where scientific progress is defined in large part by more and more black-boxing which, somewhat paradoxically, threatens to dilute scientific literacy and to wash out scientific enthusiasm. Accordingly, he advocates going ‘backward’ sometimes, in the white-box direction, lest students find themselves cut off from the very forces that drove science in the first place; but this means resisting the students’ demand for the most advanced (i.e., the most extensively black-boxed) apparatus which they feel they deserve in exchange for high tuition fees.

As implied by Fig. 3 above, my own outlook is essentially the same as Bauer’s, but here I’d like to play devil’s advocate for a moment and propose a case where one might be inclined to push the line in the opposite direction, toward maximal black-boxing. Let’s say the students are asked to react an excess of sulfamic acid (as the zwitterion H3N+·SO3−) with a gram of sodium nitrite (NaNO2) to find the molar volume of nitrogen (N2), the aim being to obtain a rough confirmation that the molar volume of an ideal gas is 22.4 L at Standard Temperature and Pressure (STP).
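Before looking at the pitfalls, it is worth estimating the scale of the experiment. Assuming, as this procedure intends, one mole of N2 evolved per mole of NaNO2 consumed (the nitrite is the limiting reagent), a back-of-envelope sketch in Python with rounded molar masses gives the expected gas volume:

```python
# Expected N2 volume from 1 g of NaNO2, assuming 1 mol N2 per mol NaNO2
# (sulfamic acid in excess, so NaNO2 is the limiting reagent).
M_NANO2 = 22.99 + 14.01 + 2 * 16.00   # g/mol for NaNO2, ~69.0
V_MOLAR_STP = 22.414                  # L/mol, the value to be 'confirmed'

mol_n2 = 1.0 / M_NANO2                # ~0.0145 mol of N2
volume_l = mol_n2 * V_MOLAR_STP       # ~0.32 L, i.e., roughly 325 mL of gas
```

So a single gram of nitrite should displace about a third of a liter of water—enough to watch, and enough to make a flying stopper a real possibility.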

Figure 4 provides an overview of the givens/tangibles for this experiment, Determination of the Molar Volume of Nitrogen, as typically presented to the first-year student.

As suggested by the figure, there are enough pitfalls in the setup of the apparatus that individual students could report a wide variety of experiences with this experiment, falling somewhere between the following two hypothetical cases:

At the beginning we had to measure out two heaps of white powder, said to be ‘volatile should they touch then combine inadvertently with water’. That made us nervous. Also, during the setup of the apparatus we were instructed to blow air through Tube A to induce siphoning of the water in Tube B, and I thought that was unsanitary, possibly toxic. Later, I swirled my acid solution just a bit too well and the stopper flew out from under my thumb. They sent me back to the balance room to start over. (But I was lucky compared to my neighbor who, when my flask exploded, dropped his flask and cracked the vial inside, so then he had to learn about fees for broken glassware as well.) Finally, some invisible gas pushed water through Tube B into an empty beaker. So this was not only the most nerve-wracking of all our first-year experiments but also the most anticlimactic, to boot. Also, I was so flustered by the setup and by the time-sensitive procedure that I failed to obtain good data, so I just made up a reasonable looking result, 21.6 L/mol, and reverse engineered all the rest to match it. That part was fun, at least: putting one over on the instructor.

Student X (hypothetical)

This was easily our most memorable experiment, starting with the enchanted adjective ‘sulfamic’ and ending with liquid rising magically in a beaker as proxy for the nitrogen evolving in the flask. The number-crunching phase of the work I saved, like a dessert, for the following Sunday, excited to see if my result would be close to the modern value, 22.4 L/mol. (It was.) To fix it in my memory, I nicknamed this unusual experiment the Church of Nitrogen.

Student Y (hypothetical)

In short, it’s a case where one might argue that too much is happening already, leaving one, atypically, with no inclination to shift the line to the left. To the contrary, one might advocate moving the line all the way to the right, which is to say: replacing the whole thing by a computer simulation so the students could relax and enjoy the show, as it were.

Cautionary notes

Suppose the molar volume experiment is captured in a well-crafted applet (computer simulation). Are we all happy? I’ve not yet seen an applet of this particular experiment, but there is no doubt that many simulations are quite enchanting and edifying, not to say hypnotic in their appeal. However, I would offer a few cautionary notes.

(a) Climate of fear

During much of the twentieth century, the general attitude of the public toward chemistry was love-of-sanitation, in one compartment of the brain, accompanied by a vague chemophobia, rarely articulated, lurking in some other compartment. In the final decades of that century, the chemophobic note grew much louder, amped up by the Nanny State and its flip side, the Litigious Society (in the US at least); by raised awareness of food additives and toxic waste, and their impact on health and the environment, respectively; and by the war on drugs (with the result that elemental iodine has been reclassified as a List I substance in the US, and possession of an Erlenmeyer flask now requires a permit in Texas; see Thompson 2008, pp. 65, 14). Add to that the war on terrorism in this century, with its concern over homemade bombs and bioweapons, and we have, for the foreseeable future, a full-blown Climate of Fear. Thus, chemistry, the household god of sanitation, once harried only by the vaguest of chemophobias, is now in the sights of an all-out assault that seeks to sanitize it.

Given this context, suppose a teacher of basic chemistry enthuses over software simulations (for their own considerable merit). Parents would step forward to praise that teacher for finding a way around the vagaries of glassware, flames, acids, and noxious fumes. But this would be the worst of all reasons for doing computer simulations, i.e., the coincidental fact that they are sufficiently insipid to keep the parents calm and their lawyers at bay. Our hypothetical teacher and the parents who praised her/him would be speaking different languages.

While their motivations are very different, the Climate of Fear and the love of black-boxing for its own sake both lead ultimately to the same blank-eyed push-button dystopia; and the starting point for remedying either malady is the same: heightened awareness of the path we’re on. Thus, trying to find the silver lining, we might even say that the Climate of Fear helps us by raising our general awareness about too many givens (to paraphrase Bauer 1990) and encourages us to deal sooner rather than later with the urge to be always ‘more advanced’ by black-boxing to the nth degree.

(b) If the computer says so it must be true

From the earliest days of data processing, we have all exhibited an unfortunate tendency to ‘believe it if the computer says so’ (giving the machine credit for its infallible computational ability but forgetting that the underlying algorithm is 100 percent vulnerable to human foibles). A computer simulation of a chemical reaction is no exception in this regard, the danger being that while the algorithm might contain errors or misconceptions, its output as a dazzling applet animation would have the psychological effect of presenting Falsehood as the shining untouchable Truth under a coat of polyurethane. The one specific example that comes to mind is the fairy-tale of electrons ‘flowing through’ the wires of an electrolytic cell, implicitly near light speed (when in fact electrons move through wire at a literal snail’s pace called drift velocity, a term that is defined in every physics textbook but is swept under the carpet in chemistry textbooks). The concern is that in a computer simulation, this bit of electrochemical fantasy looks even more plausible and persuasive than in its ubiquitous cartoon form in present day textbooks.
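The ‘snail’s pace’ claim is easy to check from the standard physics-textbook formula v_d = I/(n·q_e·A). In the sketch below, the current and wire cross-section are illustrative choices of my own; the free-electron density of copper is the usual textbook value:

```python
# Drift velocity of electrons in a copper wire: v_d = I / (n * q_e * A).
# Current and cross-section are illustrative; n is the textbook value for Cu.
I = 1.0          # current, amperes
n = 8.5e28       # free electrons per m^3 in copper
q_e = 1.602e-19  # elementary charge, C
A = 1.0e-6       # cross-sectional area, m^2 (a 1 mm^2 wire)

v_drift = I / (n * q_e * A)   # on the order of 1e-4 m/s
```

Even at a full ampere, the electrons creep along at a small fraction of a millimeter per second; it is the field, not the electrons, that propagates near light speed—precisely the distinction the cartoon (and the applet) papers over.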

I mention this downside to computer simulation only in passing, for the sake of completeness. In the larger picture, I believe the relation of computer simulations to the Climate of Fear is the more serious concern. (On balance, when compared to mathematics or physics for example, chemistry education seems to me to have relatively few myths or neuroses with the potential for magnification and perpetuation by a computer simulation. Note that the sole example I thought worth citing in this section stands on the interface between chemistry and physics.)Footnote 12

(c) No free lunch

From the standpoint of chemistry-as-ontology, one hopes to see more individuals having experiences similar to that of Student Y (in the section “The ‘church of nitrogen’ or ‘my glassware catastrophe’?”), and fewer walking away as disgruntled and cynical as Student X. It seems that using a computer simulation might shift the balance in the desired direction, since it gives the student a chance to stand back and think about the aim of the experiment rather than getting caught up in its logistics. But there is a price to pay for choosing this path to a happier Student X: let’s say that ‘on average, the situation has improved’, but this means that for certain individuals (Student Y, for instance) it has been a step in the wrong direction, with the experience significantly degraded. In short, we win over a few disaffected students at the cost of losing some good ones—including one or two who might otherwise have found their way to becoming chemist-philosophers. Not a pretty picture.

The philosopher/artisan question

If we wish to speak of ontological attitudes among all working chemists as a group, a problem soon arises: Who will actually profess to holding an ontological attitude? Who might prefer to say, ‘I am an artisan with both feet on the ground,’ thus distancing oneself from all such metaphysical airiness?

This difficulty is illustrated in the dialogue between two of the contributors to Morris (2002): In concluding his story of the substance-structure paradigm shift, Schummer reiterates the ontological note on which his paper begins: “Spectroscopic instrumentation… is simply a tool [and as such, its role in the substance-to-structure shift] is going to challenge chemists to examine their ontological attitudes” (Schummer 2002, p. 207). Despite Schummer’s prudence in using the plural, Laszlo quotes him back in the singular, suggesting that we query “whether chemists… have an ‘ontological attitude’” (Laszlo 2002, p. 174; ital. added). This stacks the deck slightly in Laszlo’s favor, allowing him to oversimplify as follows: “To the contrary, chemists pride themselves as artisans [whose] observation and experimental skill take precedence over theory and metaphysics” (Laszlo 2002, p. 174; ital. added).

In Fig. 5a, we use labels ‘C’ and ‘D’ to represent the two sides of that debate, except that here our intention is to depict the two viewpoints as coexisting in a single individual, on a sliding scale. I.e., it is no longer an either/or question; rather, how much is this hypothetical person motivated by ‘C’ and how much by ‘D’? And similarly for labels ‘A’ and ‘B’, which represent two other viewpoints, likewise coexisting in a single individual, this time drawn from the student population.

Fig. 5
figure 5

Revisiting the question of ‘ontological attitudes’. a In this arrangement, students associate ontology with the tangibles side of the scale (province of the white-boxing ethos) and professionals associate ontology with the givens side of the scale (where high levels of abstraction allow big issues such as substance vs. structure to be worked out). Thus, there seems to be a paradox. But…. b By placing all four subjective notions in a single continuum running from ‘reactionary’ to ‘progressive’, and then by annotating A/B and C/D, we resolve the seeming paradox. White box Contemplative (chemistry as part of philosophy); Shaded box Practical (chemistry as an applied science). Tangibles/givens scale same meaning here as in previous graphics, but flipped left/right to bring it into alignment with the scale described next. Reactionary/progressive scale movement to the right represents more black-boxing and more ‘chunking’ (to borrow the language of Hofstadter 1979, pp. 285–288). From this perspective, movement to the left seems reactionary or anti-‘progress’. For other perspectives on this, see A/B below, and the ensuing text. A/B We imagine these two attitudes coexisting in a single individual. Having depicted them first in a 50/50 proportion above, we refine the picture now to allow any proportion, e.g. . For the person depicted here, A is the primary interest with B ranked a distant second. See text for further discussion of how the graph might be interpreted. C/D Again, the two attitudes are now imagined to occur in any proportion, e.g. . For the person depicted here, biochemistry is primarily about ontology. The idea of being an artisan is acknowledged, but given less weight

Near the bottom of Fig. 5, the graph illustrating ‘A/B’ could be fleshed out in a number of ways. For instance, it might represent a student whose interest in chemistry was founded on an atomocentric viewpoint. My arbitrary name for such a philosophy is ‘Fuzzy CH4 Ontology’, the implication being that for some of us the notion of ‘ontology’ itself makes more sense if worked out on a sliding scale as distinct from a binary scheme of yes/no states.

Here is part of the rationale: Suppose one makes a good-faith effort to encompass in one’s ontology all the quasi-molecular species (such as the H3O+ ions that not only occur in trace quantities in ‘pure water’ but are widely recognized as an emergent property of water itself, forcing us to revise our general notion of what a substance is). Given the resultant complexity, one will likely conclude that “the costs of a richer ontology are a fragmentation of chemical knowledge” (Schummer 1998, p. 16). By rough analogy, what I am saying here is: The costs of a richer ontology are a loss of chemical sense.

Accordingly, my atomocentric outlook takes the atom as a reference point (only), and gives consideration to larger entities, but in a limited way. From this viewpoint, the existence of molecules is acknowledged as ‘important’ but only to a degree. Up to what point? I don’t envision any particular cutoff point; rather, a sliding scale as follows: The larger and more complex something is (relative to the atom), the less real it is. Thus, ‘reality’ has no sharp border (as in an atomistic philosophy) but fades out gradually. (And since this approach bears a certain resemblance to fuzzy logic for computers, I use the term ‘Fuzzy’ in its name.)

Here are a few examples of how the idea would be applied:

  1.

     CBrClFI is real (if whimsical) but not as real as CH3Br, which is not as real as CH4, which is not as real as elemental carbon and hydrogen in isolation.

  2.

    HgO is almost as real as elemental Hg and O: Just apply heat (350°C) to form the compound. When? We can do it today or a billion years from today. Thus, we feel no need to specify a separate ontological status for mercury(II) oxide as such. Its existence is already implied by Hg and O. (As sketched out here at least, ours would be a simplistic philosophy in the sense that it shows no interest in trying to enumerate, say, the seven valence isomers of C6H6 as a subset of the 217 possible configurations, which in turn are a subset of an ‘n-factorial’ blizzard of theoretical possibilities, as alluded to in Del Re 1998, pp. 6–7.)

  3.

    Enter the human, glacially slow and yet—by the cruelest of paradoxes—sadly ephemeral too. And juxtaposed with that biped, a single atom of, say, sulfur or cadmium or lead, scintillating with graceful movement, serene in its prospect of eternity, and with intimations of an intelligence far greater than the human’s, lodged deep within its mechanism. Which is more real, then, the biped contraption puzzling over Turing tapes of its own viral blueprint, or this single vibrant atom of lead that actually seems to know something? To my way of thinking, there is no contest. I place my bet on the lead.
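As a playful aside, the sliding scale behind these examples can be given a toy numerical form. The function below is purely my own illustrative sketch, not anything proposed in the text: it assumes that a species’ ‘complexity relative to the atom’ can be approximated by counting its atoms and its distinct elements, and lets ‘reality’ fade exponentially from 1.0 (a lone atom) toward 0.0, with no sharp cutoff. The name `reality_score` and the decay formula are hypothetical.

```python
import math


def reality_score(formula: dict[str, int], fade_rate: float = 0.3) -> float:
    """Return a degree of 'reality' in [0, 1] for a chemical species.

    `formula` maps element symbols to atom counts, e.g. {"C": 1, "H": 4}
    for CH4. A lone atom scores exactly 1.0; larger and more varied
    species fade out gradually, echoing the 'Fuzzy CH4 Ontology' idea
    that reality has no sharp border.
    """
    atoms = sum(formula.values())
    if atoms < 1:
        raise ValueError("a species needs at least one atom")
    # Toy complexity measure: extra atoms beyond the first, plus
    # extra distinct elements beyond the first.
    complexity = (atoms - 1) + (len(formula) - 1)
    return math.exp(-fade_rate * complexity)


# Example 1 from the list, reproduced as an ordering:
# CBrClFI < CH3Br < CH4 < a lone carbon atom.
species = [
    {"C": 1, "Br": 1, "Cl": 1, "F": 1, "I": 1},  # CBrClFI
    {"C": 1, "H": 3, "Br": 1},                   # CH3Br
    {"C": 1, "H": 4},                            # CH4
    {"C": 1},                                    # elemental carbon
]
scores = [reality_score(s) for s in species]
assert scores == sorted(scores)  # reality increases toward the lone atom
```

Because the measure also counts distinct elements, it reproduces the ordering in example 1 even though CBrClFI, CH3Br, and CH4 all contain five atoms; and a diatomic such as HgO lands close to 1.0, matching example 2’s ‘almost as real’.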

Conclusion

We began by renaming a familiar dichotomy (of the intellect and the hand) only for the sake of playing with it, to see what it might mean to slide its boundary line all the way to the left to obtain all tangibles (in the section “Redefining the laboratory experience in terms of givens and tangibles”) or all the way to the right to obtain all givens (in the section “Arguments pro and contra computer simulation”).

Given the Climate of Fear (section “Cautionary notes”), we recognized the sliding scale as a tool for navigation and for building a new value system. Also, in “The philosopher/artisan question”, we found it useful for revisiting Schummer (2002) and Laszlo (2002) in a new light, moving from an all-or-nothing stance toward coexistence. Finally, on a less serious note, we tried to construct yet another kind of sliding scale, this time aimed at ontology itself.

Echoing Bauer (1990), our overall tone has been strongly in favor of more white-boxing, a kind of back-to-basics ethos, lest the spirit of science itself be lost amid all the convenient gadgetry, the issue being: ‘Are we happy to be a technological society? Hadn’t we meant rather to become a scientific society?’ To balance that viewpoint, I’ll conclude with an acknowledgment that in the laboratory, as in life, there is sometimes ‘no going home again’. I’m thinking of the account of a visit paid by Robert Crease and Charles Mann to the physicist Samuel Devons. The two writers arrive eager to try repeating, in Devons’ laboratory, Rutherford’s famous alpha particle experiment. Visibly struggling not to laugh, their host tries to let them down gently, as he ticks off the reasons why this experiment is quite literally unrepeatable today: above all, our ignorance of Rutherford’s experimental craft; also, the question of impatience with a weak source of radiation versus prohibitions against using a source strong enough not to try one’s patience; and so on. Laughing at themselves later, they offer us an analogy: “Could you kindly help me make a Stradivarius?” (Crease and Mann 1986, pp. 337–338) No doubt this cautionary tale about an iconic physics experiment would pertain equally to a dream of following literally in the footsteps of, say, Cannizzaro or Lavoisier.