1 Introduction

This chapter provides an overview of the quantification of uncertainty with a focus on mineral and energy resources and environmental applications, drawing on the work of the author and his co-authors over the past 30 years. Rarely in mining applications do initial estimates reconcile with production—there is almost always some reverse calibration or model revision to achieve an operationally acceptable agreement. This feedback approach can be a useful means of model calibration, but the production ‘reality’ is an outcome conditional on the model and data used to make the production decision and may be biased. The resort to post hoc empirical calibration is due partly to insufficient data and partly to inadequate accounting for all sources of uncertainty. This situation will worsen as mineral resources are increasingly extracted from deeper and/or lower-grade deposits, which will require new technologies and new types of indirect sampling. In applications such as hydrocarbon extraction, the feedback reconciliation approach is essential because the in-situ variables can never be directly observed; Caers (2011) gives a comprehensive account of uncertainty quantification for these types of application.

The focus here is on geological applications in which the purpose is to extract material, store material or monitor the flow of fluids or contaminants. In these applications, uncertainty arises from two sources of variability: the in-situ variability of the geology and associated quantitative variables and the variability that is generated by applying processes to the in-situ resource. The basic approach is to combine data with a model to make predictions. Such predictions are meaningless unless accompanied by quantitative measures of the uncertainty of the prediction.

The general focus, particularly in mining applications, has been on the uncertainty arising from sparse data and not on uncertainty arising from the model, even though the model is inferred, and its parameters are estimated, from the sparse data. Variability arising from processes applied to the in-situ resource is either quantified in an overly simplistic manner or is ignored. The additional aspect in these and most spatial applications is that variability (and, therefore, uncertainty) is scale-dependent and may be relevant on multiple scales depending on the application.

2 Sources of In-Situ Uncertainty

In the field of uncertainty and probabilistic risk analysis, two types of uncertainty are identified: aleatory and epistemic (or irreducible and reducible). In the generally accepted definitions (e.g., Bedford and Cooke 2001), aleatory uncertainty arises from the inherent variability of a phenomenon and cannot be reduced; epistemic uncertainty arises from incomplete knowledge of the phenomenon and can be reduced by more data, analysis or research. As both types of uncertainty are expressed in terms of probabilities, some authors question the necessity of distinguishing between them. Others (e.g., Hora 1996; Winkler 1996) prefer to speak of sources of uncertainty rather than types, noting that “the distinction between uncertainties is a matter of choice of scale and is, therefore, mutable”. In the geostatistics context, Matheron (1975, 1976, 1978) notes that the empirical basis of uncertainty is the same in both cases and that there is no objective criterion to distinguish them. Journel (1994) gives guidelines for modelling uncertainty, on which Srivastava (1994) provides critical comment. However, as Winkler (1996) noted, “uncertainty is uncertainty but the distinctions are related to very important practical aspects of modelling and obtaining information”. This is especially so in the applications given here.

A fundamental difference between geological applications and many others is that each occurrence (orebody, karst system) is unique and, apart from measurement error, once a physical sample is taken at a location and the required variable is measured directly from the sample, there is no longer any uncertainty about the value of the variable at that location. The general geostatistical model includes stationarity, which allows for repeated sampling of the same random variable at different locations. In principle (but not in practice), all locations in an orebody could be sampled and aleatory uncertainty would be eliminated. Thus, in these applications aleatory uncertainty is entirely a function of the amount and quality of data. Epistemic uncertainty arises from the assumed or inferred geological model (e.g., type, or style, of mineralisation). In mining applications, at least in terms of a general model, there may be significant epistemic uncertainty during early stages of proving a deposit when geological models are inferred from sparse data. Model uncertainty may persist in later stages in terms of the specific characteristics or parameters of the model.

In some natural resource applications, the variables that define the resource can never be directly observed. For example, in hot dry rock (HDR) enhanced geothermal systems, the variable of interest is the combination of natural and stimulated fractures that form connected networks to extract heat. These fractures, at depths of up to 4.5 km, can never be directly observed or measured; their locations, extents and characteristics can only be inferred from micro-seismic events generated by fracture movement, stimulation and propagation (e.g., Xu and Dowd 2014). In these applications, the detailed model can never be known irrespective of the amount of data available. As mineral resources are extracted from increasingly deeper deposits there will be a move from physical samples, from which variables are directly measured, to sensed proxy variables and a move from traditional mining methods to in-situ recovery. For indirectly sensed variables, the aleatory uncertainty of the required variable (e.g., porosity) is largely due to the quality of the relationship with the directly sensed proxy variable (e.g., acoustic impedance), which could be classified as measurement, or interpretation, error.

Thus, although both sources of in-situ uncertainty in these applications are functions of the amount of data, it is useful to distinguish between them in quantifying uncertainty. Hereafter, epistemic uncertainty refers to uncertainty in conceptual or descriptive geological models, as well as in quantitative parametric models that describe spatial variability and whose parameter values are calculated or inferred from data.

Although epistemic uncertainty is recognised, it is largely ignored in practice. Once a model is assumed or inferred and/or its parameters are inferred or estimated from the available data, all measures of uncertainty are based on the data; in most applications, the model of spatial variability is implicitly assumed to be known with certainty. In other fields, there has been a longstanding recognition of the importance of identifying and quantifying both sources of uncertainty and of propagating them into a complete systems model (e.g., Bedford and Cooke 2001; Helton et al. 2004; Oberkampf et al. 2002, 2004). In natural resource applications, particularly mining, the emphasis has largely been on aleatory uncertainty with implicit acceptance that epistemic uncertainty is negligible. Geostatistical simulation is widely used to quantify the effects of limited data on resource modelling and estimation (aleatory uncertainty) but the model (e.g., variogram, spatial pattern) is generally assumed to be perfectly known (no, or negligible, epistemic uncertainty).

3 Transfer Uncertainty

A further complication in mineral and energy resources is that there are additional significant sources of uncertainty in extraction and processing to produce a final product. To borrow a petroleum industry term, these might be called transfer, or process, functions and the associated uncertainties, transfer or process uncertainty. A general approach to integrating this source of uncertainty is to quantify all sources of in-situ uncertainty and propagate them into simulated transfer processes (e.g., blasting, selective loading, transport, mineral processing).

In resource extraction applications, it is useful to distinguish two broad types of process (or transfer) uncertainty:

  1. The uncertainty associated with in-situ variables that is propagated into processes applied to them. This might be termed passive in the sense that the process does not change spatial variability. An example is the impact of grade uncertainty on mine design, which could be assessed by applying the same design process (e.g., optimal open-pit design) to a range of simulated realisations of grades.

  2. The uncertainty transferred, or propagated, to in-situ variables by applying processes to them. This might be termed active as the process changes spatial variability. Changes in spatial variability can be predicted by modelling the process. An example is blasting a block of ground from which ore is selected. A sketch of both types of propagation follows this list.
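To make the distinction concrete, the following minimal sketch (in Python, with invented stand-ins for the grade realisations, the design process and the blast process; none of the names or numbers come from the chapter) propagates the same set of simulated realisations through a fixed design calculation (passive) and through a process that rearranges blocks before selection (active).

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for a set of simulated grade realisations of a small orebody.
realisations = [rng.lognormal(mean=0.0, sigma=0.5, size=(20, 20))
                for _ in range(100)]

def pit_value(grades, cutoff=1.0, price=50.0, cost=30.0):
    """Toy design process: value of blocks selected above cut-off grade."""
    ore = grades[grades >= cutoff]
    return float((ore * price - cost).sum())

# Passive: in-situ uncertainty propagated through an unchanged design process.
values = np.array([pit_value(g) for g in realisations])
print(f"pit value: mean {values.mean():.0f}, "
      f"5th-95th pct {np.percentile(values, [5, 95]).round(0)}")

def blast(grades):
    """Crude stand-in for blast movement: each block position is re-sourced
    from a random nearby block, changing the spatial arrangement of grades."""
    n, m = grades.shape
    ii = np.clip(np.arange(n)[:, None] + rng.integers(-2, 3, (n, m)), 0, n - 1)
    jj = np.clip(np.arange(m)[None, :] + rng.integers(-2, 3, (n, m)), 0, m - 1)
    return grades[ii, jj]

# Active: the process itself alters spatial variability before selection.
post_blast = np.array([pit_value(blast(g)) for g in realisations])
print(f"post-blast value: mean {post_blast.mean():.0f}")
```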

4 Consequences of In-Situ Uncertainty

There are broadly two aspects of a geological model used in mineral resource applications: the generic type (e.g., stratiform silver/lead/zinc orebody) and the unique aspects that distinguish a specific orebody within the type (e.g., faulting, folding, degree of spatial continuity and of regularity of orebody boundaries). In general, for mineral deposits the first of these is known with near certainty at a relatively early stage, but the distinguishing aspects and the relevant scales on which these aspects occur may not be known until much later. In these applications, the two types of in-situ uncertainty are not independent. The sampling scale (e.g., drilling grid) is determined by, or at least significantly informed by, the geological model; the sampling scale determines the data, the spatial variability of which constitutes the aleatory uncertainty; and the parameters of the model are estimated from the data.

The Stekenjokk mine in Sweden provides a striking example of the consequences of epistemic uncertainty. Boliden Mineral AB mined this massive copper-zinc-silver orebody from 1976 to 1988 and processed a total of 8 M tonnes of ore. Prior to mine development the drilling grid was 20 m × 20 m and, in places, 20 m × 10 m. Figure 18.1 is an idealised, but typical, vertical cross-section through the orebody showing the drill-hole intersections with the ore; drilling data were combined with the assumed geological model to generate the estimated orebody boundaries. Mining subsequently encountered complex, multi-directional folding of the ore zones that the interpolated boundaries could not capture. The practical consequences of these predictions were significant (Hoppe 1978):

Fig. 18.1 Interpolation of ore continuity from surface drilling data prior to mine development; adapted from Hoppe (1978)

  • Inappropriate choice of mining methods and mining equipment.

  • Increased ore dilution, mining costs, development and processing provisions.

  • Operational complications arising from highly mechanised equipment purchased for what was expected to be a simpler mine.

In principle, the problem could have been resolved by more appropriate sampling but the “appropriateness” of sampling was determined by the assumed geological model. In addition, sampling is constrained by cost (relative to the value of the mined product) and the cost of a drilling grid capable of capturing the folding may well have been prohibitive.

Geological models are only as good as the quality and interpretation of the data and the appropriateness of the scale on which the data are collected. Stekenjokk is an extreme (but not unique) example of epistemic uncertainty that could only be reduced to an acceptable level by more data. However, this observation is somewhat circular: the geological model depends on the amount of data/information available but the data type and collection are informed by the assumed model.

4.1 Scale and Variability Example: Hilton Orebodies Australia

This example is from a study of a complex group of three silver/lead/zinc orebodies at what, at the time, was known as the Hilton mine in north-western Queensland, Australia. The full study is given in Dowd and Scott (1984) with a later study in Dowd et al. (1989).

The Hilton orebodies are 22 km north of Mt Isa, one of the world’s largest stratiform base metal deposits. The Hilton orebodies have a similar diagenesis to the Mt Isa orebodies, with mineralisation occurring in the same dolomitic shale. The study was undertaken at the pre-feasibility stage and all original drilling, sampling and interpretation were influenced by 50 years’ mining experience at Mt Isa. Although the Mt Isa and Hilton styles of mineralisation are similar, the Hilton orebodies are structurally more complex and less continuous.

Two test areas were extensively drilled to provide detailed information for a geostatistical study to determine optimal drilling densities for mine planning purposes. The holes were drilled from access drives as fans on cross-sections spaced 10 and 20 m apart. One such cross-section is shown in Fig. 18.2, in which the holes intersect the main 2 orebody footwall lens (2 O/B FW) at approximately 5 m centres. The dark blue outlines in Fig. 18.2 are the orebody boundaries estimated from the drill-hole data on the cross-section and on the cross-sections on either side. At the feasibility stage, cost would prohibit such a drilling density over the entire orebody. Given the density of the drilling, these estimated boundaries can be regarded as reality on all practical scales.

Fig. 18.2 Cross-sectional interpretation based on 5 m drill spacing

The effects of other drilling densities were assessed by removing drill data to create new datasets; e.g., removing every second drill-hole on a cross-section yields a 10 m spacing. Datasets for 5, 10, 20 and 40 m drill spacings were used in the study. Orebody boundaries were estimated for each drilling density and the results were given to mining engineers to design stopes. As an example, the estimated orebody boundaries for 20 m drill spacing are shown in Fig. 18.3. As expected, these boundaries are much smoother (less variable, more continuous) than the “reality” represented by the boundaries estimated from the 5 m spacing dataset. The variability of the boundaries is critical in the choice of mining method: the variability of boundaries and their exact delineation are less critical if a bulk mining method is adopted than if more selective methods are used. The original mining method was cut and fill, followed later by sub-level open stoping and bench mining.

Fig. 18.3 Cross-sectional interpretation based on 20 m drill spacing
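The decimation of the drilling data described above is simple to express in code. The sketch below is a minimal illustration, assuming a hypothetical table of drill-holes indexed by cross-section and by position along the section at ~5 m centres; it is not the study's actual data handling.

```python
# Hypothetical drill-hole list: (cross-section id, position index along the
# section at ~5 m centres). Not the actual Hilton data handling.
holes = [(section, i) for section in range(10) for i in range(16)]

def decimate(holes, step):
    """Keep every `step`-th hole on each cross-section: step=2 turns 5 m
    spacing into ~10 m, step=4 into ~20 m, step=8 into ~40 m."""
    return [(s, i) for (s, i) in holes if i % step == 0]

for step, spacing in [(1, 5), (2, 10), (4, 20), (8, 40)]:
    subset = decimate(holes, step)
    print(f"{spacing:>2} m spacing: {len(subset)} of {len(holes)} holes")
```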

Figure 18.4 shows the 5 m interpolation overlaid on the 20 m interpolation. Taking the 5 m interpolated boundaries as reality, all visible light blue areas represent ore dilution arising from planning and extraction based on the 20 m interpolated boundaries.

Fig. 18.4 Overlay of 5 m interpolation on 20 m interpolation

Figure 18.5 shows the 20 m interpolation overlaid on the 5 m interpolation. Again, taking the 5 m boundaries as reality, all visible dark blue areas represent the ore loss arising from planning and extraction based on the 20 m interpolated boundaries. Of course, the perfect selection and the adherence to estimated boundaries during production implied by this exercise are not entirely realistic. However, the impact on the choice of mining method, on the predicted grades and tonnages, and on economic outcomes is real.

Fig. 18.5 Overlay of 20 m interpolation on 5 m interpolation. Based on the 20 m model, all visible dark blue areas represent ore loss

The outputs from the stope design exercise are summarised in Fig. 18.6 for 5, 10 and 20 m drill spacings. Orebodies 1 and 2 H/W (hanging wall) are mined in a single stope and orebodies 2 F/W and 3 are mined in separate stopes. Grades were estimated by kriging and are expressed as lead metal equivalents (a weighted sum of the lead, zinc and silver grades); intervals are \( \pm 2\sigma_{K} \), where \( \sigma_{K} \) is the square root of the kriging variance, used here as an index of uncertainty rather than to define a confidence interval. Taking the 5 m designs as the actual boundaries, the stope designs based on 10 and 20 m drilling show the effects of decreasing amounts of data on planned tonnage and average grade.

Fig. 18.6 Stope designs with contained tonnages and grades for 5, 10 and 20 m drill spacing for orebodies 1 and 2 HW (left), 2 FW (centre) and 3 (right)
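The metal-equivalent grade used in Fig. 18.6 is a weighted sum of the lead, zinc and silver grades. The chapter does not give the weighting factors, so those in the sketch below are purely hypothetical; the sketch illustrates only the form of the calculation.

```python
# Hypothetical weights -- the chapter states only that grades were combined
# as lead metal equivalents, not the factors used.
W_ZN, W_AG = 0.8, 30.0   # assumed Zn and Ag weights relative to Pb

def lead_equivalent(pb_pct, zn_pct, ag_g_per_t):
    """Weighted sum of lead, zinc and silver grades as a Pb-equivalent %."""
    return pb_pct + W_ZN * zn_pct + W_AG * (ag_g_per_t / 10_000)  # g/t -> %

print(lead_equivalent(pb_pct=5.0, zn_pct=8.0, ag_g_per_t=150.0))  # 11.85
```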

The stope designs are based on the data and interpretations from the respective drilling densities, but the grades and tonnages are estimated using all data (5 m drill spacing). Assuming the data from the 5 m drill spacing give the closest possible quantification of reality on all practical scales, the grades and tonnages of the 10 and 20 m stope designs estimated from all data can be regarded as sufficiently close to the real tonnages and grades that could be recovered from those designs.

The effects of data density on grades and tonnages are summarised in Table 18.1. As an example, using the 20 m drill spacing data to design stope 2 (the high-grade orebody 2 footwall) would increase tonnage by 21.4% and reduce grade by 9.6%. There would be an increase in metal tonnage of 9.7%, but this would be at the cost of mining, hauling and processing the additional ore tonnage.

Table 18.1 Differences in tonnes and grades of stopes compared with 5 m designs
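The contained-metal change quoted for stope 2 follows directly from the tonnage and grade changes, since contained metal is tonnage × grade:

\[ \frac{M_{20}}{M_{5}} = \frac{T_{20}\,\bar{g}_{20}}{T_{5}\,\bar{g}_{5}} = 1.214 \times 0.904 \approx 1.097, \]

i.e. an increase of about 9.7% in metal tonnage.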

Whilst the effects of data on a specific type of mining are of interest, the more important issue is the effect of the assumed geological model on the choice of mining method. The initial geological model was influenced by the knowledge accumulated over a long period of mining in the neighbouring Mt Isa orebodies. The detailed analysis described here enabled the effects of the greater complexity and less continuity of the Hilton orebodies to be systematically quantified, thereby significantly reducing the impact of epistemic uncertainty and contributing to the selection of the most appropriate mining method and mine design.

5 Quantifying Epistemic Uncertainty

In the Hilton example, geological model uncertainty was addressed at the significant cost of more samples—effectively eliminating the epistemic uncertainty on the operational scale through more data and analysis. With the hindsight of the additional data and analysis, and on the assumption that the test volume is sufficiently representative of the remainder of the orebodies, the epistemic uncertainty associated with various drilling grids could be quantified. This would allow assessment of the value of additional information against the cost of collecting it and/or the operational cost of not collecting it. Stekenjokk is an example of the practical consequences of proceeding with an unacceptable level of epistemic uncertainty.

There is an extensive literature on using Bayesian probability to quantify epistemic uncertainty particularly to combine sources of uncertainty (e.g., Winkler 1981; Sankararaman and Mahadevan 2011) and to incorporate expert knowledge and informed guesses in the form of subjective probabilities. It can be argued that subjective probabilities are used implicitly throughout geostatistical analysis, modelling, estimation and simulation irrespective of the amount of data. Expert knowledge/judgment guides variogram calculation and interpretation, choice of training images, domaining, sample differentiation, choice of estimation or simulation method and validity of outputs. There is, however, a distinction between the explicit subjective probability of informed guesses and possible geological models and the implicit subjectivity in inferring model parameters from quantitative data.

In the remainder of this chapter, a distinction is made between model uncertainty and uncertainty of the parameters of a specific model. Many authors do this, although in some cases the former may be a case of the latter; e.g., it might be argued (with some difficulty) that Stekenjokk was a matter of incorrect structural parameters (degree of folding). A more convincing argument could be made for the Hilton case: the initial assumed model was a Mt Isa type stratiform orebody and the final agreed model was a more complex and less continuous version of it.

In addition to Bayesian approaches, others include evidence theory (Shafer 1976; Dempster 1968), fuzzy sets (Zadeh 1965) and possibility theory (Zadeh 1978; Dubois and Prade 2001). These and other approaches are extensively used to quantify uncertainty in risk analysis; a good coverage of probabilistic risk analysis is given in Bedford and Cooke (2001).

Over the past 30 years, all these approaches have been used to incorporate model uncertainty in geostatistical estimation and simulation and the following list is intended as representative rather than exhaustive. Omre (1987) used Bayesian kriging to include qualified guesses when few data are available; the weight assigned to the guess increases as the amount of data decreases.

Fuzzy kriging has been proposed as a means of including aleatory uncertainty (in the sense of inaccurate or imprecise measurements) and epistemic uncertainty (imprecise variogram parameters) in estimation. Uncertain data will, of course, lead to an uncertain variogram, but certain (accurate, error-free) data will not necessarily lead to a certain variogram. Diamond (1989) proposed fuzzy kriging to deal with uncertain or imprecise data. Bardossy et al. (1988, 1990a, b) proposed fuzzy kriging for dealing with both sources of uncertainty, but the computational cost hindered its use. More recently, Loquin and Dubois (2010a, b) have developed these approaches in computationally feasible forms. Bandemer and Gebhardt (2000) combine fuzzy kriging with Bayesian incorporation of prior knowledge. Bardossy and Fodor (2004) provide comprehensive coverage of the use of fuzzy set theory to quantify geological uncertainty and consequent risk.

Srivastava (2005) used probabilistic modelling of ore lenses to account for uncertainty in the boundaries of geological domains that constrain grade occurrence. Dowd (1986, 1994) and Dowd et al. (1989) used deterministic and probabilistic methods for the same purpose in estimating and simulating grades.

Verly et al. (2008) quantified geological model uncertainty in a porphyry copper deposit by simulating the four principal characteristics of porphyry models: faults defining fault blocks; faulted rock types within fault blocks; un-faulted intrusive and breccia bodies; and alteration and copper grade shells.

Maximum likelihood estimation of spatial model parameters has been widely reported in geostatistical applications: Mardia and Marshall (1984), Kitanidis and Lane (1985), Zimmerman (1989), Dietrich and Osborne (1991) among others. Pardo-Igúzquiza and Dowd (1997a, b, c, 2003, 2013), Dowd and Pardo-Igúzquiza (2002) and Pardo-Igúzquiza et al. (2013) used maximum likelihood estimates of variogram parameters and associated uncertainties to incorporate the effects of model uncertainty in simulation and estimation.

For categorical variables, such as geological shapes and surfaces, multiple point statistics simulation provides a means of specifying possible geological scenarios in the form of alternative training images. Caers (2011) uses different training images to introduce geological model uncertainty into the simulation of oil reservoirs. Park et al. (2013) use history matching to quantify the uncertainty of facies models in the form of alternative training images. Hermans et al. (2014) choose among several geological scenarios in the form of possible training images using geophysical data and Bayes rule to compute the conditional probabilities of the alternative training images given the geophysical data.

With a few notable exceptions, in most mining applications the geological (model) uncertainty from the feasibility stage onwards can be limited to uncertainty in model parameters rather than uncertainty about the general model (e.g., stratiform, vein, disseminated). However, where fundamental (and, a priori, unverifiable) assumptions must be made about the general model, as in oil and gas applications or applications in which physical processes give rise to the variables (e.g., HDR fracture occurrence and propagation), it is essential to test the sensitivity of these assumptions by reconciling the consistency of outputs (e.g., heat production from a geothermal reservoir) with predicted responses to inputs (e.g., fluid flow through fracture networks). The fundamental difference between these cases and mining applications is that the variables in the latter can ultimately be directly observed.

On the assumption that the most important characteristics of the underlying model can be captured in several parameters of a broad model, the uncertainty in the parameter estimates can be quantified by: generating a set of parameter values using an appropriate set of rules; simulating the spatial random variable(s) using these parameter values; and repeating this process a sufficiently large number of times. Methods for sampling parameter values include maximum likelihood, bootstrap methods (Olea et al. 2015), Bayesian analysis (Kitanidis 1986) and, in multiple-point statistics simulation, Bayesian selection of alternative templates or training images (Park et al. 2013; Hermans et al. 2014) and clustering combined with system responses (Caers 2011).
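A minimal sketch of this generate-simulate-repeat loop follows, assuming (purely for illustration) Gaussian uncertainty on the sill and range of an exponential covariance in one dimension; the parameter values, standard errors and simulator are invented, not taken from the examples that follow.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 60)              # 1-D locations, arbitrary units
d = np.abs(x[:, None] - x[None, :])         # distance matrix

# Invented ML estimates and standard errors for sill and range.
sill_hat, sill_se = 1.3, 0.3
range_hat, range_se = 2.0, 0.5

def simulate(sill, rang):
    """Unconditional Gaussian simulation via Cholesky of an exponential
    covariance (a stand-in for whatever simulator the application uses)."""
    cov = sill * np.exp(-d / rang) + 1e-9 * np.eye(len(x))
    return np.linalg.cholesky(cov) @ rng.standard_normal(len(x))

fields = []
for _ in range(200):
    # Step 1: draw a parameter set consistent with its estimated uncertainty.
    s = max(rng.normal(sill_hat, sill_se), 0.05)
    a = max(rng.normal(range_hat, range_se), 0.05)
    # Step 2: simulate the random field under that parameter set.
    fields.append(simulate(s, a))

# Step 3: the spread across fields reflects data AND parameter uncertainty.
print(np.std(fields, axis=0).mean())
```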

The following two examples illustrate the use of maximum likelihood in model selection and parameter inference and the propagation of the associated uncertainties into geostatistical simulation for environmental and mining applications.

5.1 Example: Transmissivity Uncertainty

This example is taken from Dowd and Pardo-Igúzquiza (2002). The data are from Gotway (1994) and comprise 41 transmissivity measurements in the Culebra Dolomite formation in New Mexico. The original application was for nuclear waste site assessment, where uncertainty in the groundwater travel time of a particle is assessed through its probability density function, which is estimated by running groundwater flow and transport programs with different transmissivity field inputs. These inputs are generated by conditional simulations of transmissivity.

The data are the logarithms of transmissivity in \( \text{m}^2\,\text{s}^{-1} \) and the data locations are shown in Fig. 18.7 together with a histogram of the log-transmissivity data.

Fig. 18.7 Data locations (distances in km) and histogram of log-transmissivity data

Maximum likelihood was used to estimate the parameters of an exponential covariance model of the residuals for drift orders 0, 1 and 2. Although the drift is a deterministic component of the universal model, in practice the coefficients are estimated from the available data and are thus random variables, with the means and standard errors given in Table 18.2 for the optimal (determined by the Akaike information criterion) drift model of order 1: \( \text{drift}(x, y) = \beta_0 + \beta_1 x + \beta_2 y \). The estimated covariance parameters for k = 1 are given in Table 18.3 and the variogram is shown in Fig. 18.8.

Table 18.2 Maximum likelihood estimates of drift coefficients
Table 18.3 ML estimates of range and sill: exponential covariance
Fig. 18.8 Semi-variogram of the residuals for k = 1 and fitted maximum likelihood model: sill 1.28, range 1.99 km (effective range ~6 km)
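A compact sketch of this kind of maximum likelihood fitting is given below. It profiles the drift coefficients out of a Gaussian log-likelihood with an exponential covariance and reports the AIC for the fitted model; the locations and data are random placeholders rather than the Culebra Dolomite measurements, and the code illustrates the technique, not the authors' implementation. The log-parametrisation is a standard device to keep the sill and range positive during optimisation.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 41
xy = rng.uniform(0.0, 20.0, size=(n, 2))   # placeholder locations, km
z = rng.normal(size=n)                     # placeholder log-transmissivities
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
F = np.column_stack([np.ones(n), xy])      # order-1 drift basis: 1, x, y

def neg_log_lik(theta):
    """Gaussian negative log-likelihood, exponential covariance, drift
    coefficients profiled out by generalised least squares."""
    sill, rang = np.exp(theta)             # log-parametrised to stay positive
    C = sill * np.exp(-d / rang) + 1e-8 * np.eye(n)
    L = cho_factor(C)
    beta = np.linalg.solve(F.T @ cho_solve(L, F), F.T @ cho_solve(L, z))
    r = z - F @ beta
    logdet = 2.0 * np.log(np.diag(L[0])).sum()
    return 0.5 * (logdet + r @ cho_solve(L, r) + n * np.log(2.0 * np.pi))

res = minimize(neg_log_lik, x0=np.log([1.0, 2.0]), method="Nelder-Mead")
sill, rang = np.exp(res.x)
aic = 2 * (2 + 3) + 2 * res.fun            # k = 2 covariance + 3 drift terms
print(f"sill {sill:.2f}, range {rang:.2f} km, AIC {aic:.1f}")
```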

In this case, as there is no nugget variance, the range and sill are estimated independently. The correlation between range and sill is thus zero, and any combination of values of the two parameters inside their respective intervals is inside the 95% confidence region, as shown in Fig. 18.9a. The drift coefficients are also independent of the sill and the range. As the estimated drift coefficients are correlated with one another, not every combination of the three coefficient values is equally reliable: values inside the 95% confidence interval of each individual coefficient may not be inside the joint 95% confidence region of the coefficients taken together. The joint confidence region is an ellipsoid. Figure 18.9b shows the 95% confidence region for (β1, β2) when the third coefficient of the model is set to the estimated value given in Table 18.2.

Fig. 18.9 a (left) 95% confidence region for sill and range; b (right) confidence region for drift parameters β1 and β2 with β0 = −1.6062
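Membership of a joint confidence region for correlated parameter estimates can be tested with the squared Mahalanobis distance against a chi-squared quantile, as in the sketch below; the estimates and covariance matrix shown are illustrative values, not those of Table 18.2.

```python
import numpy as np
from scipy.stats import chi2

# Illustrative estimates and covariance for (beta1, beta2) -- not the values
# in Table 18.2.
beta = np.array([0.12, -0.30])
cov = np.array([[0.004, -0.002],
                [-0.002, 0.006]])

def in_joint_region(b, level=0.95):
    """True if b lies inside the joint confidence ellipse for (beta1, beta2),
    tested via the squared Mahalanobis distance."""
    diff = b - beta
    return diff @ np.linalg.solve(cov, diff) <= chi2.ppf(level, df=2)

print(in_joint_region(np.array([0.12, -0.30])))   # centre: True
print(in_joint_region(np.array([0.30, -0.30])))   # outside the ellipse: False
```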

The effects of model uncertainty on simulation outputs are illustrated by generating six simulations for each of the sill and range pairs A, B, C, D and E in Fig. 18.9a; each set of simulations was started with the same random number seed. The simulations are shown in Fig. 18.10. The differences between corresponding simulations (e.g., the first simulation in each of A, B, C, D and E) for the five sets of parameters reflect the model uncertainty, which could be quantified further by simulating flow and transport through the simulated transmissivity realisations.

Fig. 18.10 Outputs from six simulations using the variance and range parameters denoted by the mean values A and the extreme values B, C, D and E in Fig. 18.9
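Starting each set of simulations from the same random number seed is a common-random-numbers device: differences between corresponding realisations are then attributable to the parameter values alone rather than to sampling noise. A minimal sketch follows, with illustrative sill-range pairs standing in for A to E.

```python
import numpy as np

x = np.linspace(0.0, 10.0, 50)
d = np.abs(x[:, None] - x[None, :])

def simulate(sill, rang, seed):
    """One unconditional realisation for a given sill and range."""
    rng = np.random.default_rng(seed)
    cov = sill * np.exp(-d / rang) + 1e-9 * np.eye(len(x))
    return np.linalg.cholesky(cov) @ rng.standard_normal(len(x))

# Illustrative sill-range pairs standing in for the mean (A) and extreme
# (B-E) values of Fig. 18.9a; the same seed is reused for every pair.
pairs = {"A": (1.28, 1.99), "B": (0.9, 1.0), "C": (1.7, 1.0),
         "D": (0.9, 3.0), "E": (1.7, 3.0)}
fields = {k: simulate(s, a, seed=7) for k, (s, a) in pairs.items()}
print({k: round(float(v.std()), 2) for k, v in fields.items()})
```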

5.2 Example: Coal Resource Risk Assessment

One of the most significant contributors to the total risk in the evaluation of coal-mining projects is the uncertainty of the resource tonnage and quality characteristics, often called the resource risk. This example is from the As Pontes deposit in Galicia, Spain (Pardo-Igúzquiza et al. 2013). The most significant variable in the assessment of resource uncertainty is the thickness of the coal seam. Figure 18.11 shows the data locations at which seam thickness is measured together with the estimated variogram values and the manually fitted (isotropic) variogram model.

Fig. 18.11 (Left) drill-hole locations and boundary of the study area; (right) variogram and manually fitted model for seam thickness

Spherical model variograms for seam thickness:

  • Manual fitting: \( a = 4128\,\text{m} \), \( C_0 = 3\,\text{m}^2 \) and \( C = 25\,\text{m}^2 \).

  • Maximum likelihood: \( a = 4460\,\text{m} \), \( C_0 = 4\,\text{m}^2 \) and \( C = 23\,\text{m}^2 \).

Although the maximum likelihood estimates of the parameters are very similar to those estimated by visual fitting, maximum likelihood has the advantage of providing estimates of the uncertainty of the parameters. For illustrative purposes, resources were computed as the tonnage from panels with thickness above a threshold defined by the 25th percentile of the sample data, equal to a thickness of 8.65 m. The kriged resource volume is \( 1.97 \times 10^{8}\,\text{m}^3 \).

Sequential Gaussian simulation was used to generate realisations of the thickness of the seam. To quantify the uncertainty in the estimated resource, a total of 870 simulations were generated using the ‘certain’ variogram (maximum likelihood parameters) and the total resource was calculated for each simulation. The histogram of the 870 simulated resources quantifies the uncertainty of the estimated resources. An example simulation is shown in Fig. 18.12.

Fig. 18.12 Conditionally simulated realisation of coal seam thickness
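A sketch of the resource tally is given below: for each realisation, panel thicknesses above the 8.65 m threshold are summed and converted to volume. The realisations, grid size and panel area are placeholders, so the numbers produced are indicative only.

```python
import numpy as np

rng = np.random.default_rng(3)

# Placeholder for 870 SGS realisations of seam thickness (m) on a panel grid.
sims = rng.normal(loc=10.0, scale=2.0, size=(870, 50, 50))

PANEL_AREA = 100.0 * 100.0     # assumed panel footprint, m^2
THRESHOLD = 8.65               # 25th percentile of the sample data, m

# Resource volume per realisation: thickness summed over panels above cut-off.
totals = np.where(sims > THRESHOLD, sims, 0.0).sum(axis=(1, 2)) * PANEL_AREA
lo, hi = np.percentile(totals, [2.5, 97.5])
print(f"mean {totals.mean():.3e} m^3, 95% interval [{lo:.3e}, {hi:.3e}]")
print(f"P(total > 2.0e8 m^3) = {(totals > 2.0e8).mean():.2f}")
```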

The parameter space \( \{r_0, a, \sigma^2\} \), comprising respectively the nugget/variance ratio, the range and the variance, is used to quantify the uncertainty in the model. The parameter values were discretised in steps of 0.05 for \( r_0 \) in the interval [0, 1], 700 m for \( a \) in the interval [1000 m, 15,000 m] and 0.1 for \( \sigma^2 \) in the interval [0.6, 2.6]. There are 268 triplets \( \left\{ {r_{0} , a, \sigma^{2} } \right\} \) that lie inside the 75% confidence region. As these models are not equally probable, their probabilities are normalised so that they sum to 1.0 and each model is included as many times as indicated by its normalised probability (i.e., probability sampling in which, for example, a model with a normalised probability of 0.35 accounts for 35% of the simulated triplets). A total of 870 simulations were used.
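The probability-weighted inclusion of models can be sketched as follows. The likelihood-based probabilities of the triplets are mocked with random scores here, since reproducing the actual confidence-region calculation is beyond a short example; everything except the grid spacings and intervals quoted above is invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Discretised parameter space: nugget ratio, range (m) and variance.
r0 = np.arange(0.0, 1.0 + 1e-9, 0.05)
a = np.arange(1_000, 15_000 + 1, 700)
s2 = np.arange(0.6, 2.6 + 1e-9, 0.1)
grid = np.array([(x, y, v) for x in r0 for y in a for v in s2])

# Mock likelihood-based score per triplet; in the study the probabilities
# come from the ML confidence region, not from random numbers.
scores = rng.random(len(grid))
inside = scores > np.quantile(scores, 0.75)     # mock 75% confidence region
probs = scores[inside] / scores[inside].sum()   # normalise to sum to 1.0

# Allocate the 870 simulations in proportion to each model's probability.
counts = np.round(probs * 870).astype(int)
print(f"{int(inside.sum())} models inside the region, "
      f"{int(counts.sum())} simulations allocated")
```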

Histograms of the total resources for the 870 simulations, with and without the uncertainty of the variogram model parameters, are given in Fig. 18.13. There is no significant difference between the mean resource values for the certain and uncertain cases.

Fig. 18.13 Histograms of total resources calculated by geostatistical simulation assuming the variogram model parameters are known with certainty (solid line) and including the uncertainty of the semi-variogram model parameters (dashed line)

The 95% confidence interval for the total resource is \( [1.88 \times 10^{8}, 2.19 \times 10^{8}]\,\text{m}^3 \) assuming the variogram is known with certainty and \( [1.90 \times 10^{8}, 2.23 \times 10^{8}]\,\text{m}^3 \) when the uncertainty of the variogram model is included; the latter interval is slightly higher. However, the probability that the total resource exceeds \( 2.0 \times 10^{8}\,\text{m}^3 \) is 0.59 when the uncertainty of the variogram parameters is ignored and 0.75 when that uncertainty is propagated into the simulated realisations. In other words, whilst there is no significant difference in the mean resource for the two sets of simulations, the difference in the two distributions (because of their different variances) is sufficient to generate significantly different resource estimates above selected cut-offs.

In this case, the differences in the total volume of resources, with and without quantification of semi-variogram uncertainty, are small but the consequence of selecting from the distribution of possible resources is significant. This illustrates a general principle: the estimated total resource and the mean simulated resource, with and without semi-variogram uncertainty, may not differ significantly but the distributions of the two simulations will differ because of the different variances. Similarly, selecting panel values above a threshold from the set of estimated panel thicknesses or from a set of simulated panel thicknesses will yield different results.
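The mechanism can be illustrated with normal approximations built from the reported 95% intervals; the simulated distributions are not normal, so the probabilities below demonstrate the principle rather than reproduce the reported 0.59 and 0.75.

```python
from scipy.stats import norm

# Normal approximations built from the reported 95% intervals (midpoint and
# width). The simulated distributions are not normal, so these probabilities
# illustrate the mechanism rather than reproduce the reported 0.59 and 0.75.
cases = {"certain variogram": (1.88e8, 2.19e8),
         "uncertain variogram": (1.90e8, 2.23e8)}
for label, (lo, hi) in cases.items():
    mean, sd = (lo + hi) / 2.0, (hi - lo) / (2.0 * 1.96)
    p = norm.sf(2.0e8, loc=mean, scale=sd)
    print(f"{label}: mean {mean:.3e} m^3, P(> 2.0e8 m^3) ~ {p:.2f}")
```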

In general, the outcome from simulations with and without semi-variogram uncertainty depends on the deposit and the amount of data available. Evaluation of model uncertainty is critical in resource risk assessment even if it is ultimately found that there is no practical difference between resource estimates obtained by ignoring or including semi-variogram uncertainty. This example also has important implications for compliance with resource and reserve reporting codes, most of which use terms such as, or equivalent to, the amount of error [associated with an estimate], the level of accuracy [of an estimate], the level of confidence [in a reserve statement] and levels of geological confidence (words in italics are quoted from JORC 2012). Whilst the reporting codes currently use these terms qualitatively, the terms have specific quantitative meanings in statistics, probability and risk assessment and are increasingly being referred to explicitly in the codes.

6 Quantifying the Effects of Transfer Uncertainty

An example of passive transfer uncertainty is the variation in open-pit size and shape as a function of grade uncertainty as shown in Fig. 18.14 taken from a study of a small gold orebody (Dowd 1995, 1997). The impacts of these types of uncertainty can be quantified by standard applications of geostatistical simulation. Dimitrakopoulos and co-workers have made significant contributions to the integration of in-situ grade and geological uncertainty into optimization algorithms (e.g., Dimitrakopoulos et al. 2002; Goodfellow and Dimitrakopoulos 2013).

Fig. 18.14 Optimal open pits generated from 100 simulations of a small gold orebody. Top: maximum volume; centre: median volume; bottom: minimum volume

More challenging is the impact of propagating in-situ uncertainty through the mining (extraction) process. The critical component of most metalliferous open-pit mining operations is ore selection, i.e. the minimisation of ore loss and ore dilution during extraction. In general, extraction comprises drilling, blasting and loading, all of which are planned and designed on uncertain models of local geology and grade. The conversion of the in-situ block model resource to a realistically recoverable reserve may, in many instances, be the most significant source of uncertainty in reserve estimation. The usual assessment of recoverable reserves, for example, is limited to a simple volumetric exercise in which ore recovery is assessed as a function of applying a range of selection volumes to a simulated orebody, or to an even simpler volume-based adjustment of the variance of estimated block values. These simplistic approaches ignore the practicalities of the mining, selection and loading processes: blast design, behaviour and performance; equipment type, size and operation; ore displacement during blasting and loading; and the ability to identify ore zones within a blast muck pile. In many applications, the uncertainties introduced by these technical processes are at least as significant as those that derive from the in-situ spatial characteristics of grades and geology.

An approach to quantifying transfer process uncertainty for blasting and loading comprises:

  • generation of an in-situ model of the orebody comprising the grade, geology, geomechanical properties and grade control variables within small volumes determined by the smallest selectable volume within a blast muck-pile;

  • definition of a blast volume comprising a large number of in-situ model volumes, and subjecting it to a blast simulator, which effectively moves each component model volume to its final resting place in the blast muck-pile; and

  • application of simulated selective loading processes to the simulated blast muck-pile to determine the selectivity that can be achieved by various sizes of loader and types of loading and to quantify ore dilution and ore loss.

The in-situ model, representing perfect knowledge at all relevant scales, is obtained by geostatistical simulation. An in-situ model that represents the reality of knowing only the data and information that are available from specific grade control drilling and sampling grids can be obtained by sampling the geostatistically simulated model on a specified grid. The volumes comprising the in-situ model are then populated by estimates based only on the data corresponding to the specified grade-control drilling and sampling grids. Different drilling and sampling grids can be used to generate different models, each reflecting the levels of data and information available. Selectivity can then be assessed as a function of the drilling and sampling grids as well as the size and type of loader. Performance is assessed against the ideal selectivity that can be achieved on the perfect knowledge model, comprising the simulated values of each component volume. Applying costs, prices and financial criteria enables an optimal selection of the grade control drilling grid, size of loader, type of loading and even blast design.
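A minimal sketch of this assessment loop follows, using a nearest-sample assignment as a crude stand-in for the grade-control estimation step (the approach described above would use estimates such as kriging from the sampled grid); the grid sizes, grades and cut-off are all invented.

```python
import numpy as np

rng = np.random.default_rng(11)

# 'Perfect knowledge' model: one simulated grade grid at the selection scale.
truth = rng.lognormal(mean=0.0, sigma=0.6, size=(40, 40))
CUTOFF = 1.2   # invented cut-off grade

def nearest_estimate(truth, spacing):
    """Crude stand-in for grade-control estimation: sample the simulated
    model on a regular grid and give each block its nearest sample value."""
    est = np.empty_like(truth)
    for i in range(truth.shape[0]):
        for j in range(truth.shape[1]):
            si = min(round(i / spacing) * spacing, truth.shape[0] - 1)
            sj = min(round(j / spacing) * spacing, truth.shape[1] - 1)
            est[i, j] = truth[si, sj]
    return est

ideal = truth >= CUTOFF
print(f"ideal: fraction {ideal.mean():.2f}, grade {truth[ideal].mean():.2f}")
for spacing in (2, 4, 8):                  # denser to sparser sampling grids
    selected = nearest_estimate(truth, spacing) >= CUTOFF
    # Evaluate the selection against the 'true' simulated grades.
    print(f"grid {spacing}: fraction {selected.mean():.2f}, "
          f"achieved grade {truth[selected].mean():.2f}")
```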

The following case study (Dowd and Dare-Bryan 2004) is based on the Minas de Rio Tinto SAL open-pit copper mine at Rio Tinto, southern Spain, which is typical of a low-grade operation in the later stages of its life. Ore/waste delineation for selective mining is difficult because the head grades are near the economic cut-off grade and there are no clear geological controls on the mineralisation.

Sequential Gaussian simulation, with the blast-hole grades as conditioning data, was used to generate realisations of each mining bench on a block grid of 0.5 m × 0.5 m × 0.5 m; the grid size was determined by blast and selection criteria.

The first aspect of predicting recovery is the in-situ heterogeneity of the ore and the extent to which it forms contiguous ‘parcels’ of a size comparable to the selection size (the capacity and size of the loading equipment). The second aspect is the heterogeneity of the ore after it has been subjected to blasting (i.e., the in-situ geological spatial variability and the post-transfer spatial variability of the blast-pile).

Figure 18.15 shows horizontal and vertical cross-sections through a simulated bench of dimensions 80 m × 40 m × 12 m (height): simulated copper grades on horizontal planes at the top and bottom of the 12 m bench and on the 6 m mid-plane, and on vertical planes at the extremities (0 and 80 m) and at intermediate positions at 28 m intervals.

Fig. 18.15 a Simulated copper grades in a bench: three horizontal sections; b four vertical sections; c blast profile resulting from a simulated blast applied to simulated grades; d predicted composition of the blast profile from a simulated blast applied to in-situ grades estimated from samples taken from blast-holes on 8 m spacing

Figure 18.16 shows the assumed contiguous parcels of ore in the blast pile based on estimated in-situ grade values, together with the actual (simulated) parcels of ore. A comparison of the two sets of ore volumes in Fig. 18.16 would quantify ore loss and ore dilution; a sketch of this comparison follows the figure. Blast movement sensors, inserted in drill holes and detected in the blast-pile, are widely used to identify post-blast ore parcels. In such cases, this process would quantify the uncertainty associated with the initial placement of sensors based on estimated in-situ ore locations and a grade continuity model.

Fig. 18.16 (Left) selected ore volumes based on estimates; (right) actual ore volumes
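The comparison of the two sets of ore volumes reduces to boolean overlays at the selection scale, as in the sketch below; the masks are random placeholders rather than outputs of a blast simulator.

```python
import numpy as np

rng = np.random.default_rng(13)

# Boolean grids over the muck-pile at the selection scale: placeholders for
# 'selected' (from estimated grades carried through the blast model) and
# 'actual' (from the simulated reference grades).
actual = rng.random((60, 30)) > 0.6
selected = actual ^ (rng.random((60, 30)) > 0.85)   # selection with errors

BLOCK_VOLUME = 0.5 ** 3                             # m^3, the 0.5 m model grid
ore_loss = (actual & ~selected).sum() * BLOCK_VOLUME   # ore sent to waste
dilution = (selected & ~actual).sum() * BLOCK_VOLUME   # waste taken as ore
print(f"ore loss {ore_loss:.1f} m^3, dilution {dilution:.1f} m^3")
```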

Among other examples, Goodfellow and Dimitrakopoulos (2017) describe an approach that integrates sources of uncertainties arising from the combined production of several mines. The in-situ orebody uncertainties are integrated with process uncertainties from extraction to processing to marketing as the basis of modelling and stochastically optimising the value chain of a mining complex.

7 Conclusion

There is a growing requirement for integrated frameworks for uncertainty quantification in all geologically based applications. Quantified uncertainty and geostatistical methods are increasingly being referenced explicitly in mineral resource and reserve reporting codes. This does not require rewriting the codes, but it does mean that a generally accepted framework for the quantification of all sources of uncertainty needs to be established.

Quantified risk assessments for environmental applications are now required in many jurisdictions for applications such as waste burial and the treatment, storage and disposal of radioactive material. These assessments are required to cover time periods that range from around 200 years for household wastes to thousands of years for the underground storage or disposal of radioactive wastes.

The management of groundwater resources, especially karst systems in environmentally vulnerable coastal areas, requires the integrated modelling of flow, extraction, seawater intrusion and contamination from agriculture and other activities.

In these and all such applications, the identification and quantification of all sources of uncertainty is critical to ensuring reliable estimation, planning, design and, for resource extraction, production, and to managing the associated risks. As summarised here, many methods and approaches have been developed by many authors, but most are limited to aleatory uncertainty.

The work summarised here provides examples of methods that have been successfully applied to identify and quantify all sources of uncertainty in mineral resource and environmental applications. These methods contribute to meeting the need, and the increasing requirement, for integrated frameworks for uncertainty quantification in all geologically based applications.