
Uncertainty of Measurement and Statistical Process Control


Abstract

The dramatic globalisation of trade in recent decades has sharpened the world's industrial focus on demands for the conformance of components to exacting customer specifications. If a manufacturing, or metrological, problem leads to a customer–supplier disagreement as to a product's compliance, then this can be exceedingly costly to all of the interested parties involved.

Fere libenter homines id quod volunt credunt.

TRANSLATION: Men are nearly always willing to believe what they wish.

Gaius Julius Caesar [100–44 BC] (De Bello Gallico, Book 3, Sect. 18; 51 BC).


Notes

  1.

    Geometric Dimensioning and Tolerancing (GD&T) is a system for defining and communicating engineering tolerances. Several international standards describe the symbols and define the rules utilised in typical GD&T; one well-recognised example is ASME Y14.5-2009, produced by The American Society of Mechanical Engineers. GD&T employs an internationally recognised symbolic language on engineering drawings and computer-generated three-dimensional solid models, which explicitly describes the nominal geometry and its allowable variation.

  2.

    Measurement uncertainty: in simple terms in dimensional metrology, it can be said to be "A non-negative parameter characterising the dispersion of the values attributed to a measured quantity". This uncertainty has a probabilistic basis that reflects incomplete knowledge of the measured quantity. All measurements are subject to a degree of uncertainty, and a measured value is only complete if it is accompanied by a "Statement of the associated uncertainty". Relative uncertainty is the measurement uncertainty divided by the measured value.
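
    As a minimal numerical illustration of that last definition (a sketch only; the measured value and uncertainty below are hypothetical):

      # Relative uncertainty = standard measurement uncertainty / measured value
      measured_diameter_mm = 25.003     # hypothetical measured value
      uncertainty_mm = 0.002            # hypothetical standard uncertainty

      relative_uncertainty = uncertainty_mm / measured_diameter_mm
      print(f"relative uncertainty = {relative_uncertainty:.2e}")   # approx. 8.0e-05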

  3.

    DMIS (i.e. the Dimensional Measuring Interface Standard): ISO 22093:2011 defines "…a neutral language for communication…" between information systems and dimensional measurement equipment (DME), and is known simply as DMIS. DMIS is an execution language for measurement part programmes and provides an exchange format for metrology data such as features, tolerances and measurement results; it is the definitive standard for the communication of dimensional measurement programme sequences and results for manufacturing inspection. Accordingly, DMIS is widely utilised with CMMs, either as an intermediate file format between a CAD system and the CMM's bespoke proprietary inspection language, or as a programming language for direct control of the CMM itself.

  4.

    Error Budget: a very basic but suitably valid definition might be "Simply a list of all error sources and their effect on component part accuracy" (adapted from: Precision Manufacturing, by D.A. Dornfeld and D.-E. Lee, Springer Science Pub., 2008).

  5.

    ISO 22093:2011 Industrial automation systems and integration - Physical device control - Dimensional measuring interface standard (DMIS): this standard defines a neutral language for communication between information systems and dimensional measurement equipment (DME). Thus, DMIS is an execution language for measurement part programmes and provides an exchange format for metrology data such as features, tolerances and measurement results. DMIS conveys the product and equipment definitions, along with the process and reporting information, necessary to perform dimensional measurements employing coordinate metrology. DMIS contains product definitions for nominal features, feature constructions, dimensional and geometric tolerances, functional datums and part coordinate systems. It also communicates equipment definitions for various measurement sensors, measurement resources and machine parameters. Accordingly, DMIS instructs the DME's motions and measurements for product acceptance or verification, and for manufacturing process validation and control. Furthermore, DMIS guides the analysis of coordinate data to report and tag measurement results that ascertain product/process quality. Finally, to aid implementation, application functional subsets of DMIS have been defined to ensure successful interoperability and to allow validation of DMIS conformance; DMIS also addresses the associativity of DMIS product definitions with computer-aided design (CAD) information. NB Also see Footnote 3, for more information.

  6.

    Tool drift: this is a well-documented stochastic, tribological phenomenon that occurs whilst machining, for example, a medium-to-large-sized batch of parts (see Fig. 7.10, top) in any type of machining operation, such as external turning. Here, the flank wear on the cutting insert increases, which means that the workpiece diameter will also steadily and proportionally increase with the actual time in cut. This action necessitates periodic re-calibration/adjustment of the tooling being utilised, otherwise tight workpiece tolerances could be compromised by this incipient drifting of dimensional tolerances (see Fig. 7.11a). Of particular note is that a drilled hole produces the opposite effect: in this instance, the diameters of a batch of drilled holes get smaller, as a result of continuous wear on the drill's margin, or, in boring operations on batches of similar parts, due to the boring tool insert's increasing wear rate (Sources: Smith 1989, 2001, 2008, etc.).

  7.

    See the schematic representation of typical limits and fits for a characteristic hole and shaft, for suitable mating parts, shown in Fig. 7.6 (top), according to ISO 286 Parts 1 and 2, and ANSI B4.2 (1978). Beyond the above considerations, one also needs to read and comprehend the information on the calibration certificate, or the written specification for a particular test/measurement. It is worth re-emphasising that a measurement is not traceable unless it is quoted with a valid figure of uncertainty.

  8.

    Arithmetic mean: this term is denoted by the symbol '\( \overline{x} \)' (i.e. stated as x-bar) and is utilised in both Mathematics and Statistics. The arithmetic mean can be considered as "The sum of a collection of numbers divided by the quantity of numbers in that collection", or, stated slightly differently, as the "Arithmetic average of a set of values, or distribution". The mean is often confused with the median, mode or even the range; for skewed distributions in particular, the mean is not necessarily the same as the middle value (median), or the most likely value (mode). The simplified calculation for this arithmetic mean is given by

    Arithmetic mean '\( \overline{x} \)':

    $$ \overline{x} = \frac{1}{n}\sum\limits_{i = 1}^{n} {x_{i} } $$

    where 'n' is the number of observations.

  9.

    Standard deviation (SD): represented by the Greek letter sigma ('σ') in both Statistics and Probability theory, it shows how much variation, or dispersion, from the average exists. Thus, a low standard deviation indicates that the data points tend to be very close to the mean (also termed the expected value), whereas a high standard deviation indicates that the data points are spread out over a large range of values. The standard deviation of a random variable, statistical population, data set or probability distribution is the "Square root of its variance". NB An inflection point (where 'σ' occurs) is a position where a Gaussian curve changes its curvature from, say, convex to concave, at its point of tangency. Here, a random variable that is normally distributed will have its associated mean ('\( \overline{x} \)') and standard deviation ('σ'). An unbiased estimator for the variance is obtained by applying Bessel's correction (Friedrich Wilhelm Bessel (Born: 22 July 1784 in Minden, Minden-Ravensberg; died: 17 March 1846 in Königsberg, Prussia, now Russia) was a German Astronomer and Mathematician, known as the systematiser of the Bessel functions. It is worth mentioning, as an interesting aside, that Bessel was the first astronomer to determine the distance from our Sun to another star, by the method of parallax. Of particular note is that these types of functions were originally discovered by Daniel Bernoulli (Born: 8 February 1700; died: 17 March 1782), a notable Swiss Mathematician and Physicist, principally known for the Bernoulli principle, for his early work on the Kinetic theory of gases and for his research in the field of Thermodynamics), utilising 'N − 1' instead of 'N' to yield the Unbiased sample variance, which is denoted by 's²', thus

    $$ s^{2} = \frac{1}{N - 1}\sum\limits_{i = 1}^{N} {\left( {x_{i} - \overline{x} } \right)^{2} } $$

    This estimator is unbiased provided the variance exists and the sample values are drawn independently with replacement; the term 'N − 1' corresponds to the number of degrees of freedom in the vector of residuals \( \left( {x_{1} - \overline{x} , \ldots ,x_{N} - \overline{x} } \right) \). However, taking the square root reintroduces a small bias and yields the Corrected sample standard deviation, denoted by 's', which is often termed the estimated standard deviation; thus \( s = \sqrt {\frac{1}{N - 1}\sum\limits_{i = 1}^{N} {\left( {x_{i} - \overline{x} } \right)^{2} } } \).
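
    A minimal numerical sketch of these two estimators (the sample values are hypothetical; NumPy's ddof=1 argument applies the same N − 1 correction):

      import numpy as np

      # Hypothetical sample of five measured diameters (mm)
      x = np.array([25.001, 24.998, 25.003, 25.000, 24.999])

      n = x.size
      x_bar = x.mean()                            # arithmetic mean
      s2 = np.sum((x - x_bar) ** 2) / (n - 1)     # unbiased sample variance (Bessel's correction)
      s = np.sqrt(s2)                             # corrected sample standard deviation

      # np.var/np.std with ddof=1 give the same results
      assert np.isclose(s2, x.var(ddof=1)) and np.isclose(s, x.std(ddof=1))
      print(f"mean = {x_bar:.4f} mm, s = {s:.5f} mm")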

  10.

    Process capability index: in process improvement efforts, the process capability index 'C p' (or, alternatively, the process capability ratio) is a statistical measure of process capability, namely, the ability of a process to produce output within its specification limits. The concept of process capability only holds meaning for processes that are in a state of statistical control. Process capability indices measure how much natural variation a process experiences, relative to its specification limits, and allow different processes to be compared with respect to how well an organisation, or a company, can control them during a particular batch, or volume, production operation.
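
    For reference, the conventional textbook definitions of this simple capability index, and of its mean-centred counterpart, are (these expressions are not stated explicitly in the note above, so are given here in their usual form):

    $$ C_{p} = \frac{{\text{USL}} - {\text{LSL}}}{6\sigma }, \qquad C_{pk} = \min \left( {\frac{{\text{USL}} - \overline{x}}{3\sigma },\;\frac{\overline{x} - {\text{LSL}}}{3\sigma }} \right) $$

    where USL and LSL are the upper and lower specification limits and 'σ' is the estimated process standard deviation; a commonly quoted rule of thumb treats a process with C p (and C pk) of at least about 1.33 as capable.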

  11.

    Relative Precision Index (RPI): this is perhaps one of the oldest indices utilised for assessing production output capability. It is based on the ratio of the tolerance band to the mean range of the samples. In order to avoid the production of defective material, the specification width must be greater than the process variation; hence,

    $$ 2T > 6\sigma \qquad {\text{knowing that:}}\quad \sigma = \frac{\overline{R}}{d_{n}} = \frac{\text{Mean of sample ranges}}{{\text{Hartley's constant}}} $$

    [H.O. Hartley (born Herman Otto Hirschfeld (1912–1980)), more commonly called 'HOH', was a German-American statistician, who developed and then refined the so-called Hartley's test for the equality of variances (in 1950).]

    $$ {\text{So:}}\quad 2T > \frac{6\overline{R}}{d_{n}} \qquad {\text{therefore:}}\quad \frac{2T}{\overline{R}} > \frac{6}{d_{n}} $$

    Thus, \( 2T/\overline{R} \) is known as the RPI, and the value of \( 6/d_{n} \) is the minimum RPI required to avoid generating material outside the specification limits.

    [Sources: Oakland 1989/2003].
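
    A small sketch of this RPI check under the definitions above (the sample data are hypothetical; d_n is taken here as the commonly tabulated control-chart constant d2 for the given subgroup size):

      import numpy as np

      # Hartley's constant d_n (commonly tabulated as d2) for subgroup sizes 2-6
      D_N = {2: 1.128, 3: 1.693, 4: 2.059, 5: 2.326, 6: 2.534}

      def rpi_check(subgroups, tolerance_2t):
          """Return (RPI, minimum RPI) for equal-sized sample subgroups."""
          subgroups = np.asarray(subgroups, dtype=float)
          n = subgroups.shape[1]
          r_bar = np.mean(subgroups.max(axis=1) - subgroups.min(axis=1))  # mean sample range
          rpi = tolerance_2t / r_bar        # 2T / R-bar
          rpi_min = 6.0 / D_N[n]            # 6 / d_n
          return rpi, rpi_min

      # Hypothetical example: five subgroups of four turned diameters (mm), with 2T = 0.10 mm
      data = [[25.02, 25.01, 25.03, 25.00],
              [25.01, 25.02, 25.00, 25.02],
              [25.03, 25.01, 25.02, 25.00],
              [25.00, 25.02, 25.01, 25.03],
              [25.02, 25.00, 25.01, 25.02]]
      rpi, rpi_min = rpi_check(data, tolerance_2t=0.10)
      print(f"RPI = {rpi:.2f}, minimum RPI = {rpi_min:.2f} -> "
            f"{'capable' if rpi > rpi_min else 'not capable'}")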

  12.

    Shewhart charts (Walter Andrew Shewhart (Born: 18 March 1891, New Canton, Illinois; died: 11 March 1967) was a distinguished American Physicist, Engineer and Statistician, invariably known as "The Father of Statistical Quality Control". Dr. Shewhart was awarded a Doctorate in Physics from The University of California, Berkeley, in 1917, and he created the basis for the control chart, and the concept of a state of statistical control, by carefully designed experiments. While Dr. Shewhart drew on his knowledge of pure mathematical/statistical theories, he understood that data from physical processes never produce a normal distribution curve (i.e. Gaussian distribution). Moreover, he also went on to discover that observed variation in manufacturing data did not always behave in the same manner as data in nature (e.g. the Brownian motion of particles). Thus, Dr Shewhart concluded that while every process displays some form of variation, certain processes display controlled variation, which is natural to the process, whereas others display uncontrolled variation, which is not present in the process causal system at all times. NB Also see Footnote 16, for more specific details concerning Dr Shewhart's life and work) (X-bar, R and S Control Charts): accordingly, in statistical process control (SPC), the individuals/moving range chart is a type of control chart widely utilised to monitor variable data from an industrial process for which it is impractical to use rational subgroups. The chart is appropriate in the following situations: where automation allows inspection of each unit, so that rational subgrouping has less benefit; where production is slow, so that waiting for enough samples to form a rational subgroup would unacceptably delay monitoring; and for processes that produce homogeneous batches, where repeat measurements vary primarily because of measurement error. This chart configuration actually consists of a pair of charts: (i) the individuals chart, which displays the individual measured values; and (ii) the moving range chart, which displays the difference from one point to the next. As with other quality control charts, these two charts enable the user to monitor a manufacturing process for shifts that could potentially alter the mean, or the variance, of the measured statistic (see Fig. 7.8).
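
    A minimal sketch of how such individuals/moving-range (I-MR) chart limits are conventionally computed (the constants 2.66 and 3.267 are the standard I-MR factors, i.e. 3/d2 and D4 for subgroups of size 2; the readings are hypothetical):

      import numpy as np

      def imr_limits(x):
          """Compute individuals and moving-range control limits for a 1-D data series."""
          x = np.asarray(x, dtype=float)
          mr = np.abs(np.diff(x))          # moving ranges between successive points
          x_bar, mr_bar = x.mean(), mr.mean()
          return {
              "I centre":  x_bar,
              "I UCL":     x_bar + 2.66 * mr_bar,   # 2.66 = 3 / d2 (d2 = 1.128 for n = 2)
              "I LCL":     x_bar - 2.66 * mr_bar,
              "MR centre": mr_bar,
              "MR UCL":    3.267 * mr_bar,          # D4 for n = 2
              "MR LCL":    0.0,
          }

      # Hypothetical sequence of individual diameter readings (mm)
      readings = [25.01, 25.02, 25.00, 25.03, 25.01, 25.02, 25.04, 25.01]
      for name, value in imr_limits(readings).items():
          print(f"{name:9s}: {value:.4f}")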

  13.

    An often-quoted, somewhat comical analogy of the overall measurement problem, when attempting an inspection procedure, describes it as being: "To attempt to measure the size of an ice cube, in a warm room!"

  14.

    If an instrument has not been previously calibrated, then its unknown condition will compromise any future measurements taken with it.

  15.

    As examples of these potential sampling issues: any temperature measurement should be taken close to the item being monitored, and certainly not away from its local thermal environment. When selecting samples from the production line, always ensure that they do not originate from the initial start-up batch run, nor from a mixed stillage, where in this latter circumstance the samples may have been drawn from two separate production process runs.

  16.

    Dr. Walter Andrew Shewhart (given here with a more in-depth treatment; also see Footnote 12): (Born: 18 March 1891 in New Canton, Il., USA; died: 11 March 1967 in Troy Hills, NJ., USA) was a significant American Physicist, Engineer and Statistician, invariably known as the "Father of Statistical Quality Control". He attended the University of Illinois, obtaining both a Bachelor's (1913) and a Master's (1914) degree in Physics, before being awarded his Doctorate, also in Physics, from the University of California, at Berkeley, in 1917. The original concepts of total quality management (TQM) and of continuous improvement can be traced back to this former Bell Telephone employee. He was one of W. Edwards Deming's original teachers and, at the time, he critically emphasised the importance of adapting management processes to create profitable situations for both businesses and consumers, promoting the utilisation of his own creation, the SPC control chart. Dr. Shewhart believed that a lack of relevant information greatly hindered the efforts of control and management processes within a production environment; so, in order to assist the manager in making scientific, efficient and economical decisions, he developed these practical statistical process control (SPC) techniques. Today, many of the modern ideas regarding quality-related issues owe their inspiration to his formative work. Furthermore, he also developed the practical technique termed the Shewhart cycle, which combined creative management thinking with statistical analysis. This simple management cycle contains four continuous steps: (i) Plan, (ii) Do, (iii) Study and (iv) Act. These steps (most commonly referred to as the PDSA cycle), Shewhart believed, would ultimately lead to total quality improvement. The Shewhart cycle developed its philosophical strategy from the notion that the constant evaluation of management practices, as well as the willingness of management to adopt and to disregard unsupported ideas, are the key elements in the evolution of a successful commercial enterprise.

  17.

    Continuous improvement, or Kaizen, was developed in Japan: the term is used for product improvement, or change for the better. It refers to a philosophy, or to working practices, that focus upon the continuous improvement of processes in manufacturing, engineering, business management, or indeed any process. When Kaizen is utilised in the business sense and applied to the workplace, it refers to activities that continually improve all functions and involve all company employees, from the CEO down to the assembly line workers. Furthermore, Kaizen also applies to processes, such as purchasing and logistics, which cross organisational boundaries into the supply chain. By improving standardised activities and processes, Kaizen aims to eliminate waste (i.e. as per the Lean manufacturing philosophy). Historically, Kaizen was first implemented in several of the major Japanese businesses after the Second World War, influenced in part by the American business and quality management experts who visited the country at that time. Kaizen has since spread throughout the world and is now also being implemented in environments outside business and productivity.

  18.

    These ±3 sigma limits (see Fig. 7.8) are also practically known as the upper/lower action limits (denoted by UAL and LAL, respectively, and nominally set at the 0.001 probability, i.e. a 1-in-1000 chance of occurrence), whereas the ±2 sigma limits are the upper/lower warning limits (denoted by UWL and LWL, respectively, and nominally set at the 0.025 probability, i.e. a 1-in-40 chance of occurrence).
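
    A small sketch of how those probabilities map onto the normal distribution (the exact one-sided multipliers are about 3.09σ for 0.001 and 1.96σ for 0.025, which is why the limits are quoted nominally as 3 and 2 sigma; the process mean and sigma below are hypothetical):

      from scipy.stats import norm

      x_bar, sigma = 25.00, 0.01          # hypothetical process mean and standard deviation (mm)

      z_action  = norm.ppf(1 - 0.001)     # approx. 3.09: 1-in-1000 chance of exceeding the limit
      z_warning = norm.ppf(1 - 0.025)     # approx. 1.96: 1-in-40 chance of exceeding the limit

      print(f"UAL/LAL: {x_bar + z_action * sigma:.4f} / {x_bar - z_action * sigma:.4f} mm")
      print(f"UWL/LWL: {x_bar + z_warning * sigma:.4f} / {x_bar - z_warning * sigma:.4f} mm")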

  19.

    Probability density function (PDF): this is utilised mainly in probability and statistical theory. Density estimation is the construction of an estimate, based upon observed data, of an unobservable underlying probability density function. The unobservable density function is thought of as the density according to which a large population is distributed, and the data are usually considered as a random sample from that population. A variety of approaches to density estimation are utilised, including Parzen windows: in Statistics, kernel density estimation (KDE) is a non-parametric method of estimating the PDF of a random variable. Here, KDE is a fundamental data-smoothing problem where inferences about the population are made based on a finite data sample. In some research fields, it is also termed the Parzen–Rosenblatt window method, after Emanuel Parzen (an American statistician who researched signal detection theory and time series analysis) and Murray Rosenblatt (also an American statistician, specialising in time series analysis); these researchers are usually credited with independently creating such windows in their current form. There are also a range of related data clustering techniques, including vector quantization. Furthermore, the most basic form of density estimation is simply a rescaled histogram.
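
    A minimal sketch of a Gaussian Parzen-window (kernel) density estimate (the data and bandwidth are hypothetical; scipy.stats.gaussian_kde would give a comparable result):

      import numpy as np

      def gaussian_kde(samples, grid, bandwidth):
          """Parzen-window density estimate with a Gaussian kernel."""
          samples = np.asarray(samples, dtype=float)[:, None]   # shape (n, 1)
          grid = np.asarray(grid, dtype=float)[None, :]          # shape (1, m)
          kernels = np.exp(-0.5 * ((grid - samples) / bandwidth) ** 2)
          kernels /= bandwidth * np.sqrt(2.0 * np.pi)
          return kernels.mean(axis=0)                            # average kernel over samples

      # Hypothetical measured values and evaluation grid
      data = [9.98, 10.00, 10.01, 10.02, 10.02, 10.05]
      grid = np.linspace(9.9, 10.1, 5)
      print(np.round(gaussian_kde(data, grid, bandwidth=0.02), 3))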

  20.

    Return on Investment (ROI): in business and commerce, the purpose of the return on investment (ROI) metric is to measure, per period, the rate of return on money invested in an economic entity, in order to decide whether or not to undertake an investment. Furthermore, it is also utilised as an indicator to compare different project investments within a project portfolio, so that the project with the best ROI is prioritised. ROI and related metrics provide a snapshot of profitability, adjusted for the size of the investment assets tied up in the enterprise. Here, the ROI is often compared to the expected, or required, rates of return on the money invested. The ROI is not net-present-value-adjusted, and most texts describe it with a Year 0 investment and two to three years' income. Marketing decisions have an obvious potential connection to the numerator of ROI (i.e. profits), but these same decisions often influence asset usage and capital requirements (e.g. receivables and inventories). Marketers should understand the position of their company and the returns expected.
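
    As a simple illustrative calculation (the figures are hypothetical): for an investment of 200,000 that yields a net gain of 50,000 over the period considered,

    $$ {\text{ROI}} = \frac{\text{net gain}}{\text{cost of investment}} = \frac{50{,}000}{200{,}000} = 0.25 = 25\,\% $$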

  21.

    50-off sequential component parts from the production process: here, it is normal practice when undertaking a typical capability study to obtain consecutive parts from the machine tool within this production batch, to effectively establish the consistency of the overall production output. Invariably, the first-off part is disregarded, so that the production process has stabilised before the sample is taken. Moreover, statistically speaking, an odd number is normally employed for the inspection sample size, thereby reducing the likelihood of the so-called flat-topped distribution occurring; hence, invariably, 49-off components are normally assessed in such capability trials (see Fig. 7.13).

  22.

    If the plotted points on the probability paper fall on a straight line, this denotes that a normal/Gaussian distribution has occurred, which would normally be expected. However, if these plotted points appear as either a concave or a convex curve, this indicates a skewed distribution (in either direction), meaning that a systematic error is present; this requires further investigation and the machine's production operation should be stopped immediately. Of note is that if the plotted points appear as an extended S-curve on the probability paper, this normally denotes a bi-modal (twin-humped) distribution, meaning either that two processes have been accidentally merged, or that two tools from the tooling magazine have been employed in this machining trial, possibly having different tool offsets.
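
    A small sketch of the equivalent numerical check of that straight-line behaviour (scipy's probplot returns the ordered data against theoretical normal quantiles, plus a straight-line fit whose correlation coefficient r indicates how linear the plot is; the readings are hypothetical):

      import numpy as np
      from scipy import stats

      # Hypothetical sample of turned diameters (mm)
      x = np.array([25.01, 25.02, 25.00, 25.03, 25.01, 25.02, 25.04, 25.00, 25.02, 25.01])

      (quantiles, ordered), (slope, intercept, r) = stats.probplot(x, dist="norm")
      print(f"straight-line correlation r = {r:.3f}")   # r close to 1 suggests a normal distribution
      # the slope approximates the standard deviation, the intercept the mean, of the fitted normal
      print(f"fitted sigma approx. {slope:.4f} mm, fitted mean approx. {intercept:.4f} mm")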

  23.

    On the probability paper, the distance from the '\( \overline{x} \)' (50 %) line to the line 34 % away from it (i.e. the 16 % line) equates to the value of one standard deviation ('σ'), see Fig. 7.7b.

References

Journal and Conference Papers

  • Abbe, M. & Takamasu, K., Reliability on calibration CMM, Measurement, 359-368, 2003.

  • Bauman, C., De Heck, J., Leonard, E. & Merrick, M., SPC: Basic control charts: theory and construction, sample size, x-bar, r charts, s charts, University of Michigan (USA), retrieved: May 2014.

  • Bounazef, D., Chabani, S., Idir, A. & Bounazef, M., Management Analysis of Industrial Production Losses by the Design of Experiments, Statistical Process Control, and Capability Indices, Open Journal of Business and Management (OJBM), Vol. 2 (1), Jan. 2014.

  • Boyles, R., The Taguchi Capability Index, J. of Quality Technol., Vol. 23 (1), 17-26, American Society for Quality Control Pub. (Milwaukee, Wisconsin), 1991.

  • Breuckmann, B. & Langenbeck, P., In Process Interferometry by Real-Time Fringe-Analysis, Proc. SPIE 1015 - Micromachining Optical Components and Precision Engineering, Vol. 1015, 45, April 11, 1989.

  • Chen, C., Evaluation of measurement uncertainty for thermometers with calibration equations, Accreditation and Quality Assurance, Vol. 11 (1-2), 75-82, April 2006.

  • Cox, M., Forbes, A., Harris, P. & Matthews, C., Numerical Aspects in the Evaluation of Measurement Uncertainty, Uncertainty Quantification in Scientific Computing, IFIP Advances in Information and Communication Technology, Vol. 377, 180-194, 2012.

  • Gibson, P.R. & Hoang, K., Automatic statistical process control of a CNC turning centre using tool offsets and tool change, Int. J. of Adv. Manufact. Technol., Vol. 9 (3), 147-155, 1994.

  • Hryniewicz, O., Application of ISO 3951 Acceptance Sampling Plans to the Inspection by Variables in Statistical Process Control (SPC), Frontiers in Statistical Quality Control, Vol. 6, 80-92, 2001.

  • Kechagioglou, I., Uncertainty and Confidence in Measurement, Dept. of Applied Informatics, University of Macedonia, Greek Statistical Institute, Proceedings of the 18th Panhellenic Statistics Conference, pp. 441-449, 2005.

  • Knebel, R., Scanning for better results: Scanning CMMs measure production parts more accurately than touch trigger CMMs, American Machinist, Nov. 1999.

  • Kureková, E., Measurement Process Capability - Trends and Approaches, Measurement Science Review, Vol. 1 (1), 2001.

  • Leach, R., Some issues of traceability in the field of surface topography measurement, Wear, Vol. 257 (12), 1246-1249, Dec. 2004.

  • Leach, R., Chetwynd, D., Blunt, L., Haycocks, J., Harris, P., Jackson, K., Oldfield, S. & Reilly, S., Recent advances in traceable nanoscale dimension and force metrology in the UK, Measurement Science and Technology, Vol. 17 (3), 2006.

  • Moona, G., Sharma, R., Kiran, U. & Chaudhary, K.P., Evaluation of Measurement Uncertainty for Absolute Flatness Measurement by Using Fizeau Interferometer with Phase-Shifting Capability, MAPAN - J. of Metrology Society of India, Vol. 29 (4), 261-267, Dec. 2014.

  • Motorcu, A.R. & Güllü, A., Statistical process control in machining, a case study for machine tool capability and process capability, Materials & Design, Vol. 27 (5), 364-372, 2006.

  • Phillips, S.D., Measurement Uncertainty and Traceability Issues: A Standards Activity Update, Precision Engineering Division, The National Institute of Standards and Technology, 2004.

  • Piratelli-Filho, A., CMM uncertainty analysis with factorial design, Precision Engineering, Vol. 27, 283-288, 2003.

  • Potzick, J.E., Accuracy and traceability in dimensional measurements, Proc. SPIE 3332, Metrology, Inspection, and Process Control for Microlithography XII, 471, June 8, 1998.

  • Prajapati, D.R., Implementation of SPC Techniques in Automotive Industry: A Case Study, Int. J. of Emerging Technol. and Adv. Eng'g., Vol. 2 (3), 227, Mar. 2012.

  • Sultana, F., Razive, N.I. & Azeem, A., Implementation of Statistical Process Control (SPC) for Manufacturing Performance Improvement, J. of Mech. Eng'g., Vol. 40 (1), 15-21, 2009.

  • Tannock, J.D.T., Statistical process control software, data collection and computer-aided inspection, Automating Quality Systems, 181-197, 1992.

  • Valdés, R.A., Di Giacomo, B. & Paziani, F.T., Synthesization of thermally induced errors in coordinate measuring machines, J. Braz. Soc. Mech. Sci. & Eng., Vol. 27 (2), Rio de Janeiro, April/June 2005.

  • Wang, Q., Zissler, N. & Holden, R., Evaluate error sources and uncertainty in large scale measurement systems, Robotics and Computer-Integrated Manufacturing, Vol. 29 (1), 1-11, Feb. 2013.

  • Wells, C.V., Principles and Applications of Measurement Uncertainty Analysis in Research and Calibration, Third Annual Infrared Radiometric Sensor Calibration Symposium, September 14-17, 1992, Utah State University, Logan, Utah, USA, NREL/TP-411-5165, UC Category: 270, DE93000034, 1992.

  • Zhongcheng, Y., Uncertainty analysis and variation reduction of three-dimension coordinate metrology. Part 2: uncertainty analysis, Machine Tools & Manufact., Vol. 39, 1219-1238, 1999.

Books, Booklets and Guides

  • ASME B89.7.3.1-2001: Guidelines for Decision Rules: Considering Measurement Uncertainty in Determining Conformance to Specifications.

  • Bell, S., Measurement Good Practice Guide No. 11: A Beginner's Guide to Uncertainty of Measurement, Tech. Report, National Physical Laboratory (UK), 1999.

  • Birch, K., Estimating Uncertainties in Testing - An Intermediate Guide to Estimating and Reporting Uncertainty of Measurement in Testing, Measurement Good Practice Guide No. 36, British Measurement and Testing Association, 2003.

  • Cardelli, F., Materials Handbook, Springer-Verlag London Ltd., 2000.

  • Cox, M.G. & Harris, P.M., Best Practice Guide No. 6: Uncertainty Evaluation, Tech. Report DEM-ES-011, National Physical Laboratory (UK), 2006.

  • Cox, M.G. & Harris, P.M., Software Specifications for Uncertainty Evaluation, Tech. Report DEM-ES-010, National Physical Laboratory (UK), 2006.

  • Dieck, R.H., Measurement Uncertainty: Methods and Applications (4th Ed.), ISA Pub., Dec. 2006.

  • Dornfeld, D.A. & Lee, D.-E., Precision Manufacturing, Springer Science Pub., 2008.

  • Geometric Dimensioning and Tolerancing Handbook: Applications, Analysis & Measurement (GDT-HDBK), American Society of Mechanical Engineers (ASME), 2009.

  • Ghahramani, S., Fundamentals of Probability (2nd Ed.), Prentice Hall Pub. (New Jersey), 2000.

  • Grabe, M., Measurement Uncertainties in Science and Technology, Springer Pub., 2005.

  • Griffiths, B., Engineering Drawing for Manufacture, Kogan Page Science Pub., 2003.

  • Grous, A., Applied Metrology for Manufacturing Engineering, ISTE Ltd & John Wiley & Sons, Inc., 2011.

  • Imai, M., Kaizen: The Key to Japan's Competitive Success, McGraw-Hill/Irwin Pub., 1986.

  • ISO 22093:2011 - Industrial automation systems and integration - Physical device control - Dimensional Measuring Interface Standard (DMIS).

  • Leach, R., Fundamental Principles of Engineering Nanometrology (2nd Ed.), Elsevier Inc., 2014.

  • Measurement Uncertainty, United Kingdom Accreditation Service (UKAS), 2014.

  • Montgomery, D., Introduction to Statistical Quality Control, John Wiley & Sons, Inc. (New York), 2004.

  • Murdoch, J. & Barnes, J.A., Statistical Tables (2nd Ed.), Macmillan Press, 1995.

  • Oakland, J.S., Statistical Process Control (5th Ed.), Butterworth-Heinemann Pub., 2003.

  • Salicone, S., Measurement Uncertainty: An Approach via the Mathematical Theory of Evidence (Springer Series in Reliability Engineering), Springer Science Pub., Dec. 2006.

  • Shewhart, W.A., Economic Control of Quality of Manufactured Product, Van Nostrand (New York), 1931.

  • Slocum, A.H., Precision Machine Design, Society of Manufacturing Engineers Pub., 1992.

  • Smith, G.T., Advanced Machining - The Handbook of Cutting Technology, IFS/Springer-Verlag Pub., 1989.

  • Smith, G.T., Industrial Metrology - Surface and Roundness, Springer-Verlag Pub., 2001.

  • Smith, G.T., Cutting Tool Technology - Industrial Handbook, Springer-Verlag Pub., 2008.

  • Statistical Process Control Techniques - An Introduction, Statit Software, 2014.

  • Stigler, S.M., The History of Statistics: The Measurement of Uncertainty Before 1900, Belknap Press, Mar. 1990.

  • Stout, K., Quality Control in Automation, Kogan Page Pub. (London), 1985.

  • Study Guide for Certification of Geometric Dimensioning and Tolerancing Professionals (GDTP), American Society of Mechanical Engineers (ASME), 2013.

  • What is Process Capability? NIST/Sematech Engineering Statistics Handbook, National Institute of Standards and Technology, retrieved May 2014.

  • Wheeler, D.J., Understanding Statistical Process Control (3rd Ed.), SPC Press & Statistical Process Control, Inc., 2013.

  • Willink, R., Measurement Uncertainty and Probability, Cambridge Univ. Press, March 2013.


Author information

Correspondence to Graham T. Smith.


Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Smith, G.T. (2016). Uncertainty of Measurement and Statistical Process Control. In: Machine Tool Metrology. Springer, Cham. https://doi.org/10.1007/978-3-319-25109-7_7

  • DOI: https://doi.org/10.1007/978-3-319-25109-7_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-25107-3

  • Online ISBN: 978-3-319-25109-7

  • eBook Packages: Engineering (R0)
