An error model for evaluation and interpretation of proficiency testing

  • General Paper
  • Published in: Accreditation and Quality Assurance

Abstract 

Data from proficiency testing can be used to increase our knowledge of the performance of populations of laboratories, of individual laboratories, and of different measurement methods. To support the evaluation and interpretation of proficiency-testing results, an error model containing different random and systematic components is presented. From a single round of a proficiency testing scheme, the total variation in a population of laboratories can be estimated. With results from several rounds, the random variation can be separated into a laboratory component and a time component, and for individual laboratories it then also becomes possible to evaluate stability and bias relative to the population mean. By comparing results from laboratories using different methods, systematic differences between methods may be indicated. Using results from several rounds, such a systematic difference can be partitioned into two components: a common systematic difference, possibly depending on the level, and a sample-specific component. It is essential to distinguish between these two components, as the former may be eliminated by a correction while the latter must be treated as a random component in the evaluation of uncertainty.
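A minimal sketch of the decomposition described above, with notation assumed here for illustration rather than taken from the paper, could read:

$$
x_{ij} = m_j + c_k + d_{kj} + b_i + e_{ij}
$$

where, under this assumed notation, $x_{ij}$ is the result reported by laboratory $i$ (using method $k$) in round $j$, $m_j$ is the population mean for that round, $c_k$ is a common systematic difference of method $k$ (possibly level-dependent), $d_{kj}$ is the sample-specific component of the method difference, $b_i$ is the bias of laboratory $i$ relative to the population mean, and $e_{ij}$ is a random component varying between rounds. Read this way, a single round only yields the combined spread of these terms, whereas several rounds allow the variance of $b_i$ (laboratory component) to be separated from that of $e_{ij}$ (time component) and $c_k$ to be separated from $d_{kj}$; only $c_k$ can be removed by a correction, while $d_{kj}$ must be carried as a random contribution to the uncertainty.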

Additional information

Received: 20 November 2000. Accepted: 3 January 2001.

Cite this article

Nilsson, G. An error model for evaluation and interpretation of proficiency testing. Accred Qual Assur 6, 147–150 (2001). https://doi.org/10.1007/s007690100310
