
Does-It-Work? Metrology for Functionality and Efficacy

Executive Decision Synthesis

Part of the book series: Contributions to Management Science


Abstract

We unpack this question systematically and rigorously. First, we make clear that the question cannot be addressed as if we were discussing a light bulb, a used car, or a radio. To answer “does it work?” in a meaningful and thoughtful way, we adopt the approach used by pharmaceutical companies. Demonstrating that a drug works is a stringent process, regulated by unforgiving laws. A drug works if its developers can verify that it is functional: the science must be valid. Its efficacy must then be verified with people: the science and the statistics must be valid. A drug works if and only if both its functionality and its efficacy are verified. This is the standard we seek. Second, verification requires instruments, a measurement system, and processes that specify how the instruments are to be used and how measurement data are to be analyzed and interpreted. Simply stated, a metrology must exist, and we need one for our paradigm. Regrettably, despite our best efforts, we were unable to find a metrology for prescriptive decision paradigms. We therefore developed a metrology and a measurement instrument. To our knowledge, this is a first in the field and a meaningful contribution. We invite scholars to research this subject and add to the body of knowledge of metrology in the praxis of decision theory.


Notes

  1.

    Henceforth, we will frequently refer to our paradigm and its methods simply as our “methodology”.

  2.

    Although the propensity of politicians to do so is not negligible. For example, the New York Times (March 29, 2016) opines: “The move to raise the statewide minimum wage would make California a guinea pig in a bold economic experiment.”

  3.

    http://www.eecs.qmul.ac.uk/~andrea/dwnld/SSPE07/AC_SSPE2007.pdf. Downloaded 10 April 2017.

  4.

    Morrow (2012, p. 207).

  5.

    Lean acknowledges that necessary work that does not add customer value is indeed pervasive, e.g., filing tax reports.

  6.

    http://www.u-s-history.com/pages/h1624.html. Downloaded 10 April 2017.

References

  • ADI, Inc. (2015). Downloaded January 20, 2016, from http://files.shareholder.com/downloads/ADI/1384072617x0x872073/9B336071-EF60-43AF-9E98-A424EEF6634C/2015_AnalogDevices_AR.FINAL_for_Posting.pdf

  • Arthur, W. B. (2009). The nature of technology. New York: Free Press.

  • Bankes, S. (1993, May). Exploratory modeling for policy analysis. Operations Research, 41(3), 435–449.

  • Baron, J. (2000). Thinking and deciding (3rd ed.). Cambridge: Cambridge University Press.

  • Bazerman, M. H. (2002). Judgment in managerial decision making (5th ed.). New York: Wiley.

  • Bell, D. E., Raiffa, H., & Tversky, A. (Eds.). (1988). Decision making: Descriptive, normative, and prescriptive interactions. Cambridge: Cambridge University Press.

  • BIPM, IEC, IFCC, ILAC, ISO, IUPAC, IUPAP, & OIML. (2008). The international vocabulary of metrology—Basic and general concepts and associated terms (VIM) (3rd ed.). JCGM 200:2012. Joint Committee for Guides in Metrology (JCGM).

  • Booth, M. (2009). Downloaded January 18, 2016, from www.cs.nott.ac.uk/~pszcah/G53QAT/…/QAT09Report-mxb17u.doc

  • Borsboom, D., & Markus, K. A. (2013). Truth and evidence in validity theory. Journal of Educational Measurement, 50(1), 110–114.

  • Borsboom, D., Mellenbergh, G. J., & van Heerden, J. (2004). The concept of validity. Psychological Review, 111(4), 1061.

  • Brady, J. E., & Allen, T. T. (2006). Six Sigma literature: A review and agenda for future research. Quality and Reliability Engineering International, 22(3), 335–367.

  • Browning, T. R. (2000). Value-based product development: Refocusing lean. In Engineering Management Society, 2000. Proceedings of the 2000 IEEE (pp. 168–172). IEEE.

  • Carlile, P. R. (2004). Transferring, translating, and transforming: An integrative framework for managing knowledge across boundaries. Organization Science, 15(5), 555–568.

  • Chandler, A. D. (2004). Scale and scope: The dynamics of industrial capitalism. New York: Belknap Press.

  • Cherns, A. (1976). The principles of sociotechnical design. Human Relations, 29(8), 783–792.

  • Clegg, C. W. (2000). Sociotechnical principles for system design. Applied Ergonomics, 31(5), 463–477.

  • Cole, R. E. (2002). From continuous improvement to continuous innovation. Total Quality Management, 13(8), 1051–1056.

  • Creveling, C. M., Slutsky, J., & Antis, D. (2002). Design for Six Sigma in technology and product development. Upper Saddle River, NJ: Prentice Hall.

  • DOD 5000. (2002, April 5). Mandatory procedures for major defense acquisition programs (MDAPS) and major automated information systems (MAIS) acquisition programs.

  • Dolan, R. J., & Matthews, J. M. (1993). Maximizing the utility of customer product testing: Beta test design and management. Journal of Product Innovation Management, 10(4), 318–330.

  • Dubos, G. F., Saleh, J. H., & Braun, R. (2008). Technology readiness level, schedule risk, and slippage in spacecraft design. Journal of Spacecraft and Rockets, 45(4), 836–842.

  • Dym, C. L., Little, P., & Orwin, E. (2013). Engineering design: A project-based introduction (4th ed.). New York: Wiley.

  • Forrester, A., Sobester, A., & Keane, A. (2008). Engineering design via surrogate modelling: A practical guide. Chichester: Wiley.

  • GAO. (2014, March). Defense acquisitions assessments of selected weapons programs. GAO-14-340SP. Government Accountability Office.

  • Gerstner, L. V., Jr. (2009). Who says elephants can’t dance? Grand Rapids, MI: Zondervan.

  • Goh, T. N. (2002). A strategic assessment of Six Sigma. Quality and Reliability Engineering International, 18(5), 403–410.

  • Golafshani, N. (2003). Understanding reliability and validity in qualitative research. The Qualitative Report, 8(4), 597–606.

  • Graettinger, C. P., Garcia, S., Siviy, J., Schenk, R. J., & Van Syckle, P. J. (2002, September). Using the technology readiness levels scale to support technology management in the DoD's ATD/STO environments. Army CECOM.

  • Jones, K. (1998). Simulations: Reading for action. Simulation & Gaming, 29, 326–327.

  • Kaplan, C., Clark, R., & Tang, V. (1995). Secrets of software quality: 40 innovations from IBM. New York: McGraw-Hill.

  • Koch, P. N., Yang, R. J., & Gu, L. (2004). Design for Six Sigma through robust optimization. Structural and Multidisciplinary Optimization, 26(3–4), 235–248.

  • Mankins, J. C. (1995, April 6). Technology readiness levels: A white paper. Downloaded February 8, 2009, from http://ipao.larc.nasa.gov/Toolkit/TRL.pdf

  • Mankins, J. C. (2009). Technology readiness assessments: A retrospective. Acta Astronautica, 65(9), 1216–1223.

  • March, J. G. (1997). Understanding how decisions happen in organizations. In Z. Shapira (Ed.), Organizational decision making. Cambridge: Cambridge University Press.

  • Messick, S. (1989). Validity. In R. L. Linn (Ed.), Educational measurement (3rd ed., pp. 13–103). New York: American Council on Education/Macmillan.

  • MIT. (2016). 10 Breakthrough technologies 2016. MIT Technology Review, 119(2), 33–67.

  • Morrow, R. (2012). Utilizing the 3Ms of process improvements in health care. Boca Raton, FL: CRC Press.

  • Oppenheim, B. W. (2004). Lean product development flow. Systems Engineering, 7(4).

  • Otto, K. N., & Wood, C. (2001). Product design: Techniques in reverse engineering and new product development. Upper Saddle River, NJ: Prentice Hall.

  • Pahl, G., & Beitz, W. (1999). Engineering design: A systematic approach. London: Springer.

  • Pyzdek, T., & Keller, P. A. (2003). The Six Sigma handbook (Vol. 486). New York, NY: McGraw-Hill.

  • Rittel, H. W., & Webber, M. M. (1973). Dilemmas in a general theory of planning. Policy Sciences, 4(2), 155–169.

  • Rumelt, R. P., Schendel, D., & Teece, D. J. (Eds.). (1994). Fundamental issues in strategy: A research agenda. Boston: Harvard Business Press.

  • Simon, H. A. (2001). The sciences of the artificial (3rd ed.). Cambridge, MA: MIT Press.

  • Sterman, J. D. (2000). Business dynamics: Systems thinking and modeling for a complex world. Boston, MA: Irwin McGraw-Hill.

  • Sterman, J. D., Repenning, N. P., & Kofman, F. (1997). Unanticipated side effects of successful quality programs: Exploring a paradox of organizational improvement. Management Science, 43(4), 503–521.

  • Tang, V., & Otto, K. N. (2009). Multifunctional enterprise readiness: Beyond the policy of build-test-fix cyclic rework. In Proceedings of the ASME 2009 International Design Engineering Technical Conferences, Design Theory and Methodology Conference (IDETC/DTM 2009). DETC2009-86740. Aug 30–Sept 2, 2009, San Diego, CA.

  • Viana, F. A., & Haftka, R. T. (2008, July). Using multiple surrogates for metamodeling. In Proceedings of the 7th ASMO-UK/ISSMO International conference on engineering design optimization (pp. 1–18).

  • Ward, A. C., & Sobek, D. K., II. (2014). Lean product and process development. Cambridge, MA: Lean Enterprise Institute.

  • Weick, K. E. (2001). Making sense of the organization. Oxford: Blackwell.

  • Yang, K., & El-Haik, B. S. (2003). Design for Six Sigma (pp. 184–186). New York: McGraw-Hill.


Appendices

Appendix 4.1 NASA Definition of Technology Readiness Levels

TRL 1

Basic principles observed and reported. Transition from scientific research to applied research. Essential characteristics and behaviors of systems and architectures. Descriptive tools are mathematical formulations or algorithms.

TRL 2

Technology concept and/or application formulated. Applied research. Theory and scientific principles are focused on specific application area to define the concept. Characteristics of the application are described. Analytical tools are developed for simulation or analysis of the application.

TRL 3

Analytical and experimental critical function and/or characteristic proof-of-concept. Proof of concept validation. Active research and development (R&D) is initiated with analytical and laboratory studies. Demonstration of technical feasibility using breadboard or brassboard implementations that are exercised with representative data.

TRL 4

Component/subsystem validation in laboratory environment. Standalone prototyping implementation and test. Integration of technology elements. Experiments with full-scale problems or data sets.

TRL 5

System/subsystem/component validation in relevant environment. Thorough testing of prototyping in representative environment. Basic technology elements integrated with reasonably realistic supporting elements. Prototyping implementations conform to target environment and interfaces.

TRL 6

System/subsystem model or prototyping demonstration in a relevant end-to-end environment (ground or space): Prototyping implementations on full-scale realistic problems. Partially integrated with existing systems. Limited documentation available. Engineering feasibility fully demonstrated in actual system application.

TRL 7

System prototyping demonstration in an operational environment (ground or space): System prototyping demonstration in operational environment. System is at or near scale of the operational system, with most functions available for demonstration and test. Well integrated with collateral and ancillary systems. Limited documentation available.

TRL 8

Actual system completed and “mission qualified” through test and demonstration in an operational environment (ground or space). End of system development. Fully integrated with operational hardware and software systems. Most user documentation, training documentation, and maintenance documentation completed. All functionality tested in simulated and operational scenarios. Verification and Validation (V&V) completed.

TRL 9

Actual system “mission proven” through successful mission operations (ground or space). Fully integrated with operational hardware/software systems. Actual system has been thoroughly demonstrated and tested in its operational environment. All documentation completed. Successful operational experience. Sustaining engineering support in place.

Source: https://esto.nasa.gov/files/trl_definitions.pdf. Downloaded January 20, 2016.
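
Because the nine levels form an ordinal scale, they are straightforward to encode for program-tracking purposes. The minimal Python sketch below is our own illustration, not part of the NASA source: it captures the scale as an enumeration, with one-line paraphrases of the definitions above, and computes the readiness gap between an assessed level and a required level.

    from enum import IntEnum

    class TRL(IntEnum):
        """NASA Technology Readiness Levels, paraphrased from Appendix 4.1."""
        BASIC_PRINCIPLES = 1         # basic principles observed and reported
        CONCEPT_FORMULATED = 2       # technology concept and/or application formulated
        PROOF_OF_CONCEPT = 3         # analytical/experimental proof-of-concept
        LAB_VALIDATION = 4           # component/subsystem validation in the laboratory
        RELEVANT_ENV_VALIDATION = 5  # validation in a relevant environment
        RELEVANT_ENV_DEMO = 6        # prototype demonstration in a relevant end-to-end environment
        OPERATIONAL_DEMO = 7         # system prototype demonstration in an operational environment
        MISSION_QUALIFIED = 8        # actual system completed and "mission qualified"
        MISSION_PROVEN = 9           # actual system "mission proven" in operations

    def readiness_gap(assessed: TRL, required: TRL) -> int:
        """Number of levels still to climb; 0 if the assessed level already suffices."""
        return max(0, int(required) - int(assessed))

    # Example: a proof-of-concept technology targeted at an operational demonstration.
    print(readiness_gap(TRL.PROOF_OF_CONCEPT, TRL.OPERATIONAL_DEMO))  # prints 4

Encoding the scale this way is only bookkeeping; the substantive burden remains the evidence required to claim each level.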

Appendix 4.2 Lean TRL Definitions

L-TRL 1

Basic principles observed and reported. Equations are observed describing the technology physics.

L-TRL 2

Technology concept and/or application formulated. Noise factors identified. Control factors identified. Measurement response identified.

L-TRL 3

Technology performance behavior characterized. Range of control factors identified. Range of noise factors identified. Measurement response identified. Measurement system gauge repeatability and reproducibility (GRR) baselined. Basic concepts demonstrated.

L-TRL 4

Technology Nominal Performance validated. Integration of basic technological components to establish they work together and produce the range of performance targets necessary. Integration uses “ad hoc” hardware in the laboratory. Transfer function equation predicts a validated nominal response. Measurement system GRR complete and capable.

L-TRL 5

Technology Performance Variability validated. Integration of basic technological components with reasonably realistic supporting elements to test technology in a simulated environment. Robustness work on the technology components is complete. The sum of squares response variation impact of each noise factor varying is predicted in a validated transfer function equation.

L-TRL 6

Supersystem/system/subsystem interactions in relevant environment are demonstrated. Test representative prototype system in a stress test laboratory or simulated environment. Develop and validate scalable transfer function equations for the entire product as a system with the new technology. Equations include prediction of sum-of-squares performance variation and degradation for the entire product with applied off-nominal variation of the noise factors.

L-TRL 7

Product System Demonstrated Robust in representative environment. Technology prototype transferred to a product commercialization group, and they scaled it to fit within their real product application as an operational system. Demonstration of an actual full-product prototype in the field using the new technology. Transfer function equations for the particular product system instantiation are completely verified. A limited set of remaining control factors are available to adjust the technology within the product against unknown-unknowns. Technology is as robust as any other re-used module in the product.

L-TRL 8

Product Ready for Commercialization and Release to Manufacturing Full Production. Technology has been proven robust across the noise variations of extreme field conditions using hardware built with the production equipment purposefully set up and operated at their upper and lower control limits. Transfer to manufacturing is a non-event to the development staff if L-MRL processes are in place.

L-TRL 9

Experienced Customer Use. Product in use in the customer's operational environment. This is the end of the last validation aspects of true system development. The product and its technology perform to customer satisfaction in spite of uncontrollable perturbations in the system environment in which the product is embedded and other external perturbations. Transfer to the customer is a non-event to the engineering staff if L-TRL and L-MRL processes are in place.

Source: Tang and Otto (2009)
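
L-TRL 5 and L-TRL 6 above rest on a validated transfer function y = f(control factors, noise factors) and a sum-of-squares roll-up of the response variation contributed by each noise factor, i.e., to first order, Var(y) ≈ Σ (∂f/∂n_i)² σ_i². The Python sketch below is our own illustration with hypothetical factors and numbers, not taken from Tang and Otto (2009): it estimates each sensitivity numerically, forms the (sensitivity × sigma)² contributions, and adds them to predict the response standard deviation.

    import math

    def sum_of_squares_variation(transfer_fn, nominal, sigmas, eps=1e-6):
        """First-order (Taylor) roll-up of response variation.

        transfer_fn: callable taking a dict of factor values and returning the response y
        nominal:     dict of nominal factor settings
        sigmas:      dict of noise-factor standard deviations
        Returns per-factor variance contributions and the combined standard deviation.
        """
        contributions = {}
        for name, sd in sigmas.items():
            perturbed = dict(nominal)
            perturbed[name] = nominal[name] + eps
            slope = (transfer_fn(perturbed) - transfer_fn(nominal)) / eps  # numerical dy/dn_i
            contributions[name] = (slope * sd) ** 2                        # (sensitivity * sigma)^2
        total_sd = math.sqrt(sum(contributions.values()))
        return contributions, total_sd

    # Hypothetical transfer function: one control factor, two noise factors.
    def y(factors):
        return 3.0 * factors["control"] + 0.8 * factors["temp"] - 1.5 * factors["humidity"]

    contrib, sd_y = sum_of_squares_variation(
        y,
        nominal={"control": 10.0, "temp": 25.0, "humidity": 0.4},
        sigmas={"temp": 2.0, "humidity": 0.05},
    )
    print(contrib)  # variance contribution of each noise factor
    print(sd_y)     # predicted standard deviation of the response

If the transfer function is strongly nonlinear or the noise factors interact, this first-order estimate should be checked against a Monte Carlo run of the same transfer function.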

Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this chapter

Cite this chapter

Tang, V., Otto, K., Seering, W. (2018). Does-It-Work? Metrology for Functionality and Efficacy. In: Executive Decision Synthesis. Contributions to Management Science. Springer, Cham. https://doi.org/10.1007/978-3-319-63026-7_4
