
Incidence matrix approach for calculating readiness levels

Journal of Systems Science and Systems Engineering

Abstract

Contemporary system maturity assessment approaches have failed to provide robust quantitative system evaluations, resulting in increased program costs and developmental risks. Standard assessment metrics, such as Technology Readiness Levels (TRL), do not sufficiently evaluate increasingly complex systems. The System Readiness Level (SRL) is a newly developed system development metric that is a mathematical function of the TRL and Integration Readiness Level (IRL) values for the components and connections of a particular system. SRL acceptance has been hindered by concerns that the SRL's mathematical operations may lead to inaccurate system readiness assessments, known as readiness reversals. A new SRL calculation method using incidence matrices is proposed to alleviate these mathematical concerns. The presence of SRL readiness reversal is modeled for four SRL calculation methods across several system configurations. Logistic regression analysis demonstrates that the proposed Incidence Matrix SRL (IMSRL) method exhibits a lower incidence of readiness reversal than the other approaches suggested in the literature. Viable SRL methods will foster greater SRL adoption by systems engineering professionals and will support system development risk reduction goals.
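For orientation, the sketch below illustrates how a matrix-based SRL can be computed from component TRLs and pairwise IRLs. It follows the widely cited normalized-matrix SRL formulation attributed to Sauser et al., not the IMSRL method proposed in this article (whose incidence matrix formulation is given in the full text and is not reproduced here); the 1–9 scales, the division by 9, the diagonal-of-9 convention, and the three-component example are illustrative assumptions.

```python
import numpy as np

def composite_srl(trl, irl):
    """Matrix-based SRL sketch (Sauser-style, not the article's IMSRL).

    trl : length-n array of Technology Readiness Levels (1-9).
    irl : n x n symmetric matrix of Integration Readiness Levels (1-9);
          irl[i, j] = 0 where components i and j are not connected, and the
          diagonal is conventionally set to 9 (a component is treated as
          fully integrated with itself).
    """
    trl = np.asarray(trl, dtype=float) / 9.0      # normalize TRLs to [0, 1]
    irl = np.asarray(irl, dtype=float) / 9.0      # normalize IRLs to [0, 1]

    srl_vec = irl @ trl                           # weight each TRL by its integrations
    n_links = np.count_nonzero(irl, axis=1)       # number of integrations per component
    srl_vec = srl_vec / n_links                   # per-component readiness in [0, 1]

    return srl_vec, srl_vec.mean()                # component SRLs and composite SRL

if __name__ == "__main__":
    # Hypothetical three-component system in which components 1-2 and 2-3 are integrated.
    trl = [7, 8, 5]
    irl = [[9, 6, 0],
           [6, 9, 4],
           [0, 4, 9]]
    per_component, system_srl = composite_srl(trl, irl)
    print(per_component, system_srl)
```

In this formulation a composite SRL of 1.0 corresponds to a fully mature, fully integrated system; the IMSRL method studied in the article starts from the same TRL and IRL inputs but represents the system's components and connections with an incidence matrix.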



Author information


Corresponding author

Correspondence to Mark A. London.

Additional information

Mr. Mark A. London is a PhD student in the Department of Engineering Management and Systems Engineering at the George Washington University. He earned BS and MS degrees in Electrical Engineering from the Pennsylvania State University in 1999 and 2001, respectively. Mr. London has extensive experience with performance testing and requirements verification of advanced electro-optical and infrared systems. His research interests include developing system maturity metrics for system technical performance measurement, robust statistical methods for improved test design, and integrating advanced system verification processes into developmental and operational test programs. Mr. London is a member of the International Council on Systems Engineering (INCOSE) and an active member of the International Test and Evaluation Association (ITEA). He has also recently obtained certification as an ITEA Certified Test and Evaluation Professional (CTEP).

Thomas H. Holzer, D.Sc. is an Adjunct Professor of Engineering Management and Systems Engineering at The George Washington University. He was the Director of the Engineering Management Office at the National Geospatial-Intelligence Agency, with over 35 years of experience in systems engineering and in leading large-scale information technology programs. Dr. Holzer holds Doctor of Science and Master of Science degrees in Engineering Management from The George Washington University and a Bachelor of Science in Mechanical Engineering from the University of Cincinnati.

Dr. Tim Eveleigh is an Adjunct Professor of Engineering Management and Systems Engineering at The George Washington University and an INCOSE Certified Systems Engineering Professional. Dr. Eveleigh has over 30 years of industry experience working on DoD and Intelligence Community IT acquisition challenges, R&D, and enterprise architecting. He has also had a parallel 30-year career as an Air Force Reserve Intelligence Officer and Developmental Engineer focused on command and control integration.

Dr. Shahryar Sarkani is an Adjunct Professor in the Department of Engineering Management and Systems Engineering at George Washington University, Washington, DC. He has over 20 years of experience in the field of software engineering. Dr. Sarkani holds a Doctor of Science in Systems Engineering from George Washington University, a Master of Science in Mathematics from the University of New Orleans, and a Bachelor of Science in Electrical Engineering from Louisiana State University.


About this article


Cite this article

London, M.A., Holzer, T.H., Eveleigh, T.J. et al. Incidence matrix approach for calculating readiness levels. J. Syst. Sci. Syst. Eng. 23, 377–403 (2014). https://doi.org/10.1007/s11518-014-5255-8
