Characteristics of Large-Scale Defense Projects and the Dominance of Software and Software Project Management

  • Kadir Alpaslan Demir
Part of the Computer Communications and Networks book series (CCN)


Abstract

Countries spend billions of dollars and significant resources on large-scale defense system projects. Many reports indicate that large-scale defense systems are among the most challenging and risky projects to undertake. One reason is the increasing use of software in defense systems. Today, large-scale defense systems are largely realized in software; for example, 90% of the functions in an F-35 fighter aircraft are achieved via software. Software development is inherently challenging even outside the defense context. Investigating solutions to these challenges should begin with identifying the characteristics of large-scale defense system projects. Furthermore, as the use of software in defense systems grows, defense project management is, in effect, becoming software project management. Consequently, defense project managers should also be versed in software project management to succeed in their projects. In this chapter, we first identify the characteristics of large-scale defense systems. Then, we list the characteristics of large-scale defense projects. Finally, we discuss the influence of software in defense projects and how defense project management is, in fact, becoming software project management.


Keywords

Defense systems · Defense software · Military software · Large-scale defense systems · Project management · Software project management · Defense project management


Disclaimer and Acknowledgments

The views and conclusions contained herein are those of the author and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of any affiliated organization or government.



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Department of Software Development, Turkish Naval Research Center Command, Istanbul, Turkey
