Journal of Signal Processing Systems, Volume 56, Issue 1, pp 69–89

A Test-oriented Embedded System Production Methodology



In the business world, Agile methodologies have been shown to give developers a proactive, rather than reactive, path to creating defect-free products. Although signal processing systems share similarities with desktop and line-of-business systems, their closer connection to the hardware side of a product, and the associated physical constraints, make adapting desktop Agile methodologies to the embedded world difficult, and the adoption of these methodologies by developers problematic. We focus on our experiences in developing test frameworks that support transforming a subset of extreme programming from the world of desktop applications into a production methodology suitable for the embedded domain. We detail the issues surrounding an Embedded xUnit testing framework that permits development of digital signal processing applications on a wide range of standalone and multi-processor systems in research, teaching and commercial development environments.


Keywords: Test framework for embedded system software · Embedded Agile methodologies · Test driven development · Embedded Unit



Financial support was provided by Analog Devices and the Natural Sciences and Engineering Research Council of Canada (NSERC) through a Collaborative Research and Development grant (CRD 299423-03). Early development was supported by an ASRA grant from the Government of Alberta, Canada. MRS is an Analog Devices University Ambassador. Contributions to aspects of implementing the Embedded Unit tool set by the following University of Calgary students have been noted through appropriate references in the text: Engineering Internship students A. Martin, A. Kwan and J. Chen; NSERC undergraduate student research award (USRA) winner L. Ko; and graduate students A. Geras, L. Huang and A. Tran.



Copyright information

© Springer Science+Business Media, LLC 2008

Authors and Affiliations

  1. Department of Electrical and Computer Engineering, University of Calgary, Calgary, Canada
  2. Department of Radiology, University of Calgary, Calgary, Canada
  3. Department of Electrical and Computer Engineering, University of Alberta, Edmonton, Canada
  4. Now with Blackline GPS, Calgary, Canada
