Modeling Firmware as Service Functions and Its Application to Test Generation

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8244)


The term firmware refers to software that is tied to a specific hardware platform, e.g., low-level drivers that physically interface with the peripherals. More recently, this has grown to include software that manages critical hardware platform functions such as power management. This growing body of firmware must be shipped with the hardware and shares many of the hardware's critical design concerns. The two that we address in this paper are co-design with the other system components and validation of the firmware's interactions with the connected hardware modules. To this end, we introduce a specific Service-Function Transaction-Level Model (TLM) for modeling the firmware and the interacting hardware components. A service function provides a service in response to a specific trigger, much like an interrupt-service routine responding to an interrupt. While TLM has been used in the past for HW-SW co-design, we show how the particular structure of the proposed service-function-based model is useful in the context of firmware design. Specifically, we show its application in automatic test generation. Recently, concolic testing has emerged as an automated test-generation technique for single-threaded software. This technique cannot be used directly for firmware, which, by definition, runs in parallel with the interacting hardware modules. We show how the service function model proposed here can be used to analyze these interactions and how single-threaded concolic testing can still be applied to an important class of these interaction patterns. The model and the test generation are illustrated through a non-trivial case study of the open-source Rockbox MP3 player.


Keywords: Test Generation · Interaction Pattern · Shared Variable · Service Function · Concurrent Program





Copyright information

© Springer International Publishing Switzerland 2013

Authors and Affiliations

Princeton University, USA
