Hierarchical Non-intrusive In-situ Requirements Monitoring for Embedded Systems

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10548)


Accounting for all operating conditions of a system at the design stage is typically infeasible for complex systems. In-situ runtime monitoring and verification enable a system to introspectively verify that it is operating correctly in a dynamic environment, to rapidly detect failures, and to provide detailed execution traces for finding their root cause. In this paper, we address two challenges in applying in-situ runtime verification to embedded systems: (1) efficiently defining and automatically constructing a requirements model for embedded system software, and (2) minimizing the runtime overhead of observing the execution and verifying that it adheres to the requirements model. We present a methodology for constructing a hierarchical runtime monitoring graph from system requirements specified as multiple UML sequence diagrams, which are already commonly used in software development. We further present the design of on-chip hardware that non-intrusively monitors the system at runtime to ensure the execution matches the requirements model. We evaluate the proposed methodology using a case study of a fail-safe autonomous vehicle subsystem and analyze the relationship between event coverage, detection rate, and hardware requirements.
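The abstract's core idea can be illustrated with a small sketch. The following is a hypothetical Python model (not the paper's implementation, which is on-chip hardware): each UML sequence diagram becomes a monitor over an expected event order, and a top-level graph tracks which diagram is active and hands off between diagrams, flagging any observed event that deviates from the requirements model. All class and event names here are invented for illustration.

```python
class SequenceMonitor:
    """Checks observed events against one sequence diagram's expected order."""

    def __init__(self, name, expected):
        self.name = name
        self.expected = expected  # ordered list of event names
        self.index = 0            # position in the expected sequence

    def observe(self, event):
        """Return 'advance', 'complete', or 'violation' for one event."""
        if event == self.expected[self.index]:
            self.index += 1
            if self.index == len(self.expected):
                self.index = 0    # reset for possible re-entry
                return "complete"
            return "advance"
        self.index = 0            # resynchronize after a mismatch
        return "violation"


class HierarchicalMonitor:
    """Top-level monitoring graph: which diagram is active and what follows it."""

    def __init__(self, monitors, successor, start):
        self.monitors = {m.name: m for m in monitors}
        self.successor = successor  # diagram name -> next diagram name
        self.active = start
        self.violations = 0

    def observe(self, event):
        result = self.monitors[self.active].observe(event)
        if result == "violation":
            self.violations += 1    # in hardware, this would raise a fault signal
        elif result == "complete":
            self.active = self.successor[self.active]
        return result


# Usage: an init diagram followed by a repeating run diagram.
init = SequenceMonitor("init", ["boot", "calibrate"])
run = SequenceMonitor("run", ["sense", "actuate"])
hm = HierarchicalMonitor([init, run], {"init": "run", "run": "run"}, "init")
for e in ["boot", "calibrate", "sense", "actuate"]:
    hm.observe(e)
```

In the paper's setting, the `observe` calls would be driven by events captured non-intrusively from the processor's trace interface rather than by software instrumentation.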


Keywords: Runtime requirement monitoring · Embedded systems · Non-intrusive system monitoring



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Department of Electrical and Computer Engineering, University of Arizona, Tucson, USA
