In the past decade, Runtime Verification (RV) has gained much attention from both the research community and practitioners. Roughly speaking, RV combines a set of theories, techniques, and tools aimed at the efficient analysis of systems' executions, with the goal of guaranteeing their correctness through monitoring. Major challenges in RV include characterizing and formally expressing requirements that can be monitored, proposing intuitive and concise specification formalisms, and monitoring specifications efficiently (both time- and memory-wise).
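To make the core idea concrete, the following is a minimal sketch (not taken from any paper at the track) of a runtime monitor: a small state machine that consumes a system's events one by one and emits a verdict online. The event names and the safety property checked here ("no read event may occur after close") are hypothetical examples chosen only for illustration.

```python
VERDICT_OK, VERDICT_VIOLATION = "ok", "violation"

class SafetyMonitor:
    """Checks the (hypothetical) safety property
    "no 'read' event may occur after 'close'" over an event trace,
    consuming events one by one and reporting a verdict online."""

    def __init__(self):
        self.closed = False
        self.verdict = VERDICT_OK

    def step(self, event):
        # A safety violation is irrevocable: once raised, it stays.
        if self.verdict == VERDICT_VIOLATION:
            return self.verdict
        if event == "close":
            self.closed = True
        elif event == "read" and self.closed:
            self.verdict = VERDICT_VIOLATION
        return self.verdict

# Feed the monitor a trace of events, as instrumentation would at runtime.
monitor = SafetyMonitor()
for event in ["open", "read", "close", "read"]:
    verdict = monitor.step(event)
print(verdict)  # -> violation
```

Real RV frameworks generalize this picture with expressive specification languages (temporal logics, rule systems, automata with data) and instrumentation machinery, but the monitor-as-trace-consumer shape is the common core.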

Despite the major strides made in recent years, much effort is still needed to make RV an attractive and viable methodology for industrial use. Further studies are also needed to extend RV to wider application domains such as security, bio-health, and power micro-grids.

The purpose of the "Runtime Verification: the application perspective" track at ISoLA'12 was to bring together experts on runtime verification and on potential application domains, in order to advance the state of the art on making RV more attractive to industry and usable in additional application domains. This introductory paper gives an overview of the contributions of the papers selected for the track.






References

1. Ahrendt, W., Pace, G., Schneider, G.: A unified approach for static and runtime verification: framework and applications. In: Margaria, Steffen [22]
2. Allan, C., Avgustinov, P., Christensen, A.S., Hendren, L.J., Kuzins, S., Lhoták, O., de Moor, O., Sereni, D., Sittampalam, G., Tibble, J.: Adding trace matching with free variables to AspectJ. In: Johnson, R.E., Gabriel, R.P. (eds.) OOPSLA, pp. 345–364. ACM (2005)
3. Barringer, H., Falcone, Y., Havelund, K., Reger, G., Rydeheard, D.: Quantified Event Automata: Towards Expressive and Efficient Runtime Monitors. In: Giannakopoulou, D., Méry, D. (eds.) FM 2012. LNCS, vol. 7436, pp. 68–84. Springer, Heidelberg (2012)
4. Barringer, H., Havelund, K.: TraceContract: A Scala DSL for Trace Analysis. In: Butler, M., Schulte, W. (eds.) FM 2011. LNCS, vol. 6664, pp. 57–72. Springer, Heidelberg (2011)
5. Barringer, H., Rydeheard, D.E., Havelund, K.: Rule systems for run-time monitoring: from Eagle to RuleR. J. Log. Comput. 20(3), 675–706 (2010)
6. Bensalem, S., Bozga, M., Delahaye, B., Jegourel, C., Legay, A., Nouri, A.: SBIP: A statistical model checking extension for BIP. In: Margaria, Steffen [22]
7. Blech, J.O., Falcone, Y., Rueß, H., Schätz, B.: Behavioral Specification based Runtime Monitors for OSGi Services. In: Margaria, Steffen [22]
8. Bodden, E., Hendren, L.J.: The Clara framework for hybrid typestate analysis. STTT 14(3), 307–326 (2012)
9. Chen, F., Roşu, G.: Parametric Trace Slicing and Monitoring. In: Kowalewski, S., Philippou, A. (eds.) TACAS 2009. LNCS, vol. 5505, pp. 246–261. Springer, Heidelberg (2009)
10. David, A., Larsen, K.G., Legay, A., Sedwards, S., Poulsen, D.: Systems biology, runtime verification and more. In: Margaria, Steffen [22]
11. Dimitrova, R., Finkbeiner, B., Rabe, M.: Monitoring temporal information flow. In: Margaria, Steffen [22]
12. Dormoy, J., Kouchnarenko, O., Lanoix, A.: Using Temporal Logic for Dynamic Reconfigurations of Components. In: Barbosa, L.S., Lumpe, M. (eds.) FACS 2010. LNCS, vol. 6921, pp. 200–217. Springer, Heidelberg (2010)
13. Falcone, Y., Jaber, M., Nguyen, T.-H., Bozga, M., Bensalem, S.: Runtime Verification of Component-Based Systems. In: Barthe, G., Pardo, A., Schneider, G. (eds.) SEFM 2011. LNCS, vol. 7041, pp. 204–220. Springer, Heidelberg (2011)
14. Hallé, S., Tremblay-Lessard, R.: A case for "piggyback" runtime monitoring. In: Margaria, Steffen [22]
15. Hartmanns, A., Hermanns, H.: Modelling and decentralised runtime control of self-stabilising power micro grids. In: Margaria, Steffen [22]
16. Havelund, K.: What does AI have to do with RV? In: Margaria, Steffen [22]
17. Havelund, K., Goldberg, A.: Verify Your Runs. In: Meyer, B., Woodcock, J. (eds.) VSTTE 2005. LNCS, vol. 4171, pp. 374–383. Springer, Heidelberg (2008)
18. Hinrichs, T., Sistla, P.A., Zuck, L.D.: Model checking meets run-time verification. In: Voronkov, A., Korovina, M. (eds.) HOWARD-60: Proceedings of the Higher-Order Workshop on Automated Runtime Verification and Debugging (to appear, 2012)
19. Huang, X., Seyster, J., Callanan, S., Dixit, K., Grosu, R., Smolka, S.A., Stoller, S.D., Zadok, E.: Software monitoring with controllable overhead. STTT 14(3), 327–347 (2012)
20. Kim, C.H.P., Bodden, E., Batory, D., Khurshid, S.: Reducing Configurations to Monitor in a Software Product Line. In: Barringer, H., Falcone, Y., Finkbeiner, B., Havelund, K., Lee, I., Pace, G., Roşu, G., Sokolsky, O., Tillmann, N. (eds.) RV 2010. LNCS, vol. 6418, pp. 285–299. Springer, Heidelberg (2010)
21. Leucker, M., Schallhart, C.: A brief account of runtime verification. Journal of Logic and Algebraic Programming 78(5), 293–303 (2009)
22. Margaria, T., Steffen, B. (eds.): ISoLA 2012, Part I. LNCS, vol. 7609. Springer, Heidelberg (2012)
23. Mounier, L., Sifakis, E.: Dynamic information-flow analysis for multi-threaded applications. In: Margaria, Steffen [22]
24. Pnueli, A., Zaks, A.: PSL Model Checking and Run-Time Verification Via Testers. In: Misra, J., Nipkow, T., Sekerinski, E. (eds.) FM 2006. LNCS, vol. 4085, pp. 573–586. Springer, Heidelberg (2006)
25. Razavi, N., Holzer, A., Farzan, A.: Bounded-interference sequentialization for testing concurrent programs. In: Margaria, Steffen [22]
26. Runtime Verification (2001–2012),
27. Sabelfeld, A., Myers, A.C.: Language-based information-flow security. IEEE Journal on Selected Areas in Communications 21(1), 5–19 (2003)
28. Stolz, V., Bodden, E.: Temporal assertions using AspectJ. Electr. Notes Theor. Comput. Sci. 144(4), 109–124 (2006)
29. Terauchi, T., Aiken, A.: Secure Information Flow as a Safety Problem. In: Hankin, C., Siveroni, I. (eds.) SAS 2005. LNCS, vol. 3672, pp. 352–367. Springer, Heidelberg (2005)

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  1. Laboratoire d'Informatique de Grenoble, University of Grenoble I (UJF), France
  2. University of Illinois at Chicago, USA
