Grade/CPN: A Tool and Temporal Logic for Testing Colored Petri Net Models in Teaching

  • Michael Westergaard
  • Dirk Fahland
  • Christian Stahl
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8100)

Abstract

Grading dozens of Petri net models manually is a tedious and error-prone task. In this paper, we present Grade/CPN, a tool supporting the grading of Colored Petri nets modeled in CPN Tools. The tool is extensible, configurable, and can check static and dynamic properties. It automatically handles tedious tasks like checking that good modeling practice is adhered to, and supports tasks that are difficult to automate, such as checking model legibility. We propose and support the Britney Temporal Logic, which can be used to guide the simulator and to check temporal properties. We report our experience of using the tool in a course with 100 participants.
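
As an illustration of the kind of dynamic check the abstract describes, the sketch below evaluates simple temporal properties ("a transition eventually fires", "a transition never fires") over a finite firing sequence produced by one simulation run. This is a minimal, hypothetical Java sketch under our own assumptions: the class and method names are invented for illustration and do not reflect the actual Grade/CPN API or the concrete syntax of the Britney Temporal Logic.

    import java.util.List;

    // Hypothetical sketch: all names are illustrative and do not reflect the
    // actual Grade/CPN API or the Britney Temporal Logic syntax.
    public class TraceCheckSketch {

        // True if the given transition fires at least once in the trace ("eventually").
        static boolean eventually(List<String> trace, String transition) {
            return trace.stream().anyMatch(t -> t.equals(transition));
        }

        // True if the given transition never fires in the trace ("globally not").
        static boolean never(List<String> trace, String transition) {
            return trace.stream().noneMatch(t -> t.equals(transition));
        }

        public static void main(String[] args) {
            // A hypothetical firing sequence recorded from one simulation run.
            List<String> trace = List.of("receive", "process", "send", "complete");

            boolean ok = eventually(trace, "complete")  // the run must reach completion
                      && never(trace, "error");         // and must not fire an error transition
            System.out.println(ok ? "PASS" : "FAIL");
        }
    }

In the tool described by the abstract, such temporal properties are additionally used to guide the simulator toward relevant runs, rather than only judging traces after the fact.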

Keywords

Model Checking, Temporal Logic, Atomic Proposition, Coverage Criterion, Kripke Structure

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Michael Westergaard (1, 2)
  • Dirk Fahland (1)
  • Christian Stahl (1)
  1. Department of Mathematics and Computer Science, Eindhoven University of Technology, The Netherlands
  2. National Research University Higher School of Economics, Moscow, Russia
