An Analytical and Experimental Comparison of CSP Extensions and Tools

  • Ling Shi
  • Yang Liu
  • Jun Sun
  • Jin Song Dong
  • Gustavo Carvalho
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7635)

Abstract

Communicating Sequential Processes (CSP) has been widely applied to modeling and analyzing concurrent systems. There has been considerable effort to enhance CSP by taking data and other system aspects into account. For instance, CSPM combines CSP with a functional programming language, whereas CSP# integrates high-level CSP-like process operators with low-level procedural code. Little work has been done to systematically compare these CSP extensions, which may differ in subtle yet substantial ways. In this paper, we compare CSPM and CSP# not only in their syntax but also in their operational semantics, as well as their supporting tools FDR, ProB, and PAT. We conduct extensive experiments to compare the performance of these tools in different settings. Our comparison can guide users in choosing the appropriate CSP extension and verification tool based on the characteristics of their system.
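The abstract's contrast between high-level process operators and executable code can be illustrated with a small sketch. The following is plain Go, not CSPM or CSP# syntax; it approximates the classic CSP process VM = coin -> choc -> VM by having two goroutines synchronise on unbuffered channels, which mimics CSP's rendezvous-style event synchronisation (an illustrative assumption, not the paper's encoding):

```go
package main

import "fmt"

// runSession drives a vending-machine process VM = coin -> choc -> VM
// for n rounds against a customer process, and returns the observed
// trace of events. Unbuffered channels make each send/receive pair a
// synchronisation, loosely analogous to CSP event synchronisation.
func runSession(n int) []string {
	coin := make(chan struct{})
	choc := make(chan struct{})
	trace := []string{}

	// Vending-machine process: repeatedly engages in coin, then choc.
	go func() {
		for i := 0; i < n; i++ {
			<-coin
			choc <- struct{}{}
		}
	}()

	// Customer process: offers a coin, then takes the chocolate.
	for i := 0; i < n; i++ {
		coin <- struct{}{}
		trace = append(trace, "coin")
		<-choc
		trace = append(trace, "choc")
	}
	return trace
}

func main() {
	// Two rounds produce the trace [coin choc coin choc].
	fmt.Println(runSession(2))
}
```

In CSPM the same behaviour would be written declaratively and checked by refinement in FDR, while CSP# would allow the events to carry attached procedural code; this sketch only conveys the synchronisation structure both notations share.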



Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Ling Shi (1)
  • Yang Liu (2)
  • Jun Sun (3)
  • Jin Song Dong (1)
  • Gustavo Carvalho (4)
  1. SoC, National Univ. of Singapore, Singapore
  2. Temasek Lab, National Univ. of Singapore, Singapore
  3. ISTD, Singapore Univ. of Technology and Design, Singapore
  4. Centro de Informática, UFPE, Brazil