CyFuzz: A Differential Testing Framework for Cyber-Physical Systems Development Environments

  • Shafiul Azam Chowdhury
  • Taylor T. Johnson
  • Christoph Csallner
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10107)


Designing complex systems using graphical models in sophisticated development environments is becoming de facto engineering practice in the cyber-physical system (CPS) domain. Development environments strive to eliminate bugs and undefined behaviors in themselves. Formal techniques, while promising, do not yet scale to verifying entire industrial CPS tool chains. A practical alternative, automated random testing, has recently found bugs in CPS tool chain components. In this work we identify problematic components in the Simulink modeling environment by studying publicly available bug reports. Our main contribution is CyFuzz, the first differential testing framework for finding bugs in arbitrary CPS development environments. Our automated model generator does not require a formal specification of the modeling language. We present a prototype implementation for testing Simulink, which found interesting issues and reproduced one bug that MathWorks fixed in subsequent product releases. We are working on implementing a full-fledged generator with sophisticated model-creation capabilities.
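The comparison loop at the heart of such a differential tester can be sketched as follows. This is a minimal illustration in Python rather than MATLAB; `generate_random_model`, `simulate`, and the mode names are hypothetical stand-ins for exposition, not CyFuzz's actual API, and a real CyFuzz "model" is a Simulink block diagram rather than a list of arithmetic stages:

```python
import random


def generate_random_model(seed, length=8):
    """Hypothetical stand-in for a random model generator.

    Here a "model" is just a list of (operation, constant) stages;
    no formal specification of the modeling language is needed to
    generate one, only a set of valid building blocks.
    """
    rng = random.Random(seed)
    return [(rng.choice(["add", "mul"]), rng.uniform(-2.0, 2.0))
            for _ in range(length)]


def simulate(model, mode, x=1.0):
    """Run the toy model under a named simulation mode.

    In this sketch both modes share one interpreter, so they always
    agree; in a real tool chain each mode (e.g. interpreted vs.
    compiled simulation) has its own implementation, and that is
    where discrepancies can arise.
    """
    for op, c in model:
        x = x + c if op == "add" else x * c
    return x


def differential_test(n_trials=100, tol=1e-9):
    """Feed each random model to every mode and compare the outputs.

    A disagreement beyond a floating-point tolerance flags the model
    as a candidate bug-revealing test case.
    """
    for seed in range(n_trials):
        model = generate_random_model(seed)
        out_a = simulate(model, "normal")
        out_b = simulate(model, "accelerated")
        if abs(out_a, ) > 0 and abs(out_a - out_b) > tol:
            return seed  # seed of the discrepancy-revealing model
    return None  # no disagreement observed
```

The tolerance parameter matters in practice: simulation modes may legitimately differ in low-order floating-point bits, so an exact equality check would drown real bugs in false positives.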


Keywords: Differential testing · Cyber-physical systems · Model-based design · Simulink



This material is based upon work supported by the National Science Foundation under Grants No. 1117369, 1464311, and 1527398, by Air Force Office of Scientific Research (AFOSR) contract numbers FA9550-15-1-0258 and FA9550-16-1-0246, and by Air Force Research Lab (AFRL) contract number FA8750-15-1-0105. Any opinions, findings, and conclusions or recommendations expressed in this publication are those of the authors and do not necessarily reflect the views of AFRL, AFOSR, or NSF.



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Shafiul Azam Chowdhury¹
  • Taylor T. Johnson¹
  • Christoph Csallner¹

  1. The University of Texas at Arlington, Arlington, USA
