Optimization Letters, Volume 5, Issue 3, pp 479–490

Identifying algorithmic vulnerabilities through simulated annealing

  • S. Andrew Johnson
  • Dinesh P. Mehta
  • Ramakrishna Thurimella
Original Paper

Abstract

Real-time software systems with tight performance requirements are abundant. These systems frequently use many different algorithms, and if any one of them were to exhibit atypical behavior because of its input, the entire system might fail to meet its performance requirements. Unfortunately, finding the inputs that cause worst-case behavior is algorithmically intractable, if not unsolvable. For some systems, simply identifying inputs that make the system take, say, ten times longer than usual is valuable information. In this paper, we present a method for finding inputs on which an algorithm performs much worse than it does on average. We use the simulated annealing heuristic search method and show that it successfully finds worst-case inputs to several sorting algorithms, under several measures of an algorithm’s runtime.
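
The search described in the abstract can be sketched as follows. The code below is an illustrative reconstruction, not the authors' implementation: it uses simulated annealing to look for a permutation on which std::sort performs an unusually large number of comparisons. The input size, initial temperature, geometric cooling schedule, neighborhood move (a single random swap), and the comparison-count cost measure are all assumptions made for this example; the paper considers several runtime measures and sorting algorithms.

    // Sketch only: simulated annealing that searches for an input
    // permutation maximizing the number of comparisons made by std::sort.
    #include <algorithm>
    #include <cmath>
    #include <iostream>
    #include <numeric>
    #include <random>
    #include <vector>

    // Cost of an input: comparisons std::sort makes when sorting a copy of it.
    static long cost(const std::vector<int>& input) {
        std::vector<int> work = input;
        long comparisons = 0;
        std::sort(work.begin(), work.end(),
                  [&comparisons](int a, int b) { ++comparisons; return a < b; });
        return comparisons;
    }

    int main() {
        const int n = 256;                    // input size (assumption)
        std::mt19937 rng(12345);
        std::uniform_int_distribution<int> pos(0, n - 1);
        std::uniform_real_distribution<double> coin(0.0, 1.0);

        // Start from a random permutation of 0..n-1.
        std::vector<int> current(n);
        std::iota(current.begin(), current.end(), 0);
        std::shuffle(current.begin(), current.end(), rng);

        long currentCost = cost(current);
        std::vector<int> best = current;
        long bestCost = currentCost;

        double temperature = 100.0;           // initial temperature (assumption)
        const double alpha = 0.999;           // geometric cooling factor (assumption)

        for (int iter = 0; iter < 50000; ++iter) {
            // Neighbor: swap two randomly chosen positions.
            std::vector<int> candidate = current;
            std::swap(candidate[pos(rng)], candidate[pos(rng)]);
            long candidateCost = cost(candidate);

            // We maximize cost: accept improvements always, and worsenings
            // with the usual Boltzmann probability exp(delta / T).
            long delta = candidateCost - currentCost;
            if (delta >= 0 || coin(rng) < std::exp(delta / temperature)) {
                current = std::move(candidate);
                currentCost = candidateCost;
                if (currentCost > bestCost) {
                    best = current;
                    bestCost = currentCost;
                }
            }
            temperature *= alpha;
        }

        std::cout << "worst comparison count found: " << bestCost << "\n";
        return 0;
    }

The same loop applies to other cost measures (e.g., wall-clock time or instruction counts) by replacing the cost function; only the neighborhood move and cooling schedule typically need tuning per algorithm.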

Keywords

Simulated annealing · Worst-case execution time · Algorithm · Sorting



Copyright information

© Springer-Verlag 2011

Authors and Affiliations

  • S. Andrew Johnson (1)
  • Dinesh P. Mehta (1)
  • Ramakrishna Thurimella (2)

  1. Department of Mathematical and Computer Sciences, Colorado School of Mines, Golden, USA
  2. Department of Computer Science, University of Denver, Denver, USA
