Journal of Global Optimization, Volume 59, Issue 2–3, pp 259–275

PAVER 2.0: an open source environment for automated performance analysis of benchmarking data

  • Michael R. Bussieck
  • Steven P. Dirkse
  • Stefan Vigerske


In this paper we describe PAVER 2.0, an environment (i.e. a process and a suite of tools supporting that process) for the automated performance analysis of benchmarking data. This new environment improves on its predecessor by addressing some of the shortcomings of the original PAVER (Bussieck et al. in Global optimization and constraint satisfaction, lecture notes in computer science, vol 2861, pp 223–238. Springer, Berlin, 2003. doi:10.1007/978-3-540-39901-8_17) and extending its capabilities. The changes serve to further the original goals of PAVER (automation of the visualization and summarization of benchmarking data) while making the environment more accessible for use and modification by the entire community of potential users. In particular, we have targeted the end-users of optimization software, as they are best able to make the many subjective choices necessary to produce impactful results when benchmarking optimization software. We illustrate these capabilities with sample analyses conducted via PAVER 2.0.


Keywords: Benchmarking · Performance data analysis · Performance metrics


  1. Achterberg, T.: Constraint Integer Programming. Ph.D. thesis, TU Berlin (2007)
  2. Achterberg, T.: Benchmarking a MIP solver. Talk at CPAIOR Master Class (2010)
  3. Berthold, T.: Measuring the impact of primal heuristics. Oper. Res. Lett. 41(6), 611–614 (2013). doi:10.1016/j.orl.2013.08.007
  4. Billups, S.C., Dirkse, S.P., Ferris, M.C.: A comparison of large scale mixed complementarity problem solvers. Comput. Optim. Appl. 7(1), 3–25 (1997). doi:10.1023/A:1008632215341
  5. Bussieck, M.R., Drud, A.S., Meeraus, A.: MINLPLib—a collection of test models for mixed-integer nonlinear programming. INFORMS J. Comput. 15(1), 114–119 (2003). doi:10.1287/ijoc.
  6. Bussieck, M.R., Drud, A.S., Meeraus, A., Pruessner, A.: Quality assurance and global optimization. In: Bliek, C., Jermann, C., Neumaier, A. (eds.) Global Optimization and Constraint Satisfaction, Lecture Notes in Computer Science, vol. 2861, pp. 223–238. Springer, Berlin (2003). doi:10.1007/978-3-540-39901-8_17
  7. Crowder, H., Dembo, R.S., Mulvey, J.M.: On reporting computational experiments with mathematical software. ACM Trans. Math. Softw. 5(2), 193–203 (1979). doi:10.1145/355826.355833
  8. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002). doi:10.1007/s101070100263
  9. Dolan, E.D., Moré, J.J., Munson, T.S.: Benchmarking Optimization Software with COPS 3.0. Tech. Rep. ANL/MCS-273, Mathematics and Computer Science Division, Argonne National Laboratory (2004)
  10. Dolan, E.D., Moré, J.J., Munson, T.S.: Optimality measures for performance profiles. SIAM J. Optim. 16(3), 891–909 (2006). doi:10.1137/040608015
  11. Drud, A.S.: Testing and Tuning a New Solver Version Using Performance Tests. INFORMS 2002, San Jose, Session on 'Benchmarking and performance testing of optimization software'. Accessed 15 May 2013
  12. Exler, O., Lehmann, T., Schittkowski, K.: A comparative study of SQP-type algorithms for nonlinear and nonconvex mixed-integer optimization. Math. Program. Comput. 4(4), 383–412 (2012). doi:10.1007/s12532-012-0045-0
  13. GAMS Development: GAMS/Examiner, User's Manual (2013). Accessed 8 May 2013
  14. Granlund, T., the GMP Development Team: GNU MP: The GNU Multiple Precision Arithmetic Library (2012)
  15. Hendel, G.: PyEvalGui—GUI components to facilitate evaluation of SCIP and other solving software (2013, in development)
  16. Koch, T., Achterberg, T., Andersen, E., Bastert, O., Berthold, T., Bixby, R.E., Danna, E., Gamrath, G., Gleixner, A.M., Heinz, S., Lodi, A., Mittelmann, H., Ralphs, T., Salvagnin, D., Steffy, D.E., Wolter, K.: MIPLIB 2010—mixed integer programming library version 5. Math. Program. Comput. 3(2), 103–163 (2011). doi:10.1007/s12532-011-0025-9
  17. Mahajan, A., Leyffer, S., Kirches, C.: Solving Mixed-Integer Nonlinear Programs by QP-Diving. Preprint ANL/MCS-P2071-0312, Argonne National Laboratory (2012)
  18. Meeraus, A.: Globallib (2013). Accessed 8 May 2013
  19. Mittelmann, H.D.: An independent benchmarking of SDP and SOCP solvers. Math. Program. 95(2), 407–430 (2003). doi:10.1007/s10107-002-0355-5
  20. Mittelmann, H.D.: DTOS—a service for the optimization community. SIAG/OPT Views-and-News 18, 17–20 (2007)
  21. Mittelmann, H.D.: Decision Tree for Optimization Software (2013). Accessed 8 May 2013
  22. Mittelmann, H.D., Pruessner, A.: A server for automated performance analysis of benchmarking data. Optim. Methods Softw. 21(1), 105–120 (2006). doi:10.1080/10556780500065366
  23. SCIP Development Team: How to Run Automated Tests with SCIP
  24. Why open source? (2013). Accessed 15 May 2013
  25. Wikiquote: Winston Churchill—Wikiquote (2013). Accessed 8 May 2013

Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  • Michael R. Bussieck (1)
  • Steven P. Dirkse (1)
  • Stefan Vigerske (1)
  1. GAMS Development Corp., Washington, USA
