An Approach of Performance Evaluation in Authentic Database Applications

  • Xiaojun Ye
  • Jingmin Xie
  • Jianmin Wang
  • Hao Tang
  • Naiqiao Du
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5895)

Abstract

This paper proposes a benchmark test management framework (BTMF) for simulating realistic database application environments based on TPC benchmarks. BTMF provides configuration parameters for both the test system (TS) and the system under test (SUT), so that more authentic SUT performance results can be obtained by tuning these parameters. We use Petri nets and transfer matrices to describe intricate testing workload characteristics, so that configuration parameters for different database applications can be determined easily. We conduct three workload-characteristics experiments based on the TPC-App benchmark to validate BTMF and the workload modeling approach.
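The transfer matrix mentioned above is, in essence, a Markov transition matrix over transaction types: entry (i, j) gives the probability that transaction type j follows type i in a simulated session. The sketch below (Python/NumPy, with invented transaction names and probabilities rather than the paper's actual parameterization) illustrates how such a matrix can drive both a long-run transaction mix and a sampled workload sequence.

```python
import numpy as np

# Hypothetical transaction types for a TPC-App-like workload
# (illustrative names only, not the paper's actual configuration).
TRANSACTIONS = ["NewCustomer", "NewProducts", "ProductDetail",
                "OrderStatus", "ChangeItem"]

# Transfer (transition) matrix P: P[i][j] is the probability that
# transaction j follows transaction i. Rows sum to 1; the values
# here are made up for this sketch.
P = np.array([
    [0.10, 0.30, 0.40, 0.10, 0.10],
    [0.05, 0.15, 0.50, 0.20, 0.10],
    [0.05, 0.25, 0.30, 0.30, 0.10],
    [0.10, 0.20, 0.40, 0.10, 0.20],
    [0.05, 0.30, 0.35, 0.20, 0.10],
])

def steady_state_mix(P, iters=1000):
    """Estimate the long-run transaction mix by power iteration on pi = pi @ P."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(iters):
        pi = pi @ P
    return pi

def generate_workload(P, start=0, length=20, rng=None):
    """Sample a transaction sequence by random-walking the transition matrix."""
    rng = rng or np.random.default_rng(42)
    seq, state = [], start
    for _ in range(length):
        state = rng.choice(len(P), p=P[state])
        seq.append(TRANSACTIONS[state])
    return seq

if __name__ == "__main__":
    for name, share in zip(TRANSACTIONS, steady_state_mix(P)):
        print(f"{name}: {share:.2%}")
    print(generate_workload(P))
```

The steady-state vector indicates the transaction mix a load generator would converge to under this matrix, which is the kind of quantity one would tune to match a target workload profile; the random walk then produces a concrete transaction stream for the TS to replay against the SUT.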

Keywords

Performance testing · Benchmarking · Test framework

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Xiaojun Ye¹
  • Jingmin Xie¹
  • Jianmin Wang¹
  • Hao Tang¹
  • Naiqiao Du¹

  1. Key Laboratory for Information System Security (Ministry of Education), Tsinghua National Laboratory for Information Science and Technology, School of Software, Tsinghua University, Beijing, China