When Free Is Not Really Free: What Does It Cost to Run a Database Workload in the Cloud?

  • Avrilia Floratou
  • Jignesh M. Patel
  • Willis Lang
  • Alan Halverson
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7144)


The current computing trend towards cloud-based Database-as-a-Service (DaaS) as an alternative to traditional on-site relational database management systems (RDBMSs) has largely been driven by the perceived simplicity and cost-effectiveness of migrating to a DaaS. However, customers who are attracted to these DaaS alternatives may find that the range of different services and pricing options available to them adds an unexpected level of complexity to their decision making. Cloud service pricing models are typically ‘pay-as-you-go’, in which the customer is charged based on resource usage such as CPU and memory utilization. Thus, customers considering different DaaS options must take into account how the performance and efficiency of the DaaS will ultimately impact their monthly bill. In this paper, we show that the current DaaS model can produce unpleasant surprises – for example, the case study that we present in this paper illustrates a scenario in which a DaaS service powered by a DBMS that has a lower hourly rate actually costs more to the end user than a DaaS service that is powered by another DBMS that charges a higher hourly rate. Thus, what we need is a method for the end user to get an accurate estimate of the true costs that will be incurred, without worrying about the nuances of how the DaaS operates. One potential solution to this problem is for DaaS providers to offer a new service called Benchmark as a Service (BaaS), wherein the user provides the parameters of their workload and their SLA requirements, and gets a price quote.
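The pricing pitfall described above can be sketched with a few lines of arithmetic. The numbers below are invented for illustration and are not taken from the paper's case study; they simply show how a lower hourly rate can still produce the larger bill when the underlying DBMS needs more machine-hours to complete the same workload:

```python
# Hypothetical illustration of pay-as-you-go billing: total cost is
# hourly rate times the hours billed to finish the workload.
def total_cost(hourly_rate, hours_to_finish):
    """Pay-as-you-go bill for one workload run (rates/hours are made up)."""
    return hourly_rate * hours_to_finish

# Service A: cheaper per hour, but its DBMS is slower on this workload.
cost_a = total_cost(hourly_rate=0.40, hours_to_finish=300)
# Service B: pricier per hour, but finishes the workload much faster.
cost_b = total_cost(hourly_rate=0.60, hours_to_finish=150)

print(f"A: ${cost_a:.2f}, B: ${cost_b:.2f}")  # A: $120.00, B: $90.00
assert cost_a > cost_b  # the "cheaper" hourly rate yields the larger bill
```

This is the core of the argument for a Benchmark-as-a-Service: the end user cannot compare offerings on hourly rate alone, because the hours term depends on DBMS performance that only a benchmark of the actual workload can reveal.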


Keywords: Cloud Computing · Cloud Service · Price Model · Service Level Agreement · Price Quote





Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Avrilia Floratou (1)
  • Jignesh M. Patel (1)
  • Willis Lang (1)
  • Alan Halverson (2)
  1. University of Wisconsin-Madison, U.S.A.
  2. Microsoft Jim Gray Systems Lab, U.S.A.
