Performance and Cost Assessment of Cloud Services

  • Paul Brebner
  • Anna Liu
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6568)


Architecting applications for the Cloud is challenging due to significant differences between traditional hosting and Cloud infrastructure, the unknown and unproven performance and scalability characteristics of Cloud platforms, and variable quota limitations. Building workable Cloud applications therefore requires in-depth insight into the architectural and performance characteristics of each Cloud offering, and the ability to reason about trade-offs among alternative application designs and deployments. NICTA has developed a Service Oriented Performance Modeling (SOPM) technology for modeling the performance and scalability of service-oriented applications architected for a variety of platforms. Using a suite of Cloud testing applications, we conducted in-depth empirical evaluations of several real Cloud infrastructures, including Google App Engine, Amazon EC2, and Microsoft Azure. The insights from these experimental evaluations, together with other public and published data, were combined with the modeling technology to predict the resource requirements, in terms of cost, application performance, and limitations, of a realistic application for different deployment scenarios.
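The kind of cost/performance prediction the abstract describes can be illustrated with a minimal utilization-based capacity estimate. This is only a hedged sketch of the general technique, not the authors' SOPM tool: the workload parameters, utilization ceiling, and hourly rate below are hypothetical.

```python
import math

# Hypothetical capacity/cost estimate in the spirit of performance modeling
# for Cloud deployments. All numbers here are illustrative assumptions.

def instances_needed(arrival_rate, service_time_s, max_utilization=0.7):
    """Instances required to keep each server below a target utilization.

    Offered load (in Erlangs) = arrival_rate * service_time; dividing by
    the per-instance utilization ceiling gives the instance count.
    """
    offered_load = arrival_rate * service_time_s
    return math.ceil(offered_load / max_utilization)

def monthly_cost(instances, hourly_rate_usd, hours=730):
    """Monthly cost for a fixed fleet billed at a per-instance hourly rate."""
    return instances * hourly_rate_usd * hours

# Hypothetical scenario: 200 requests/s, 50 ms service time, $0.10/hour.
fleet = instances_needed(200, 0.05)        # 10 Erlangs / 0.7 -> 15 instances
print(fleet, monthly_cost(fleet, 0.10))    # 15 1095.0
```

Comparing such estimates across providers (with each platform's measured service times and published rates) is one way to reason about the deployment trade-offs the paper evaluates empirically.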


Keywords: Cloud · performance · scalability · cost · limits · quotas · service-oriented performance modeling (SOPM) · Amazon EC2 · Google AppEngine · Microsoft Azure



Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Paul Brebner¹
  • Anna Liu²
  1. NICTA/ANU, Australia
  2. NICTA/UNSW, Australia
