A Simplified Model for Simulating the Execution of a Workflow in Cloud

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10417)

Abstract

Although simulators provide approximate, faster, and easier simulation of application execution in Clouds, many researchers argue that their results cannot always be generalized to complex application types with many inter-task dependencies and various scheduling possibilities, such as workflows. DynamicCloudSim, an extension of the well-known CloudSim simulator, lets users simulate Cloud heterogeneity by introducing noise in dozens of parameters. Still, it is difficult, or sometimes even impossible, to determine appropriate values for all these parameters because they are usually Cloud- or application-dependent. In this paper, we propose a new model that simplifies the simulation setup for a workflow and reduces the bias between the behavior of simulated and real Cloud environments based on a single parameter, the Cloud noisiness. It represents the noise produced by the Cloud's interference, including the noisiness of the application (in our case a workflow) itself. Another novelty of our model is that it does not naively use a normal distribution to create noised values, but shifts the mean task execution time by the Cloud noisiness and uses its deviation as the standard deviation. Besides reducing the complexity of DynamicCloudSim's heterogeneity model, our model is also more accurate, as an evaluation conducted in Amazon EC2 shows: its trueness (closeness to the real mean values) is better by up to 9.2% and its precision (closeness to the real deviation) by up to 8.39 times.
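The sampling idea described above, shifting the mean task execution time by the Cloud noisiness and using the deviation of that noisiness as the standard deviation, can be illustrated with a short Python sketch. This is a hedged illustration only: the function and parameter names are assumptions, not the authors' implementation, and the exact scaling of the deviation is one plausible reading of the abstract.

```python
import random

def noised_runtime(base_runtime, cloud_noisiness, noisiness_deviation):
    """Sample one simulated task runtime (hypothetical sketch).

    Rather than drawing noise from a normal distribution centred on the
    measured runtime, the mean is shifted by the Cloud noisiness and the
    deviation of that noisiness serves as the standard deviation.
    """
    shifted_mean = base_runtime * (1 + cloud_noisiness)   # mean shifted by noisiness
    sigma = base_runtime * noisiness_deviation            # deviation as std. dev.
    # Clamp at a tiny positive value so a task never gets a non-positive runtime.
    return max(1e-9, random.gauss(shifted_mean, sigma))

# Example: a task measured at 100 s, with an assumed 5% Cloud noisiness
# and a 2% deviation of that noisiness.
sample = noised_runtime(100.0, 0.05, 0.02)
```

With a zero deviation the sample collapses to the shifted mean, which makes the role of the single Cloud-noisiness parameter easy to check in isolation.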

Keywords

Accuracy, Makespan, Modeling, Precision, Simulator, Trueness


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Institute for Computer Science, University of Innsbruck, Innsbruck, Austria