SOA and the Button Problem

  • Sung-Shik Jongmans
  • Arjan Lamers
  • Marko van Eekelen
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11800)

Abstract

Service-oriented architecture (SOA) is a popular architectural style centered around services, loose coupling, and interoperability. A recurring problem in SOA development is the Button Problem: how can we ensure that whenever a “button is pressed” on some service, no matter which, the performance of other key services remains unaffected? The Button Problem is especially hard to solve in systems that have devolved into barely comprehensible spaghettis of service dependencies.

In a collaborative effort with industry partner First8, we present the first formal framework that helps SOA developers solve the Button Problem, enabling automated reasoning about service sensitivities and candidate refactorings. Our formalization provides a rigorous foundation for a tool that has already been successfully evaluated in industrial case studies, and it is built to meet two distinctive requirements: a “whiteboard level of abstraction” and non-quantitative analysis.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Sung-Shik Jongmans (1, 2)
  • Arjan Lamers (1, 3)
  • Marko van Eekelen (1, 4)

  1. Department of Computer Science, Open University of the Netherlands, Heerlen, The Netherlands
  2. CWI, Netherlands Foundation of Scientific Research Institutes, Amsterdam, The Netherlands
  3. First8, Nijmegen, The Netherlands
  4. Institute for Computing and Information Sciences, Radboud University Nijmegen, Nijmegen, The Netherlands