Dfuntest: A Testing Framework for Distributed Applications

  • Grzegorz Milka
  • Krzysztof Rzadca

Conference paper, part of the Lecture Notes in Computer Science book series (LNCS, volume 10777)

Abstract

New ideas in distributed systems (algorithms or protocols) are commonly evaluated by simulation, because experimenting with a prototype deployed on a realistic platform is cumbersome. However, experiments with a prototype not only measure performance but also verify assumptions about the underlying system. We developed dfuntest, a testing framework for distributed applications that defines abstractions and test structure, and that automates experiments on distributed platforms. Dfuntest aims to be jUnit's analogue for distributed applications: a framework that enables the programmer to write robust and flexible experiment scenarios. Dfuntest requires minimal bindings that specify how to deploy and interact with the tested application. Dfuntest's abstractions allow a scenario to be executed on a single machine, a cluster, a cloud, or any other distributed infrastructure, e.g. PlanetLab. A scenario is an arbitrary procedure; thus, the framework supports both functional tests and performance measurements. We show how to use dfuntest to deploy our DHT prototype on 60 PlanetLab nodes and verify that the prototype maintains a correct topology.
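For concreteness, the following minimal Java sketch illustrates the kind of binding and scenario the abstract describes. It is not taken from the paper: all interface and method names here (Environment, AppProxy, TestScript, getNeighbours, and the 30-second stabilization delay) are hypothetical illustrations of the three concepts the abstract names, namely an environment to deploy to, a proxy for interacting with one running instance, and a scenario run against all instances.

    // Hypothetical illustration of a dfuntest-style binding and scenario.
    // Interface names are assumptions, not dfuntest's actual API.
    import java.io.IOException;
    import java.util.List;

    /** A machine (local process, cluster node, PlanetLab host) to deploy to. */
    interface Environment {
      void copyFiles(String localPath, String remotePath) throws IOException;
      void runCommand(String command) throws IOException;
    }

    /** A proxy for one deployed application instance. */
    interface AppProxy {
      void start() throws IOException;
      void stop() throws IOException;
      /** Application-specific query, e.g. a DHT node's routing table. */
      List<String> getNeighbours() throws IOException;
    }

    /** A scenario: an arbitrary procedure over all instances, so it can
     *  check functional properties or collect performance measurements. */
    interface TestScript {
      String run(List<AppProxy> apps) throws IOException;
    }

    /** Example scenario: start every node, wait for the overlay to
     *  stabilize, then verify no node's neighbour list is empty
     *  (a crude version of the paper's topology check). */
    class TopologyTest implements TestScript {
      @Override
      public String run(List<AppProxy> apps) throws IOException {
        for (AppProxy app : apps) {
          app.start();
        }
        try {
          Thread.sleep(30_000); // assumed DHT stabilization time
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt();
        }
        StringBuilder report = new StringBuilder();
        for (AppProxy app : apps) {
          if (app.getNeighbours().isEmpty()) {
            report.append("FAILURE: isolated node found\n");
          }
        }
        for (AppProxy app : apps) {
          app.stop();
        }
        return report.length() == 0 ? "SUCCESS" : report.toString();
      }
    }

Because the environment abstraction hides where the instances actually run, the same TopologyTest could, in this sketch, execute against local processes or against 60 PlanetLab nodes, which is the portability property the abstract claims.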

Keywords

Distributed systems · Testing · Deployment · PlanetLab · jUnit

Acknowledgements

This research has been supported by a Polish National Science Center grant Sonata (UMO-2012/07/D/ST6/02440).

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Institute of Informatics, University of Warsaw, Warsaw, Poland