Using Metadata to Improve Experiment Reliability in Shared Environments

  • Conference paper
Traffic Monitoring and Analysis (TMA 2012)

Part of the book series: Lecture Notes in Computer Science (LNCCN, volume 7189)

Abstract

Experimental network research is challenging because experiment outcomes can be influenced by undesired effects from other activities in the network. In shared experiment networks, control over resources is often limited and QoS guarantees might not be available. When network conditions vary during a series of experiments, unwanted artifacts can be introduced into the experimental results, reducing the reliability of the experiments. We propose a novel, systematic methodology in which network conditions are monitored during the experiments and information about the network is collected. This information, known as metadata, is analyzed statistically to identify periods during the experiments when the network conditions have been similar. Data points collected during these periods are valid for comparison. Our hypothesis is that this methodology can make experiments more reliable. We present a proof-of-concept implementation of our method, deployed in the FEDERICA and PlanetLab networks.
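
As an illustration of the kind of statistical comparison the abstract describes, the sketch below groups experiment periods whose collected metadata (for example, round-trip-time samples) are statistically similar. It uses SciPy's two-sample Kolmogorov-Smirnov test as one plausible choice of test; the function name comparable_periods, the significance level, and the example data are assumptions made for illustration, not the authors' exact procedure.

    import numpy as np
    from scipy.stats import ks_2samp

    def comparable_periods(metadata, alpha=0.05):
        """Return pairs of experiment periods whose metadata samples are
        statistically indistinguishable under a two-sample K-S test.

        metadata: dict mapping a period label to a 1-D array of metadata
        readings (e.g. RTT samples) collected during that period.
        """
        labels = list(metadata)
        pairs = []
        for i, a in enumerate(labels):
            for b in labels[i + 1:]:
                result = ks_2samp(metadata[a], metadata[b])
                # A high p-value gives no evidence that network conditions
                # differed, so data from these two periods may be compared.
                if result.pvalue >= alpha:
                    pairs.append((a, b, result.pvalue))
        return pairs

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Hypothetical metadata: RTT samples (ms) from three measurement periods.
        meta = {
            "period1": rng.normal(20.0, 2.0, 200),  # quiet network
            "period2": rng.normal(20.2, 2.0, 200),  # similar conditions
            "period3": rng.normal(35.0, 6.0, 200),  # heavy cross traffic
        }
        for a, b, p in comparable_periods(meta):
            print(f"{a} and {b} appear comparable (p = {p:.3f})")

In this sketch, only period1 and period2 would be flagged as comparable; measurements taken during period3 would be kept separate because the metadata indicate different network conditions.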




Copyright information

© 2012 IFIP International Federation for Information Processing

About this paper

Cite this paper

Söderman, P., Hidell, M., Sjödin, P. (2012). Using Metadata to Improve Experiment Reliability in Shared Environments. In: Pescapè, A., Salgarelli, L., Dimitropoulos, X. (eds) Traffic Monitoring and Analysis. TMA 2012. Lecture Notes in Computer Science, vol 7189. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-28534-9_15

  • DOI: https://doi.org/10.1007/978-3-642-28534-9_15

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-28533-2

  • Online ISBN: 978-3-642-28534-9

  • eBook Packages: Computer Science (R0)
