Taming the Raven – Testing the Random Access, Visualization and Exploration Network RAVEN

  • Conference paper

Part of the Lecture Notes in Computer Science book series (LNTCS, volume 7134)

Abstract

The Random Access, Visualization and Exploration Network (RAVEN) aims to allow for the storage, analysis and visualisation of petabytes of scientific data in (near) real-time. In essence, RAVEN is a huge distributed and parallel system.

While the testing of distributed systems, such as huge telecommunication systems, is well understood and performed systematically, the testing of parallel systems, in particular in high-performance computing, is currently lagging behind and is mainly based on ad hoc approaches.

This paper surveys the state of the art of software testing and investigates the challenges of testing the distributed and parallel high-performance RAVEN system. While using the standardised Testing and Test Control Notation (TTCN-3) looks promising for testing the networking and communication aspects of RAVEN, testing its visualisation and analysis aspects may open new frontiers.

Keywords

  • Testing
  • Distributed Systems
  • Parallel Systems
  • High-Performance Computing
  • TTCN-3







Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

Cite this paper

Neukirchen, H. (2012). Taming the Raven – Testing the Random Access, Visualization and Exploration Network RAVEN. In: Jónasson, K. (ed.) Applied Parallel and Scientific Computing. PARA 2010. Lecture Notes in Computer Science, vol. 7134. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-28145-7_20

  • DOI: https://doi.org/10.1007/978-3-642-28145-7_20

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-28144-0

  • Online ISBN: 978-3-642-28145-7