Experiences with Service-Oriented Middleware for Dynamic Instrumentation of Enterprise DRE Systems

  • James H. Hill
  • Douglas C. Schmidt
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7045)

Abstract

This paper describes our experiences applying a test and evaluation (T&E) service-oriented middleware framework, the Open-source Architecture for Software Instrumentation Systems (OASIS), to the Unified SHIP platform, a representative next-generation shipboard computing system. OASIS enables instrumenting distributed software systems, such as enterprise distributed real-time and embedded (DRE) systems, to collect and extract metrics without a priori knowledge of the metrics being collected. The flexibility of OASIS's metametrics-driven approach to instrumentation and data collection improved developers' and testers' understanding of, and ability to analyze, end-to-end QoS in shipboard computing systems. This paper also discusses our strategy for deploying OASIS in a cloud environment.
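To illustrate the metametrics-driven idea at the heart of OASIS, the sketch below shows one way a software probe can describe the metrics it collects at registration time, so that a collection service can interpret probe data at runtime without compile-time knowledge of its structure. This is a minimal, hypothetical C++ sketch: the types and names (MetricType, MetricDescriptor, SoftwareProbe) are illustrative assumptions, not the actual OASIS API.

    #include <iostream>
    #include <string>
    #include <utility>
    #include <vector>

    // Hypothetical metametric descriptor: a probe publishes the name and
    // type of each metric it collects, so collectors can interpret the
    // packaged data without a priori (compile-time) knowledge of it.
    enum class MetricType { Int64, Double, Text };

    struct MetricDescriptor {
      std::string name;
      MetricType type;
    };

    // A probe carries its own schema (its metametrics) alongside its id.
    class SoftwareProbe {
    public:
      SoftwareProbe(std::string id, std::vector<MetricDescriptor> schema)
        : id_(std::move(id)), schema_(std::move(schema)) {}

      const std::string& id() const { return id_; }
      const std::vector<MetricDescriptor>& schema() const { return schema_; }

    private:
      std::string id_;
      std::vector<MetricDescriptor> schema_;
    };

    int main() {
      // A memory probe, for example, might register these metametrics
      // before emitting any samples.
      SoftwareProbe memory_probe(
          "MemoryProbe",
          {{"heap_bytes_in_use", MetricType::Int64},
           {"page_faults_per_sec", MetricType::Double}});

      // A collector discovers at runtime what this probe measures.
      for (const auto& m : memory_probe.schema())
        std::cout << memory_probe.id() << " collects " << m.name << '\n';
      return 0;
    }

In this style of design, the self-describing metadata is what allows downstream services, such as complex event processing over collected metrics, to consume probe data without hard-coding each probe's data format.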

Keywords

Object Management Group, Cloud Computing Environment, Common Object Request Broker Architecture, Memory Probe, Complex Event Processing


Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • James H. Hill (1, 2)
  • Douglas C. Schmidt (3)
  1. Indiana University, Indianapolis, USA
  2. Purdue University, Indianapolis, USA
  3. Carnegie Mellon University, Pittsburgh, USA
