Design of n-Gram Based Dynamic Pre-fetching for DSM

  • Sitaramaiah Ramisetti
  • Rajeev Wankar
  • C. R. Rao
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7440)

Abstract

Many earlier works have shown that data pre-fetching is an effective remedy for the well-known problem of memory stalls; reducing these stalls improves the overall execution time of a given application. In this paper we propose a new n-gram prediction model for a dynamic pre-fetcher, which computes the conditional probabilities of stride sequences over the previous n steps, where n is an integer denoting the number of data elements considered. Strides that have already been pre-fetched are recorded so that, when the same stride is referenced again by the program (by the principle of locality of reference), the corresponding data is already resident in memory and need not be pre-fetched again. The model also identifies the most probable and least probable stride sequences, and this information can be used for further dynamic prediction. The model is flushed once the number of mis-predictions exceeds a pre-determined limit. Experimental results show that the proposed model is considerably more efficient and gives the user additional insight into the behavior of the application. The model can be used to improve the performance of existing compiler-based Software Distributed Shared Memory (SDSM) systems.
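The mechanism described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it is a hypothetical Python model of the stated ideas: an n-gram table maps the previous n strides to a frequency count of the next stride, addresses already pre-fetched are skipped, and the tables are flushed once mis-predictions exceed a limit. The class and parameter names (`NGramStridePrefetcher`, `flush_limit`) are illustrative assumptions.

```python
from collections import Counter, defaultdict, deque

class NGramStridePrefetcher:
    """Sketch of an n-gram stride predictor: the last n observed strides
    form the context, and the predicted next stride is the one most often
    seen after that context (an order-n Markov chain over strides)."""

    def __init__(self, n=3, flush_limit=8):
        self.n = n
        self.flush_limit = flush_limit          # mis-predictions before flush
        self.context = deque(maxlen=n)          # last n strides
        self.model = defaultdict(Counter)       # context tuple -> next-stride counts
        self.prefetched = set()                 # addresses already pre-fetched
        self.misses = 0
        self.last_addr = None

    def _predict_stride(self):
        """Most probable next stride for the current context, or None."""
        ctx = tuple(self.context)
        if len(ctx) < self.n or not self.model[ctx]:
            return None
        return self.model[ctx].most_common(1)[0][0]

    def access(self, addr):
        """Observe one reference; return an address to pre-fetch, or None."""
        if self.last_addr is not None:
            stride = addr - self.last_addr
            ctx = tuple(self.context)
            if len(ctx) == self.n:
                predicted = self._predict_stride()
                if predicted is not None and predicted != stride:
                    self.misses += 1
                    if self.misses >= self.flush_limit:
                        # Too many mis-predictions: flush the learned model.
                        self.model.clear()
                        self.prefetched.clear()
                        self.misses = 0
                self.model[tuple(self.context)][stride] += 1
            self.context.append(stride)
        self.last_addr = addr
        nxt = self._predict_stride()
        if nxt is None:
            return None
        target = addr + nxt
        if target in self.prefetched:
            return None                         # already resident; skip re-fetch
        self.prefetched.add(target)
        return target
```

For example, a unit-stride scan trains the model after n + 1 strides, after which each access triggers a pre-fetch of the next address; re-visiting a pre-fetched address yields no new request.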

Keywords

SDSM · n-gram · pre-fetching · Markov chain

References

  1. Beyls, K., D'Hollander, E.: Compile-time cache hint generation for EPIC architectures. In: Proceedings of the 2nd Workshop on Explicitly Parallel Instruction Computing Architectures and Compiler Techniques (November 2002)
  2. Beyler, J.C., Clauss, P.: ESODYP: An entirely software and dynamic data prefetcher based on a Markov model. In: Proceedings of the 12th Workshop on Compilers for Parallel Computers, A Coruña, Spain, pp. 118–132 (January 2006)
  3. Brown, P.F., DeSouza, P.V., Mercer, R.L., Della Pietra, V.J., Lai, J.C.: Class-based n-gram models of natural language. Computational Linguistics 18(4) (December 1992)
  4. Veldema, R., Bhoedjang, R.A.F., Bal, H.E.: Jackal, a compiler-based implementation of Java for clusters of workstations. In: Proceedings of SIGPLAN's Principles and Practices of Parallel Computing, PPoPP 2001 (2001)
  5. Klemm, M., Beyler, J.C., Lampert, R.T., Philippsen, M., Clauss, P.: ESODYP+: Prefetching in the Jackal Software DSM. In: Kermarrec, A.-M., Bougé, L., Priol, T. (eds.) Euro-Par 2007. LNCS, vol. 4641, pp. 563–573. Springer, Heidelberg (2007)

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Sitaramaiah Ramisetti (1)
  • Rajeev Wankar (2)
  • C. R. Rao (2)
  1. CVR College of Engineering, Hyderabad, India
  2. Department of Computer and Information Sciences, University of Hyderabad, Hyderabad, India
