Compiling Data Parallel Tasks for Coordinated Execution
Many advanced scientific applications are heterogeneous and multidisciplinary in nature, consisting of multiple, independent modules. Such applications require efficient means of coordination among their program units. The programming language Opus was recently designed to assist in coordinating the execution of multiple, independent program modules. In this paper we address the problem of how to compile an Opus program such that it can be efficiently executed on a broad class of machines.
Keywords: Opus · Semantic Runtime System · Data Parallelism · Task Parallelism · Opus Compilation