MPI as a coordination layer for communicating HPF tasks


Creation Information

Foster, I.T.; Kohr, D.R. Jr.; Krishnaiyer, R. & Choudhary, A. December 31, 1996.

Context

This article is part of the collection entitled Office of Scientific & Technical Information Technical Reports and was provided by the UNT Libraries Government Documents Department to the UNT Digital Library, a digital repository hosted by the UNT Libraries. More information about this article can be viewed below.

Who

People and organizations associated with either the creation of this article or its content.

Authors

  • Foster, I. T.
  • Kohr, D. R. Jr.
  • Krishnaiyer, R.
  • Choudhary, A.

Provided By

UNT Libraries Government Documents Department

Serving as both a federal and a state depository library, the UNT Libraries Government Documents Department maintains millions of items in a variety of formats. The department is a member of the FDLP Content Partnerships Program and an Affiliated Archive of the National Archives.

What

Descriptive information to help identify this article. Follow the links below to find similar items on the Digital Library.

Description

Data-parallel languages such as High Performance Fortran (HPF) present a simple execution model in which a single thread of control performs high-level operations on distributed arrays. These languages can greatly ease the development of parallel programs. Yet there are large classes of applications for which a mixture of task and data parallelism is most appropriate. Such applications can be structured as collections of data-parallel tasks that communicate by using explicit message passing. Because the Message Passing Interface (MPI) defines standardized, familiar mechanisms for this communication model, the authors propose that HPF tasks communicate by making calls to a coordination library that provides an HPF binding for MPI. The semantics of a communication interface for sequential languages can be ambiguous when the interface is invoked from a parallel language; they show how these ambiguities can be resolved by describing one possible HPF binding for MPI. They then present the design of a library that implements this binding, discuss issues that influenced the design decisions, and evaluate the performance of a prototype HPF/MPI library using a communications microbenchmark and an application kernel. Finally, they discuss how MPI features might be incorporated into the design framework.
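
To make this communication model concrete, the minimal sketch below shows two tasks exchanging an array through point-to-point calls in the standard MPI Fortran binding. It is offered only as an illustration of MPI-style message passing between tasks, not as the HPF/MPI coordination library the paper describes; the two-task layout, array length, and message tag are assumptions chosen for the example.

    program task_pipeline
      use mpi
      implicit none
      integer, parameter :: n = 1024        ! array length (assumed for illustration)
      real :: a(n)
      integer :: rank, ierr
      integer :: status(MPI_STATUS_SIZE)

      call MPI_INIT(ierr)
      call MPI_COMM_RANK(MPI_COMM_WORLD, rank, ierr)

      if (rank == 0) then
        ! Task 0 acts as a producer: fill the array and send it to task 1.
        a = 1.0
        call MPI_SEND(a, n, MPI_REAL, 1, 0, MPI_COMM_WORLD, ierr)
      else if (rank == 1) then
        ! Task 1 acts as a consumer: receive the array and report a checksum.
        call MPI_RECV(a, n, MPI_REAL, 0, 0, MPI_COMM_WORLD, status, ierr)
        print *, 'task 1 received, sum =', sum(a)
      end if

      call MPI_FINALIZE(ierr)
    end program task_pipeline

In the binding the authors propose, such calls would instead be made from within data-parallel HPF tasks, so that distributed arrays rather than per-process buffers are the objects being communicated.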

Physical Description

12 p.

Notes

OSTI as DE97000693

Source

  • 1996 Message Passing Interface (MPI) developers conference, Notre Dame, IN (United States), 1-2 Jul 1996

Language

  • English

Item Type

  • Article

Identifier

Unique identifying numbers for this article in the Digital Library or other systems.

  • Other: DE97000693
  • Report No.: ANL/MCS-P--597-0596
  • Report No.: CONF-9607124--5
  • Grant Number: W-31109-ENG-38
  • Office of Scientific & Technical Information Report Number: 418494
  • Archival Resource Key: ark:/67531/metadc677660

Collections

This article is part of the following collection of related materials.

Office of Scientific & Technical Information Technical Reports

When

Dates and time periods associated with this article.

Creation Date

  • December 31, 1996

Added to The UNT Digital Library

  • July 25, 2015, 2:20 a.m.

Description Last Updated

  • Aug. 24, 2016, 1:38 p.m.


Citations, Rights, Re-Use

Foster, I.T.; Kohr, D.R. Jr.; Krishnaiyer, R. & Choudhary, A. MPI as a coordination layer for communicating HPF tasks, article, December 31, 1996; Illinois. (digital.library.unt.edu/ark:/67531/metadc677660/: accessed September 23, 2017), University of North Texas Libraries, Digital Library, digital.library.unt.edu; crediting UNT Libraries Government Documents Department.