1
Survey of MPI Call Usage
Daniel Han, USC
Terry Jones, LLNL
August 12, 2004
UCRL-PRES-206265
2
Outline
  • Motivation
  • About the Applications
  • Statistics Gathered
  • Inferences
  • Future Work

3
Motivation
  • Info for app developers
    • Information on the expense of basic MPI functions (recode?)
    • Set expectations
  • Many tradeoffs available in MPI design
    • Memory allocation decisions
    • Protocol cutoff point decisions
    • Where is additional code complexity worth it?
  • Information on MPI usage is scarce
  • New tools (e.g. mpiP) make profiling reasonable (see the sketch after this list)
    • Easy to incorporate (no source code changes)
    • Easy to interpret
    • Unobtrusive observation (little performance impact)
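A link-time profiler such as mpiP can gather call statistics without source changes because it intercepts MPI through the standard PMPI profiling interface. Below is a minimal sketch of that interception mechanism, not mpiP's actual code: a wrapper MPI_Send that counts calls and bytes, then forwards to the real implementation.

  /* Minimal sketch of link-time MPI profiling via the standard PMPI
     profiling interface (the mechanism mpiP builds on); an illustration
     only, not mpiP's implementation. */
  #include <mpi.h>

  static long send_calls = 0;   /* number of MPI_Send calls observed */
  static long send_bytes = 0;   /* total bytes passed to MPI_Send    */

  int MPI_Send(const void *buf, int count, MPI_Datatype type,
               int dest, int tag, MPI_Comm comm)
  {
      int size;
      PMPI_Type_size(type, &size);                 /* datatype size in bytes */
      send_calls += 1;
      send_bytes += (long)count * size;
      return PMPI_Send(buf, count, type, dest, tag, comm);   /* real send */
  }

Linking a wrapper library like this ahead of the MPI library instruments an unmodified application, which is why integration effort and run-time overhead stay low.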

4
About the Applications
  • Amtran: discrete coordinate neutron transport
  • Ares: 3-D simulation of instability in massive star supernova envelopes
  • Ardra: neutron transport/radiation diffusion code exploring new numerical algorithms and methods for the solution of the Boltzmann Transport Equation (e.g. nuclear imaging)
  • Geodyne: Eulerian adaptive mesh refinement (e.g. comet-earth impacts)
  • IRS: solves the radiation transport equation by the flux-limiting diffusion approximation using an implicit matrix solution
  • Mdcask: molecular dynamics code for studying radiation damage in metals
  • Linpack/HPL: solves a random dense linear system
  • Miranda: hydrodynamics code simulating instability growth
  • Smg: a parallel semicoarsening multigrid solver for the linear systems arising from finite difference, finite volume, or finite element discretizations
  • Spheral: provides a steerable parallel environment for performing coupled hydrodynamical and gravitational numerical simulations (http://sourceforge.net/projects/spheral)
  • Sweep3d: solves a 1-group neutron transport problem
  • Umt2k: photon transport code for unstructured meshes
5
Percent of time to MPI
6
Top MPI Point-to-Point Calls
7
Top MPI Collective Calls
8
Comparing Collective and Point-to-Point
9
Average Number of Calls for Most Common MPI Functions (Large Runs)
10
Communication Patterns: most dominant msgsize
11
Communication Patterns (continued)
12
Frequency of callsites by MPI functions
13
Scalability
14
Observations Summary
  • General
    • People seem to scale code to about 60% MPI/communication
    • Isend/Irecv/Wait many times more prevalent than Sendrecv and blocking send/recv (see the sketch after this list)
    • Time spent in collectives predominantly divided among barrier, allreduce, broadcast, gather, and alltoall
    • Most common msgsize is typically between 1 KB and 1 MB
  • Surprises
    • Waitany most prevalent call
    • Almost all pt2pt messages are the same size within a run
    • Often, message size decreases with larger runs
    • Some codes driven by alltoall performance
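To make the dominant point-to-point shape concrete, here is a minimal sketch (not taken from any surveyed code; the neighbor list, tag, and message size are illustrative placeholders) of a nonblocking exchange that posts MPI_Irecv/MPI_Isend to each neighbor and drains completions with MPI_Waitany.

  /* Sketch of the dominant pattern: nonblocking Isend/Irecv to a set of
     neighbors, completed with MPI_Waitany. */
  #include <mpi.h>
  #include <stdlib.h>

  void exchange(MPI_Comm comm, const int *nbrs, int nnbrs,
                double *sendbuf, double *recvbuf, int count)
  {
      MPI_Request *reqs = malloc(2 * nnbrs * sizeof(MPI_Request));

      for (int i = 0; i < nnbrs; i++) {
          MPI_Irecv(recvbuf + (size_t)i * count, count, MPI_DOUBLE,
                    nbrs[i], 0, comm, &reqs[i]);
          MPI_Isend(sendbuf + (size_t)i * count, count, MPI_DOUBLE,
                    nbrs[i], 0, comm, &reqs[nnbrs + i]);
      }

      /* Drain completions one at a time; Waitany lets the caller start
         processing each message as soon as it arrives. */
      for (int done = 0; done < 2 * nnbrs; done++) {
          int idx;
          MPI_Waitany(2 * nnbrs, reqs, &idx, MPI_STATUS_IGNORE);
          /* ...work on the message associated with request idx... */
      }

      free(reqs);
  }

With many outstanding requests completed one by one, this shape is consistent with Waitany topping the call counts even though it moves no data itself.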

15
Future Work and Concluding Remarks
  • Further understanding of apps needed
  • Results for other test configurations
  • When can apps make better use of collectives?
  • MPI-IO usage info needed
  • Classified applications
  • Acknowledgements
    • mpiP is due to Jeffrey Vetter and Chris Chambreau: http://www.llnl.gov/CASC/mpip
    • This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48.