Albert-Einstein-Institut (www.aei-potsdam.mpg.de): Presentation Transcript

1
Cactus: Developing Parallel Computational Tools
to Study Black Hole, Neutron Star (or
Airplane...) Collisions
  • Solving Einstein's Equations, Black Holes, and
    Gravitational Wave Astronomy
  • Cactus, a new community simulation code framework
  • Toolkit for many PDE systems
  • Suite of solvers for Einstein and astrophysics
    systems
  • Recent Simulations using Cactus
  • Black Hole Collisions, Neutron Star Collisions
  • Collapse of Gravitational Waves
  • Aerospace test project
  • Metacomputing for the general user: what a
    scientist really wants and needs
  • Distributed Computing Experiments with
    Cactus/Globus

Ed Seidel, Albert-Einstein-Institut (MPI für
Gravitationsphysik), NCSA/U of Illinois
2
Einstein's Equations and Gravitational Waves
  • Einstein's General Relativity
  • Fundamental theory of Physics (Gravity)
  • Among most complex equations of physics
  • Dozens of coupled, nonlinear hyperbolic-elliptic
    equations with 1000s of terms
  • Barely have the capability to solve them after a century
  • Predict black holes, gravitational waves, etc.
  • Exciting new field about to be born:
    Gravitational Wave Astronomy
  • Fundamentally new information about Universe
  • What are gravitational waves? Ripples in
    spacetime curvature, caused by matter motion,
    causing distances to change
  • A last major test of Einstein's theory: do they
    exist?
  • Eddington: "Gravitational waves propagate at the
    speed of thought"
  • 1993 Nobel Prize Committee: Hulse-Taylor Pulsar
    (indirect evidence)
  • 20xx Nobel Committee: ??? (for actual
    detection)

3
Waveforms We Want to Compute: What Happens in
Nature...
PACS Virtual Machine Room
4
Black Holes: Excellent Sources of Waves
  • Need Cosmic Cataclysms to provide strong waves!
  • BHs have very strong gravity, collide near speed
    of light, have 3-100 solar masses!
  • May collide frequently
  • Not very often in our local region of space, but...
  • Perhaps 3 per year within 200 Mpc, the range of
    detectors
  • Need to have some idea what the signals will look
    like if we are to detect and understand them

5
Einstein Equations: New Formulations, New
Capabilities
  • Einstein Eqs.: G_μν(g_ij) = 8π T_μν
  • Traditional evolution equations: ADM
  • ∂²_t γ = S(α, β, γ, ∂γ, ∂²γ)
    (think Maxwell: ∂E/∂t = ∇×B, ∂B/∂t = −∇×E)
  • S(α, β, γ, ∂γ, ∂²γ) has thousands of terms
    (very ugly!)
  • 4 nonlinear elliptic constraints
    (think Maxwell: ∇·B = ∇·E = 0)
  • 4 gauge conditions (often elliptic)
    (think Maxwell: A → A + ∇Λ)
  • Numerical methods ad hoc; not manifestly
    hyperbolic
  • NEW: First-order symmetric hyperbolic form
    (both forms written out below)
  • ∂_t u + ∂_i F^i(u) = S(u)
  • u is a vector of many fields, typically of order
    50
  • Complete set of Eigenfields (under certain
    conditions)
  • Many variations on these formulations, dozens of
    papers since 1992
  • Elliptic equations still there
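
For readability, here are the two evolution forms contrasted above
written out as display equations; this is a schematic sketch only
(index placement and the exact content of the source terms vary
between papers):

    % ADM-style form: second order in time, with a huge source term S
    \partial_t^2 \gamma_{ij} =
        S_{ij}\left(\alpha, \beta^k, \gamma_{kl},
                    \partial\gamma_{kl}, \partial^2\gamma_{kl}\right)

    % First-order symmetric hyperbolic form: state vector u of ~50 fields
    \partial_t u + \partial_i F^i(u) = S(u)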

6
Computational Needs for 3D Numerical Relativity
  • Explicit Finite Difference Codes
  • 10⁴ Flops/zone/time step
  • 100 3D arrays
  • Require 1000³ zones or more
  • 1000 GBytes
  • Double resolution: 8× memory, 16× Flops
    (see the estimate below)
  • TFlop, Tbyte machine required
  • Parallel AMR, I/O essential
  • A code that can do this could be useful to other
    projects (we said this in all our grant
    proposals)!
  • Last 2 years devoted to making this useful across
    disciplines
  • All tools used for these complex simulations
    available for other branches of science,
    engineering...
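
A rough back-of-the-envelope check of the numbers above (assuming
double precision, i.e. 8 bytes per value):

    100 arrays × 1000³ zones × 8 bytes ≈ 0.8 TByte of grid data
    10⁴ Flops/zone/step × 1000³ zones  ≈ 10¹³ Flops per time step
    Doubling resolution: 2³ = 8× the zones, hence 8× the memory;
    8× the work per step × 2× as many steps (halved Δt) = 16× Flops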

(Schematic: initial data at t=0 requires solving 4
coupled nonlinear elliptic equations; evolution to
t=100 is a hyperbolic evolution coupled with
elliptic equations.)
7
Any Such Computation Requires Incredible Mix of
Varied Technologies and Expertise!
  • Many Scientific/Engineering Components
  • Physics, astrophysics, CFD, engineering,...
  • Many Numerical Algorithm Components
  • Finite difference methods? Unstructured meshes?
  • Elliptic equations: multigrid, Krylov subspace,
    preconditioners, ...
  • Mesh Refinement?
  • Many Different Computational Components
  • Parallelism (HPF, MPI, PVM, ???)
  • Architecture Efficiency (MPP, DSM, Vector, PC
    Clusters, ???)
  • I/O Bottlenecks (generate gigabytes per
    simulation, checkpointing)
  • Visualization of all that comes out!
  • The scientist wants to focus on the top bullet,
    but all are required for results...

8
This is the fundamental question addressed by Cactus.
  • Clearly need teams, with huge expertise base to
    attack such problems...
  • In fact, need collections of communities to solve
    such problems...
  • But how can they work together effectively?
  • We need a simulation code environment that
    encourages this...
  • These are the fundamental issues addressed by
    Cactus.
  • Providing advanced computational science to
    scientists/engineers
  • Providing collaborative infrastructure for
    large groups

9
Grand Challenges: NSF Black Hole and NASA
Neutron Star Projects
  • NCSA/Illinois/AEI (Saylor, Seidel, Swesty,
    Norman)
  • Argonne (Foster)
  • Washington U (Suen)
  • Livermore (Ashby)
  • Stony Brook (Lattimer)
  • University of Texas (Matzner, Browne),
  • NCSA/Illinois/AEI (Seidel, Saylor, Smarr,
    Shapiro, Saied)
  • North Carolina (Evans, York)
  • Syracuse (G. Fox)
  • Cornell (Teukolsky)
  • Pittsburgh (Winicour)
  • Penn State (Laguna, Finn)

NEW! EU Network
10
What we learn from the Grand Challenges
  • Successful, but also problematic
  • No existing infrastructure to support
    collaborative HPC
  • Many scientists are bad Fortran programmers, and
    NOT computer scientists (especially physicists
    like me)
  • Many sociological issues of large collaborations
    and different cultures
  • Many language barriers...
  • Applied mathematicians, computational scientists,
    and physicists have very different concepts and
    vocabularies
  • Code fragments, styles, routines often clash
  • Successfully merged code (after years) often
    impossible to transplant into more modern
    infrastructure (e.g., add AMR or switch to MPI)
  • Many serious problems...

11
Cactus: a new concept in community-developed
simulation code infrastructure
  • Developed as response to needs of these projects
  • Numerical/computational infrastructure to solve
    PDEs. Freely available, open community source
    code, in the spirit of GNU/Linux
  • Cactus divided into Flesh (core) and Thorns
    (modules, or collections of subroutines)
  • User apps can be Fortran, C, or C++, with an
    automated interface between them (see the thorn
    sketch after this list)
  • Parallelism abstracted and hidden (if desired)
    from the user
  • User specifies the flow (when to call thorns);
    the code switches memory on/off
  • Many parallel utilities / features enabled by
    Cactus
  • (Nearly) all architectures supported:
  • Dec Alpha / SGI Origin 2000 / T3E / Linux
    clusters / laptops / Hitachi / NEC / HP /
    Windows NT / SP2 / Sun
  • Code portability, migration to new architectures
    very easy!
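
To illustrate what "parallelism abstracted and hidden" looks like
from the user's side, here is a minimal sketch of a thorn evolution
routine in C. The thorn name WaveToy and the grid function phi are
hypothetical placeholders; the CCTK_* macros and cctk_lsh come from
the Cactus flesh interface generated from the CCL files.

    #include "cctk.h"
    #include "cctk_Arguments.h"
    #include "cctk_Parameters.h"

    /* Hypothetical evolution routine for a thorn "WaveToy".
       The argument list is generated by the flesh from interface.ccl,
       so the same source runs unchanged on 1 or 1024 processors;
       the driver thorn handles domain decomposition and ghost zones. */
    void WaveToy_Evolve(CCTK_ARGUMENTS)
    {
      DECLARE_CCTK_ARGUMENTS;    /* grid functions, local grid sizes, ... */
      DECLARE_CCTK_PARAMETERS;   /* parameters declared in param.ccl      */

      int i, j, k;

      /* Loop only over the processor-local part of the grid. */
      for (k = 0; k < cctk_lsh[2]; k++)
        for (j = 0; j < cctk_lsh[1]; j++)
          for (i = 0; i < cctk_lsh[0]; i++)
          {
            const int idx = CCTK_GFINDEX3D(cctkGH, i, j, k);
            phi[idx] = phi_p[idx];   /* placeholder update using the
                                        previous time level phi_p     */
          }
    }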

12
Modularity of Cactus...
(Diagram: applications and sub-applications plug
into the Cactus Flesh; the user selects the desired
functionality from interchangeable layers such as an
MPI layer, an I/O layer, AMR (GrACE, etc.), remote
steering, and Globus metacomputing services.)
13
Computational Toolkit: provides parallel
utilities (thorns) for the computational scientist
  • Cactus is a framework or middleware for unifying
    and incorporating code from Thorns developed by
    the community
  • Choice of parallel library layers (Native MPI,
    MPICH, MPICH-G(2), LAM, WMPI, PACX and HPVM)
  • Portable, efficient (T3E, SGI, Dec Alpha, Linux,
    NT Clusters)
  • 3 mesh refinement schemes: Nested Boxes, GrACE,
    HLL (coming)
  • Parallel I/O (Panda, FlexIO, HDF5, etc)
  • Parameter Parsing
  • Elliptic solvers (PETSc, Multigrid, SOR, etc)
  • Visualization Tools, Remote steering tools, etc
  • Globus (metacomputing/resource management)
  • Performance analysis tools (Autopilot, PAPI,
    etc)
  • INSERT YOUR CS MODULE HERE...

14
PAPI
  • Standard API for accessing the hardware
    performance counters on most microprocessors.
  • Useful for tuning, optimization, debugging,
    benchmarking, etc.
  • Java GUI available for monitoring the metrics
  • Cactus thorn: CactusPerformance/PAPI (a sketch
    using the PAPI C API follows below)

http://icl.cs.utk.edu/projects/papi/
http://www.cactuscode.org/Documentation/HOWTO/Performance-HOWTO
http://www.cactuscode.org/Projects.html
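
For readers unfamiliar with PAPI, the sketch below shows the flavour
of its low-level C API outside of Cactus (event availability depends
on the processor; this is an illustrative sketch, not the
CactusPerformance/PAPI thorn itself):

    #include <stdio.h>
    #include "papi.h"

    int main(void)
    {
      int eventset = PAPI_NULL;
      long long counts[2];

      /* Initialise PAPI and build an event set with two common events. */
      PAPI_library_init(PAPI_VER_CURRENT);
      PAPI_create_eventset(&eventset);
      PAPI_add_event(eventset, PAPI_FP_OPS);   /* floating-point operations */
      PAPI_add_event(eventset, PAPI_TOT_CYC);  /* total cycles              */

      PAPI_start(eventset);
      /* ... numerical kernel to be measured goes here ... */
      PAPI_stop(eventset, counts);

      printf("FP ops: %lld   cycles: %lld\n", counts[0], counts[1]);
      return 0;
    }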
15
GrACE
  • Parallel/distributed AMR via a C++ library
  • Abstracts Grid Hierarchies, Grid Functions and
    Grid Geometries
  • CactusPAGH will include a driver thorn which uses
    GrACE to provide AMR (KDI ASC Project)

http://www.caip.rutgers.edu/parashar/TASSL/Projects/GrACE/index.html
http://www.cactuscode.org/Workshops/NCSA99/talk23/index.htm
16
How to use Cactus: Avoiding the MONSTER code
syndrome...
  • Optional: Develop thorns, according to some
    rules
  • e.g. specify variables through interface.ccl
  • Specify the calling sequence of the thorns for a
    given problem and algorithm (schedule.ccl)
  • Specify which thorns are desired for the
    simulation (Einstein equations + special method 1
    + HRSC hydro + wave finder + AMR + live
    visualization module + remote steering tool)
  • The specified code is then created, with only
    those modules, those variables, those I/O
    routines, this MPI layer, that AMR system, ...,
    that are needed
  • Subroutine calling lists generated automatically
  • Automatically created for the desired computer
    architecture
  • Run it (an illustrative parameter file is
    sketched below)
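
As an illustration of "specify which thorns are desired" and "run
it": at run time Cactus reads a parameter file that activates thorns
and sets their parameters. The sketch below is illustrative only; the
thorn and parameter names are placeholders rather than the exact
modules listed above.

    # wavetoy.par -- illustrative Cactus parameter file
    ActiveThorns = "PUGH PUGHSlab IOBasic IOASCII WaveToy"

    driver::global_nx = 64        # global grid size (handled by the driver thorn)
    driver::global_ny = 64
    driver::global_nz = 64

    cactus::cctk_itlast = 100     # number of evolution steps

    IOBasic::outInfo_every = 10   # screen output frequency
    IOBasic::outInfo_vars  = "wavetoy::phi"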

17
Cactus Computational Tool Kit
  • Flesh (core) written in C
  • Thorns (modules) grouped in packages, written in
    F77, F90, C, C++
  • Thorn-Flesh interface fixed in 3 files written in
    CCL (Cactus Configuration Language); see the
    sketch after this list
  • interface.ccl: Grid Functions, Arrays, Scalars
    (integer, real, logical, complex)
  • param.ccl: Parameters and their allowed values
  • schedule.ccl: Entry points of routines, dynamic
    memory and communication allocations
  • Object-oriented features for thorns (public,
    private, protected variables, implementations,
    inheritance) for clearer interfaces
  • Compilation
  • PERL parses the CCL files and creates the
    flesh-thorn interface code at compile time
  • Particularly important for the Fortran-C
    interface: Fortran argument lists must be known
    at compile time, but depend on the thorn list
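
To give a flavour of the three CCL files, here is a heavily trimmed
sketch for a hypothetical thorn (names are illustrative; real thorns
declare considerably more metadata):

    # interface.ccl -- declare the thorn's grid variables
    implements: wavetoy
    inherits: grid

    CCTK_REAL scalarfield TYPE=GF TIMELEVELS=2
    {
      phi
    } "Evolved scalar field"

    # param.ccl -- declare parameters and their allowed ranges
    REAL amplitude "Initial amplitude of the wave"
    {
      0.0:* :: "Any non-negative value"
    } 1.0

    # schedule.ccl -- say when routines run and what storage they need
    STORAGE: scalarfield[2]

    schedule WaveToy_Evolve AT evol
    {
      LANG: C
    } "Evolve the scalar field one time step"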

18
High performance: Full 3D Einstein equations
solved on NCSA NT Supercluster, Origin 2000, T3E
  • Excellent scaling on many architectures
  • Origin: up to 256 processors
  • T3E: up to 1024 processors
  • NCSA NT cluster: up to 128 processors
  • Achieved 142 Gflop/s on a 1024-node T3E-1200
    (benchmarked for the NASA NS Grand Challenge)
  • But, of course, we want much more: metacomputing,
    meaning connected computers...

19
Cactus Development Projects
(Diagram: Cactus development projects and partners,
spanning applications and computational science:
numerical relativity and astrophysics, geophysics,
DLR aerospace, AEI Cactus Group (Allen), EU Network
(Seidel), NSF KDI (Suen), NASA Round 2 (Saylor) /
Round 3??, DFN Gigabit (Seidel), Grid Forum, Egrid,
GrADS, NCSA, Microsoft.)
20
Applications
  • Black Holes (prime source for GW)
  • Increasingly complex collisions; now doing
    full 3D grazing collisions
  • Gravitational Waves
  • Study linear waves as testbeds
  • Move on to fully nonlinear waves
  • Interesting physics: BH formation in full 3D!
  • Neutron Stars
  • Developing capability to do full GR hydro
  • Now can follow full orbits!
  • DLR project working to explore capabilities for
    aerospace industry

21
Evolving Pure Gravitational Waves
  • Einstein's equations are nonlinear, so
    low-amplitude waves just propagate away, but
    large-amplitude waves may
  • Collapse on themselves under their own
    self-gravity and actually form black holes
  • Use numerical relativity: probe GR in the highly
    nonlinear regime
  • Form BHs? Critical phenomena in 3D? Naked
    singularities?
  • Little known about generic 3D behavior
  • Take a lump of waves and evolve it
  • Large amplitude: a BH forms!
  • Below a critical value: the waves disperse and can
    evolve forever as the system returns to flat space
  • We are seeing hints of critical phenomena, known
    from nonlinear dynamics

22
Comparison: sub- vs. super-critical solutions
Supercritical: a BH forms!
Subcritical: no BH forms
Newman-Penrose Ψ₄ (showing gravitational
waves) with the lapse α underneath
23
Numerical Black Hole Evolutions
  • Binary IVP: multiple wormhole model, other
    models
  • Black holes: good candidates for gravitational
    wave astronomy
  • ~3 events per year within 200 Mpc
  • But what are the waveforms?
  • GW astronomers want to know!

24
Now try the first 3D grazing collision: a big step.
Spinning, orbiting, unequal-mass BHs merging;
horizon merger.
Alcubierre et al. results:
384³, 100 GB simulation, the largest production
relativity simulation (256-processor Origin 2000).
Evolution of Ψ₄ in the x-z plane (rotation plane of
the BHs)
25
Our Team Requires Grid Technologies, Big
Machines for Big Runs
(Collaborating sites: Paris, Hong Kong, ZIB, NCSA,
AEI, WashU, Thessaloniki)
  • How do we
  • maintain/develop the code?
  • manage computer resources?
  • carry out/monitor simulations?

PACS Virtual Machine Room
26

Aerospace Applications
  • Cactus Portal, Distributed Simulation under
    active development at NASA-Ames
  • Deutsches Luft- und Raumfahrtzentrum (DLR) Pilot
    Project
  • A CFD code (Navier-Stokes with turbulence model,
    or Euler) with special extensions to calculate
    turbine streams. Can be used for "normal" CFD
    problems as well.
  • Based on a finite-volume discretization on a
    block-structured, regular Cartesian grid.
  • Currently has a simple MPI parallelization.
  • Being plugged into Cactus to evaluate

27
What we need and want in simulation science: a
Portal to provide the following...
  • Got an idea? Write a Cactus module, link it to
    other modules, and then...
  • Find resources
  • Where? NCSA, SDSC, Garching, Boeing???
  • How many computers? Distribute Simulations?
  • Big jobs: Fermilab at your disposal; must get it
    right while the beam is on!
  • Launch Simulation
  • How do we get the executable there?
  • How to store data?
  • What are the local queue structures / OS
    idiosyncrasies?
  • Monitor the simulation
  • Remote visualization, live while running
  • Limited bandwidth: compute viz inline with the
    simulation
  • High bandwidth: ship data to be visualized
    locally
  • Visualization server: all privileged users can
    log in and check status/adjust if necessary
  • Are parameters screwed up? Very complex!
  • Call in an expert colleague; let her watch it too
  • Steer the simulation
  • Is memory running low? AMR! What to do? Refine
    selectively or acquire additional resources via
    Globus? Delete unnecessary grids?
  • Postprocessing and analysis

28
A Portal to Computational Science: The Cactus
Collaboratory
1. User has science idea...
2. Composes/Builds Code Components w/Interface...
3. Selects Appropriate Resources...
4. Steers simulation, monitors performance...
5. Collaborators log in to monitor...
Want to integrate and migrate this technology to
the generic user...
29
Remote Visualization
(Screenshots of remote visualization clients:
OpenDX, Amira, and LCA Vision; contour plots
(download), isosurfaces and geodesics, and grid
functions via streaming HDF5.)
30
Remote Visualization Tools under Development
  • Live data streaming from the Cactus simulation to
    a viz client
  • Clients: OpenDX, Amira, LCA Vision, Xgraph
  • Protocols:
  • Precomputed viz run inline with the simulation
  • Isosurfaces, geodesics
  • HTTP
  • Parameters, xgraph data, JPEGs, viewed and
    controlled from any web browser
  • Streaming HDF5: sends raw data from the resident
    memory of the supercomputer
  • HDF5 provides downsampling and hyperslabbing
    (see the sketch after this list)
  • All of the above data, and all possible HDF5 data
    (e.g. 2D/3D)
  • Two different technologies:
  • Streaming Virtual File Driver (I/O rerouted over a
    network stream)
  • XML wrapper (HDF5 calls wrapped and translated
    into XML)
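
Downsampling and hyperslabbing are standard HDF5 selection features;
the sketch below shows the idea on a local file using the plain
(modern) HDF5 C API. The file name, dataset name, and sizes are made
up for illustration, and the Cactus streaming file driver adds the
network transport on top of selections like this.

    #include "hdf5.h"

    /* Read every 4th point of a 3D dataset "phi" from data.h5:
       a strided hyperslab selection that downsamples 64^3 to 16^3. */
    int main(void)
    {
      hsize_t start[3]  = {0, 0, 0};
      hsize_t stride[3] = {4, 4, 4};
      hsize_t count[3]  = {16, 16, 16};
      static double buffer[16 * 16 * 16];

      hid_t file   = H5Fopen("data.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
      hid_t dset   = H5Dopen(file, "phi", H5P_DEFAULT);
      hid_t fspace = H5Dget_space(dset);
      hid_t mspace = H5Screate_simple(3, count, NULL);

      /* Select the strided hyperslab in the file, then read it. */
      H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, stride, count, NULL);
      H5Dread(dset, H5T_NATIVE_DOUBLE, mspace, fspace, H5P_DEFAULT, buffer);

      H5Sclose(mspace); H5Sclose(fspace); H5Dclose(dset); H5Fclose(file);
      return 0;
    }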

31
Remote Steering
(Diagram: remote viz data streamed over HTTP, XML,
and HDF5 between the Cactus simulation and any viz
client, e.g. Amira.)
32
Remote Steering
  • Stream parameters from the Cactus simulation to a
    remote client, which changes parameters (GUI,
    command line, viz tool) and streams them back to
    Cactus, where they change the state of the
    simulation.
  • Cactus has a special STEERABLE tag for
    parameters, indicating that it makes sense to
    change them during a simulation and that there is
    support for changing them (see the param.ccl
    fragment after this list).
  • Examples: I/O parameters, frequency, fields
  • Current protocols
  • XML (HDF5) to standalone GUI
  • HDF5 to viz tools (Amira, Open DX, LCA Vision,
    etc)
  • HTTP to Web browser (HTML forms)
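
The STEERABLE tag lives in a thorn's param.ccl. A minimal
illustrative declaration (the parameter name is a made-up example)
might look like this:

    # param.ccl fragment: this parameter may be changed while the run is live
    INT out_every "How often (in iterations) to produce output" STEERABLE = ALWAYS
    {
      1:* :: "Any positive integer"
    } 10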

33
Remote Offline Visualization
(Diagram: a visualization client in Berlin requests
downsampled data and hyperslabs from a remote data
server holding 2 TByte at NCSA.)
34
Metacomputing: harnessing power when and where
it is needed
  • Einstein equations typical of apps that require
    extreme memory, speed
  • Largest supercomputers too small!
  • Networks very fast!
  • vBNS, etc. in the US
  • DFN Gigabit testbed: 622 Mbit/s
    Potsdam-Berlin-Garching, connecting multiple
    supercomputers
  • International gigabit networking possible
  • Connect workstations to make a supercomputer
  • Acquire resources dynamically during simulation!
  • AMR, analysis, etc...
  • Seamless computing and visualization from
    anywhere
  • Many metacomputing experiments in progress
  • Current ANL/SDSC/NCSA/NERSC experiment in
    progress...

35
Metacomputing the Einstein Equations: Connecting
T3Es in Berlin, Garching, San Diego
Want to migrate this technology to the generic
user...
36
Grand Picture
(Diagram: Grid-enabled Cactus runs on distributed
machines (T3E Garching, Origin NCSA) coupled via
Globus, with simulations launched from the Cactus
Portal; data flows via DataGrid/DPSS with
downsampling, isosurfaces, HTTP, and HDF5 to remote
viz and steering in Berlin, remote viz in St. Louis,
remote steering and monitoring from an airport, and
viz of data from previous simulations in an SF café.)
37
The Future
  • Gravitational wave astronomy is almost here; we
    must be able to solve Einstein's equations in
    detail to understand the new observations
  • New codes, strong collaborations, bigger
    computers, and new formulations of the Einstein
    equations are together enabling much new progress.
  • Cactus Computational Toolkit: developed originally
    for Einstein's equations, available now for many
    applications (NOT an astrophysics code!)
  • Useful as a parallel toolkit for many
    applications, provides portability from laptop to
    many parallel architectures (e.g. cluster of
    iPaqs!)
  • Many advanced collaborative tools, and a portal
    for code composition, resource selection,
    computational steering, and remote viz, under
    development
  • Advanced Grid-based metacomputing tools are
    maturing...

38
Further details...
  • Cactus
  • http://www.cactuscode.org
  • http://www.computer.org/computer/articles/einstein_1299_1.htm
  • Movies, research overview (needs major updating)
  • http://jean-luc.ncsa.uiuc.edu
  • Simulation Collaboratory/Portal Work
  • http://wugrav.wustl.edu/ASC/mainFrame.html
  • Remote Steering, high speed networking
  • http://www.zib.de/Visual/projects/TIKSL/
  • http://jean-luc.ncsa.uiuc.edu/Projects/Gigabit/
  • EU Astrophysics Network
  • http://www.aei-potsdam.mpg.de/research/astro/eu_network/index.html