HPCMP Overview for the CASC Tools for Discovery
1
HPCMP Overview for the CASC Tools for Discovery
Department of Defense
High Performance Computing Modernization Program
  • Cray J. Henry
  • March 2005

http://www.hpcmo.hpc.mil
4
Agenda
  • Background & History
  • DoD Computational Science Community
  • Examples of Research and Engineering
    Accomplishments
  • HPCMP
  • Resource Allocation
  • HPC Centers
  • Networking
  • Software Applications Support
  • An Opportunity

5
A Quick History
  • Program initiation 1992–93
  • HPC Modernization Plan
  • Initial program structure established
  • Initial HPC capabilities provided
  • Program formalization 1994–95
  • Program office established
  • DoD oversight process implemented
  • Program structure and customer base expanded
  • Major acquisitions 1995–96
  • Four major shared resource centers (MSRCs)
  • Defense Research and Engineering Network (DREN)
  • Program upgrades and operations 1997–2000
  • Continuous upgrades at MSRCs
  • Selection of new distributed centers (DCs)
  • DoD Challenge Projects established
  • A new round of acquisitions 2001–03
  • New DREN contract let with WorldCom
  • New support contracts at MSRCs
  • Annual technology insertion process at MSRCs
  • Program upgrades and operations 2004–09

6
Program Scope
"This plan covers HPC in support of the DoD
science and technology program under the
cognizance of the DUSD(S&T)."
(DoD HPC Modernization Plan, Jun 94)

"... funds appropriated in this Act for the HPC
Modernization Program shall be made available
only for the upgrade, purchase, or modernization
of supercomputing capability and capacity at DoD
science and technology sites under the cognizance
of the DUSD(S&T) and DoD test and evaluation
facilities under the Director of Test and
Evaluation, OUSD(A&T): Provided, That the
contracts, contract modifications, contract
options, or other agreements are awarded as the
result of full and open competition based upon
the requirements of the user."
(National Defense Appropriations Act for FY95)

"... funds appropriated in this Act for the High
Performance Computing Modernization Program ...
(1) the DoD Science and Technology sites under the
cognizance of the DUSD(S&T), (2) the DoD Test and
Evaluation centers under the Director, Test and
Evaluation, OUSD(A&T), and (3) the Ballistic
Missile Defense Organization: Provided, That the
contracts, contract modifications, or contract
options are awarded upon the requirements of the
users ..."
(National Defense Appropriations Act for FY96)
7
HPCMP Chain of Command
8
Current User Base and Requirements
  • 561 projects and 4,572 users at approximately 179
    sites
  • Requirements categorized in 10 Computational
    Technology Areas (CTA)
  • FY 2005 non-real-time requirements of 260
    teraFLOPS-years

60 users are self-characterized as "other"
9
Computational Technology Areas (CTAs)
10
Key Characteristics by Primary CTA
(Charts: number of projects and number of users
by primary CTA, FY 05)
12
Examples of Research & Engineering
Accomplishments
13
Stores Certification Testing
For Official Use Only
HPC corrects wind tunnel data for interference
effects on the munition.
Stores Certification Issues: The Old Way
14
Our Ambitions Are Great: Multiple-scale/
Multi-component Applications Codes
(Flowchart: USER REQ'MENT, M&S/Analyze, Recommend
FC, Flt Test? No/Yes, COMBAT CAPABILITY. A
quick-reaction process must have validated models
and tools ready when the need arises.)
  • Future Goals for Aerodynamics Software
  • Provide first model quicker, better, cheaper
  • Continuously improve models through collaboration
  • Speed response to warfighter with less testing

15
F-15E/CBU104-WCMD Separations Support
Flight test store separation comparison
Stores Certification Issues: The New Way
16
Aerosol Modeling of Dust Storms in Iraq
  • The Naval Research Lab group at Monterey, CA, has
    recently added an aerosol modeling capability to
    the Coupled Ocean/Atmosphere Prediction System
    (COAMPS), which was used to model dust storms in
    Iraq. COAMPS predicted major dust events during
    OIF in a research-operational mode with focus on
    Iraq and the Gulf; dust forecasts were used in the
    Navy's FNMOC daily weather discussion.
  • This capability was also used to forecast Iraqi
    oil smoke during Operation Iraqi Freedom (OIF),
    with source and emission updated at analysis
    time in semi-operational runs.

Dust sources in the Iraq area were identified at
1-km resolution based on various information and
used in the operations for SW Asia in Spring 2003.
17
Computational Simulations of Combustion Chamber
Dynamics and Hypergolic Gel Propellant Chemistry
for Selectable Engines in Next Generation Guided
Missiles
Emerging Army missile systems require on-demand
thrust selection for loiter and precision kill
missions. Simulations in support of these rocket
engine designs, including a new family of
hypergolic gel propellants, are achievable using
the proper combination of CFD, CCM, and HPCs.
M. Nusca and M. McQuaid, ARL, Aberdeen Proving
Ground, MD; Sponsor: Army
18
ARL CFD Simulation of the AMRDEC Vortex Engine
The ISVE Gel Bi-Propellant Rocket is designed to
be throttled. Shown below is an actual throttle
test. Shown to the right are results of a CFD
simulation for a notional throttle program.
Note hypergolic fuel re-light (i.e., diffusion
and then regeneration of combustion product OH).
19
Modeling Physics of Blast/Structure Interaction
for Anti-Terrorism
20
Pentagon Interior Damage
Typical 3rd-floor views: not renovated, 90 m north
of impact; renovated with ERDC technology, 17 m
north of impact
21
U.S. Embassy in Israel
Blast-resistant column wrap and wall retrofits
22
Joint Common Missile Program: Simulation-Based
Acquisition
  • The Joint Common Missile (JCM) system is the next
    generation air-to-ground missile that will be
    carried on rotary- and fixed-wing platforms of
    the U.S. Armed Forces.
  • The Army, Navy, and Marine Corps are expected to
    procure up to 54,000 JCM rounds to replace the
    Longbow/Hellfire missiles on the Apache, Cobra,
    and Strikehawk helicopters and the Maverick
    missile on the F/A-18 Hornet jet fighter, at a
    total contract lifetime value of approximately $5
    billion.
  • Advanced simulation and HPC played a critical
    role in the source selection process for the JCM.

23
Directed High Power RF Energy: Foundation of
Next Generation Air Force Weapons
K. Cartwright, AFRL/DEHE, Kirtland AFB, NM;
Sponsor: Air Force
Given the importance of electronics in the modern
battlefield, the ability to target these
electronics will give the US a distinct advantage
against technically advanced adversaries.
MILO modeled in ICEPIC, showing the generation of
RF fields that are radiated from a Vlasov antenna
24
Computational Chemistry Models Leading to
Mediation of Gun Tube Erosion
C. Chabalowski, M. Hurley, and E. Byrd, ARL,
Aberdeen Proving Ground, MD; D. Sorescu, DOE/NETL,
Pittsburgh, PA; R. Ellis, NSWC, Dahlgren, VA; and
D. Thompson, Oklahoma State University,
Stillwater, OK; Sponsor: Army
  • Determine chemical mechanisms for the erosion of
    gun steel resulting from propellant combustion
    gases interacting with the steel, and suggest new
    propellant formulations to defeat erosion
  • Understanding the causes of gun tube erosion
    requires massive supercomputing resources
  • First principles quantum mechanical methods and
    classical molecular dynamics simulations will be
    used to determine the chemical and structural
    changes in the steel, and predict the
    thermodynamic and kinetic characteristics of the
    chemical reactions responsible for these changes

Computer models of a carbon dioxide molecule
dissociating on a nitrided iron (100) surface
The Army's Future Combat System and the Navy's
Advanced Gun System would benefit from this study
by using a new generation of propellants with
high flame temperatures.
25
Computational Support for Chemically Reactive
Flows and Non-ideal Explosives
  • Perform complex, 3-D reactive chemistry
    calculations using the current models implemented
    in SHAMRC to support the thermobarics program
  • Perform first-principles, coupled 3-D
    CFD/chemistry calculations using SHAMRC and
    CHEMKIN to augment the chemical kinetics
    capability currently implemented in SHAMRC
  • Current 3-D calculations require 50 to 500
    million zones, tens of thousands of hours of
    computer time, and tens to hundreds of GBytes of
    storage per calculation (see the storage sketch
    below)

J. Crepeau, ARA, Albuquerque, NM; Sponsor: DTRA
Fireball from a non-ideal explosive detonated in
a concrete structure
Implement and utilize a powerful computational
tool using HPC resources for the rapid evaluation
and development of efficient munitions for the
DoD
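
A rough cross-check of that storage figure, as a minimal
back-of-the-envelope sketch in Python (the 8-byte doubles and
the ~25 stored variables per zone are illustrative assumptions,
not SHAMRC specifics):

# Back-of-the-envelope storage estimate for a 3-D reactive-flow run.
# Assumptions (illustrative, not SHAMRC specifics): 8-byte doubles
# and ~25 stored variables per zone (density, momentum, energy,
# species mass fractions, ...).
BYTES_PER_VALUE = 8        # double precision
VARS_PER_ZONE = 25         # assumed variable count per zone

def storage_gbytes(zones: float) -> float:
    """Approximate snapshot size in GBytes for a given zone count."""
    return zones * VARS_PER_ZONE * BYTES_PER_VALUE / 1e9

for zones in (50e6, 500e6):
    print(f"{zones:.0e} zones -> ~{storage_gbytes(zones):.0f} GBytes")
# ~10 GBytes at 50 million zones, ~100 GBytes at 500 million zones,
# consistent with the "tens to hundreds of GBytes" quoted above.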
26
High Fidelity Analysis of UAVs Using Nonlinear
Fluid/Structure Simulation
R. Melville and M. Visbal, AFRL/VA, WPAFB, OH;
Sponsor: Air Force
  • Highly accurate simulation of UAV systems with
    nonlinear flow and flexible vehicles
  • UAV performance is limited by nonlinear flow
    features like shocks, separation, and turbulence.
    Overset LES solvers are used to simulate the
    correct flow physics.
  • Flexible flight vehicles respond to dynamic
    fluid loads. Nonlinear structural models are
    included to represent the elastic motion.

Identify critical flight conditions, explore
nonlinear response, improve performance of future
UAV systems
27
Time-Accurate Aerodynamics Modeling of Synthetic
Jets for Projectile Control
J. Sahu, ARL, Aberdeen Proving Ground, MD;
S. Chakravarthy, Metcomp Technologies, Westlake
Village, CA; and S. Viken, NASA Langley,
Hampton, VA; Sponsor: Army
  • Predict and characterize, by time-accurate CFD
    computations, the unsteady nature of the synthetic
    jet interaction flow field produced on spinning
    projectiles for infantry operations
  • Modeling of multiple azimuthally-placed MEMS
    synthetic jets requires tremendous grid
    resolution coupled with large demands of special
    boundary conditions and hybrid RANS/LES CFD
    approach
  • Full-scale, second-order, time-accurate CFD
    simulations are performed with turbulence modeled
    using RANS and hybrid RANS/LES approaches.
    ZNSFLOW, CFD++, and NASA CFD codes will also be
    used.

Provide a new guidance law for projectiles, with
large savings in total required impulse, and the
required definition of forces and moments.
28
Towards Predicting Scenarios of Environmental
Arctic Change (TOPSEARCH)
  • Model the coupled ice-ocean Arctic environment at
    increasingly high resolutions to predict short-
    to long-term Arctic Sea Ice and Ocean conditions
  • Use PIPS and POPS to run multi-decade predictions
    of environmental response to prescribed realistic
    atmospheric forcing
  • Develop an eddy-resolving model of the Arctic
    Ocean that properly represents the smallest
    spatial scale of oceanic instabilities

W. Maslowski, NPS, Monterey, CA; Sponsors: NSF,
ONR
Snapshots of (a) sea ice area (%) and drift
(m/s), (b) divergence (1e-3/s), (c) shear
(1e-3/s), and (d) vorticity (1e-3/s) for August
01, 1979, from a stand-alone PIPS 3.0 model spinup
Predictions of scenarios of environmental Arctic
change may aid in planning future Naval
operations in that region
30
Resource Management: Integrated
Requirements/Allocation/Utilization Process
  • Requirements Process
  • Bottom-up survey
  • Includes only approved, funded S&T/T&E projects
  • Reviewed and validated by S&T/T&E executives

(Diagram: requirements data feeds both operations
decisions and acquisition decisions; an initial
request for allocation generates feedback to help
quantify requirements; utilization feedback
supports management oversight and further
allocation.)
31
Requirements Analysis Process
  • Objective
  • Ensure accurate HPC requirements are documented
    in a timely manner to impact program planning
    decisions
  • Conduct a thorough and rigorous annual
    requirements analysis process to ensure accuracy
  • Process
  • Questionnaire - probes all aspects of HPC
    requirements
  • Interviews - allow face-to-face clarification of
    detailed requirements
  • Service validation - ensures that only
    approved/funded projects are included
  • Requirements analysis database - detailed profile
    of user base and its requirements

32
HPCMP Resource Allocation Policy: Capacity
Allocations
  • Goal: ensure that DoD's most important
    computational work gets done in a timely manner
  • 75% of total computational resources allocated by
    individual Services/Agencies
  • 25% of total computational resources allocated to
    DoD Challenge Projects
  • Represent DoD's very large, highest-priority
    projects able to efficiently use the largest
    systems
  • Approximate minimum threshold of 25 GF-yrs per
    year
  • Targeted for outstanding turnaround time
  • May require resources across multiple
    systems/centers (the 75/25 split is sketched
    below)

Focus on Capacity → Capacity Allocations
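
To make the capacity split concrete, a minimal Python sketch of
the allocation arithmetic (the 75%/25% split is from this slide;
the nominal 30/30/30/10 Service/Agency fractions come from the
FY 2005 allocations slide later in the deck; the total is an
illustrative placeholder, not a program figure):

# Capacity-allocation split: 75% allocated by Services/Agencies,
# 25% to DoD Challenge Projects. Service/Agency shares use the
# nominal 30/30/30/10 fractions. TOTAL_TF_YEARS is illustrative.
TOTAL_TF_YEARS = 100.0
SERVICE_FRACTIONS = {"Army": 0.30, "Navy": 0.30,
                     "Air Force": 0.30, "Agencies": 0.10}

challenge_pool = 0.25 * TOTAL_TF_YEARS   # DoD Challenge Projects
service_pool = 0.75 * TOTAL_TF_YEARS     # Service/Agency allocations

for service, frac in SERVICE_FRACTIONS.items():
    print(f"{service}: {service_pool * frac:.1f} TF-years")
print(f"Challenge Projects: {challenge_pool:.1f} TF-years")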
33
HPCMP Resource Allocation Policy: Capability
Allocations
  • How:
  • New systems are generally
  • Introduced a few months before the end of the
    current fiscal year without formal allocation
  • World-class systems (usually in top five of
    their class)
  • Dedicate large new systems to short-term, massive
    computations that generally cannot be addressed
    under normal shared resource operations
  • HPCMP issues a call for Capability Application
    Project (CAP) proposals
  • Capability Application Projects are implemented
    between July and December on large new systems
    each year
  • Proposals are required to show that the
    application efficiently uses on the order of
    1,000 processors or more and would solve a very
    difficult, important short-term computational
    problem

Goal: Support the top capability work
34
FY 2005 Resource Allocations
  • All HPC systems at MSRCs and selected DCs are
    allocated
  • Service/Agency allocations
  • The nominal allocation fraction (30/30/30/10) is
    provided to a Service/Agency only if it has
    sufficient validated requirements for a
    particular system
  • Six priority classes of computational work on
    allocated systems (see the sketch below)
  • Urgent (U) - unforeseen time-critical jobs
  • Debug (D)
  • High-Priority (H) - time-critical jobs known in
    advance
  • Standard (S)
  • Background (B)
  • Capability Applications (Z)
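
A minimal sketch of how a batch scheduler might order jobs by
these six classes (the class letters and their order are from
the list above; the numeric weights and the job records are
assumptions for illustration only):

# Rank queued jobs by HPCMP priority class, in the order the
# classes are listed on the slide. Weights and jobs are assumed.
PRIORITY = {"U": 0, "D": 1, "H": 2, "S": 3, "B": 4, "Z": 5}

jobs = [
    {"id": "cfd-042", "cls": "S"},   # standard production run
    {"id": "storm-7", "cls": "U"},   # unforeseen time-critical job
    {"id": "dbg-001", "cls": "D"},   # short debug job
]

for job in sorted(jobs, key=lambda j: PRIORITY[j["cls"]]):
    print(job["id"], job["cls"])     # storm-7, dbg-001, cfd-042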

35
Service/Agency Approval Authorities (S/AAAs)
  • Allocate Service/Agency-controlled CPU hours on
    HPCMP systems
  • Provide day-to-day linkage among customer
    organizations, the HPCMP, and other S/AAAs
  • Ensure Project Leaders and authorized users are
    performing work in support of DoD
  • Implement requirements surveys, resource
    allocation, resource monitoring, and resource
    reallocation
  • Provide guidance to users on which HPC assets are
    appropriate for their projects
  • Trade allocations with other S/AAAs

36
DoD S&T and T&E Growth in Computational Capability
37
HPCMP Center Resources
38
Major Shared Resource Centers (MSRCs)
  • Major Shared Resource Centers provide
  • Complete networked HPC environments
  • World-class HPC compute engines
  • High-end scientific visualization
  • Massive hierarchical storage
  • Proactive and in-depth user support/computational
    technology area expertise
  • to nation-wide user communities

39
Major Shared Resource Centers
  • Aeronautical Systems Center (ASC),
    Wright-Patterson AFB, OH
  • Army Research Laboratory (ARL), Aberdeen Proving
    Ground, MD
  • Engineer Research and Development Center (ERDC)
    at Waterways Experiment Station, Vicksburg, MS
  • Naval Oceanographic Office (NAVO), Stennis Space
    Center, MS

40
HPCMP Systems (MSRCs)
(Key: FY 01 and earlier; FY 02; FY 03; FY 04;
FY 05; Retired; Downgraded)
As of March 2005
41
Allocated Distributed Centers and Dedicated HPC
Project Investments
  • Allocated Distributed Centers (ADCs)
  • "Allocated" associates the annual Service/Agency
    project allocation and/or challenge project
    allocation process with these centers
  • Dedicated HPC Project Investments (DPs)
  • "Dedicated" associates the dedicated project or
    projects for which each independent DP was
    awarded with their respective centers

42
HPCMP Systems (ADCs)
(Key: FY 01 and earlier; FY 02; FY 03; FY 04
upgrades; FY 05; Retired; Downgraded)
As of March 2005
43
Alternative to Limited Access Centers: Open
Research Support
  • Systems at ARSC operate in an Open Research
    mode of operation
  • Eliminates the requirement for users of those
    systems to have NACs
  • Customers must certify that their work is
    unclassified and non-sensitive (e.g., open
    literature, basic research)
  • All other operational policies apply, e.g., all
    users of HPCMP resources must be valid DoD users
    assigned to a DoD computational project
  • Consistent with Uniform Use-Access Policy
  • Certification statement and process available on
    the HPCMP web site

Open Research Systems provide the opportunity for
faster access by graduate students
44
Dedicated HPC Project Investments
  • Support Service/Agency prioritized project
    science needs
  • Provide a specialized HPC capability to a local
    component of the HPC community
  • Located where there is a significant reason for
    having a local HPC system
  • Cultivate existing local infrastructure and
    expertise
  • Support a specific project/mission with a local
    real-time capability, special operational
    considerations, technology investigations, or
    other special access needs

45
Dedicated HPC Project Investments
  • Real-Time Data Warehousing, On-Line Analytical
    Processing, and Data Mining Technologies, ATC,
    Aberdeen Proving Ground, MD
  • HPC Modernization for Space-based Radar, AFRL/IF,
    Rome, NY
  • Applied Computational Fluid Dynamics (CFD) in
    Support of Aircraft-Store Certification, AFSEO,
    Eglin AFB, FL
  • Joint Operational Test Bed for the Weather
    Research and Forecast (WRF) Modeling Framework,
    AFWA, Offutt AFB, NE
  • Integrated Test & Evaluation / Flight Dynamics
    Test Support, AEDC, Arnold AFB, TN
  • Joint Operational Test Bed for the Weather
    Research and Forecast (WRF) Modeling Framework,
    FNMOC, Monterey, CA
  • Distributed Continuous Experimentation
    Environment, JFCOM/J9, Suffolk, VA

46
Dedicated HPC Project Investments
  • Maui Space Surveillance System Advanced Image
    Reconstruction, MHPCC, Kihei, HI
  • Joint Strike Fighter Project, NAWCAD/SIMAF,
    Patuxent River, MD
  • Concurrent Computation and Visualization
    Environment (CCVE), NSWCCD, West Bethesda, MD
  • Torpedo Hardware-In-The-Loop (HWIL) Modeling &
    Simulation (M&S), NUWC, Newport, RI
  • Advanced Dynamic Scene Projection, RTTC,
    Huntsville, AL
  • Virtual Electronic Battlefield (VEB), SSCSD, San
    Diego, CA
  • Real-Time Multi-Frame Blind Deconvolution, WSMR,
    White Sands Missile Range, NM

47
The Defense Research & Engineering Network (DREN)
is chartered to provide wide area network (WAN)
services for the DoD Science & Technology, Test &
Evaluation, and Missile Defense Agency High
Performance Computing (HPC) communities.
  • Phase I (FY94–98)
  • Interim Defense Research & Engineering Network
    (IDREN)
  • Goal: Quickly provide basic WAN services for the
    HPC community
  • Created by linking Army, Air Force, Navy, and the
    Defense Special Weapons Agency (DSWA) sites
  • Government owned/operated
  • Phase II (FY97–02)
  • Defense Research & Engineering Network (DREN)
  • Goal: Provide long-term, high-performance (DS-3
    through OC-48) WAN services
  • Uses a commercial service contract awarded to
    AT&T
  • No government ownership/operation
  • Phase III (FY02–11)
  • Defense Research & Engineering Network (DREN)
  • Goal: Provide long-term, high-performance (DS-3
    through OC-768) WAN services (line rates are
    tabulated below)
  • Secure, virtual private network over commercial
    grid
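
For reference, the nominal line rates behind the service names
above, as a small Python lookup (DS-3 and SONET OC-n rates are
standard figures, OC-n = n x 51.84 Mbit/s; this table is added
for context, not taken from the slide):

# Nominal line rates for the DREN service tiers named above.
# DS-3 and SONET OC-n rates are standard: OC-n = n * 51.84 Mbit/s.
LINE_RATE_MBPS = {
    "DS-3": 44.736,
    "OC-3": 3 * 51.84,      # 155.52
    "OC-12": 12 * 51.84,    # 622.08
    "OC-48": 48 * 51.84,    # 2488.32  (Phase II upper tier)
    "OC-192": 192 * 51.84,  # 9953.28
    "OC-768": 768 * 51.84,  # 39813.12 (Phase III upper tier)
}

for name, mbps in LINE_RATE_MBPS.items():
    print(f"{name}: {mbps:,.2f} Mbit/s")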

48
Defense Research & Engineering Network (DREN)
Overview
  • 10-year service contract for connectivity
  • Bandwidth from DS-3 to OC-768
  • SDP "plug on the wall" concept for users
  • IPsec encryption on most point-to-point
    connections
  • DREN contract managed by DITCO
  • Vendor management of core network and customer
    support center (DREN NOC)
  • Government monitoring by HPC CERT
  • MPLS and IPsec tunnels with stringent latency,
    packet-loss, and throughput requirements
  • High-speed interfaces including GigE and 10GigE
  • Native IPv6 and multicast

49
DREN IPv6 Pilot Implementation
  • DoD Chief Information Officer set Department-wide
    Internet Protocol version 6 (IPv6) policy, June
    2003
  • DREN is the only designated network to provide
    early IPv6 support
  • IPv6 connectivity across DREN and peer networks
    is available to any DREN user site now, but only
    upon request (a reachability sketch follows below)
  • All MSRCs, Allocated DCs, and the DREN NOC will
    provide connectivity and support
  • For more information:
  • https://kb.v6.dren.net web site (DoD CAC access
    controlled)
  • IPv6-pilot-team@hpcmo.hpc.mil
  • Handout from HPC UGC 2004 available at this
    meeting
  • See the HPCMP web site under DREN ("What is the
    DREN IPv6 pilot?")

Defense Research & Engineering Network
(DREN) High Performance Network (IPv6)
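
A minimal sketch of checking whether a host is reachable over
IPv6, as a pilot site might do during turn-up (the hostname is
a placeholder, not a DREN endpoint; standard-library Python):

# Quick IPv6-only TCP reachability check. Substitute a host your
# site actually serves over IPv6 for the placeholder below.
import socket

def reachable_over_ipv6(host: str, port: int = 443,
                        timeout: float = 5.0) -> bool:
    """Try a TCP connection over IPv6 only; True on success."""
    try:
        for family, socktype, proto, _, addr in socket.getaddrinfo(
                host, port, socket.AF_INET6, socket.SOCK_STREAM):
            with socket.socket(family, socktype, proto) as s:
                s.settimeout(timeout)
                s.connect(addr)
                return True
    except OSError:
        pass
    return False

print(reachable_over_ipv6("example.com"))  # placeholder host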
51
SDP Architecture
  • Type A - IP
  • Interfaces
  • 10/100 Ethernet
  • 1/10 Gigabit Ethernet

Service Delivery Point (SDP)
  • Type B - ATM
  • Interfaces
  • OC-3, OC-12
  • Multimode and single mode

Commercial Grid
Sustained: 50% of line speed
Peak: 80% of line speed
SDP Equipment
  • Type C - Lightwave
  • Interfaces
  • TBD

Commercial Service Helps Build National
Information Infrastructure
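
Combining the SDP performance targets above with standard line
rates, a quick sketch of the sustained and peak throughput an
SDP should deliver (the 50%/80% figures are from this slide;
the line rates are the standard DS-3/SONET values):

# Per-SDP throughput targets: sustained 50% and peak 80% of line
# speed, per the slide. Line rates are standard DS-3/SONET values.
LINE_RATE_MBPS = {"DS-3": 44.736, "OC-12": 622.08, "OC-48": 2488.32}

for name, mbps in LINE_RATE_MBPS.items():
    print(f"{name}: sustained >= {0.50 * mbps:,.0f} Mbit/s, "
          f"peak >= {0.80 * mbps:,.0f} Mbit/s")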
52
DREN WAN Services
  • Types of Service
  • TYPE A: Internet Protocol (IP) (45 Mbps to
    multi-gigabit)
  • TYPE B: Asynchronous Transfer Mode (ATM)
    (fractional DS-3 to OC-48)
  • TYPE C: Lightwave

Speed of Service
53
DREN Security Concept
(Diagram: layered DREN security. Boundary
protection between DoD and the HPCMPO, with
INFOCON/NETCON and DoD ports and protocol
filtering; enclave protection around each HPCMP
HPC center; workstation protection for users at
each site; periodic evaluation of system/network
vulnerabilities.)
54
HPCMP Intrusion Detection Strategy
  • Network Intrusion Detection System (NIDS)
  • Hardware: >2 GHz multiprocessor PC (Linux)
  • Software: Joint Intrusion Detection Software
    (JIDS), SNORT, IWATCH (signature matching is
    illustrated below)
  • Upgrades developed to exploit multiprocessor
    capability and enable near-real-time monitoring
  • Support for OC-3 and OC-12 ATM interface cards
  • Gigabit Ethernet interface support
  • HPCMP CERT
  • Operated by ARL
  • Central management
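
As an illustration of what signature-based NIDS tools such as
SNORT do, a toy Python sketch that flags payloads matching a
known byte pattern (the signatures and packets are made up;
production systems use full rule languages and live capture):

# Toy signature-based intrusion detection: flag payloads that
# contain a known-bad byte pattern. Signatures/packets are made up.
SIGNATURES = {
    b"/etc/passwd": "path traversal attempt",
    b"\x90\x90\x90\x90": "possible NOP sled",
}

def inspect(payload: bytes) -> list:
    """Return alert descriptions for each signature in the payload."""
    return [desc for sig, desc in SIGNATURES.items() if sig in payload]

packets = [b"GET /index.html HTTP/1.0",
           b"GET /../../etc/passwd HTTP/1.0"]
for pkt in packets:
    for alert in inspect(pkt):
        print(f"ALERT: {alert} in {pkt!r}")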

55
HPCMP Intrusion Detection Strategy
  • Implementation at all HPC Centers to protect
    shared resources
  • Implementation at all peering (including NIPRNET
    Peering) locations to provide user site
    protection
  • Areas of interest

56
Software Applications Support
PET Partners
HPC Software Applications Institutes
  • Transfer of new technologies from universities
  • On-site support
  • Training

Fellows
  • Lasting impact on services
  • High value service programs

NDSEG
Interns
Software Protection
Growing Our Future
HPC Software Portfolios
  • Tightly integrated software
  • Address top DoD S&T and T&E problems
  • Assure software's intended use and user
  • Protect software through source insertion

57
(No Transcript)
58
(No Transcript)
59
HPC Computational Fellowships
  • Patterned after successful DOE fellowship program
  • National Defense Science and Engineering Graduate
    Fellowship Program (NDSEG) chosen as vehicle for
    execution of fellowships
  • HPCMP added as fellowship sponsor along with
    Army, Navy, and Air Force
  • Computer and computational sciences added as
    possible discipline
  • HPCMP is sponsoring 11 fellows for 2004 and
    similar numbers each following year
  • HPCMP fellows will be strongly encouraged to
    develop close ties with DoD laboratories or test
    centers, including summer research projects
  • HPCMP fellows will be strongly encouraged to
    participate in HPCMP Users Group Conference each
    year
  • Eleven HPCMP fellows plus eight adjunct HPCMP
    fellows were invited to the 2004 Users Group
    Conference; five participated
  • User organizations have responded to the
    DUSD(S&T) memo with fellowship POCs to select and
    interact with fellows

60
PET Interns and Summer Institutes
  • PET sponsored 25 students from 21 undergraduate
    institutions in the 2004 PET Summer Intern
    Program at ASC, ARL, ERDC, and NAVO. They worked
    on projects with mentors on priority DoD areas.
  • PET sponsored summer institutes, June–August, at
    U. of Hawaii, Central State Univ., and Jackson
    State Univ. The undergraduate students from
    these institutions were introduced to HPC and how
    it can be used to effectively determine solutions
    for today's large computational problems.
  • A primary purpose of the intern program and
    summer institutes is to foster a pipeline into
    DoD HPC-related employment.

61
Programming Environment Training (PET)
  • Mission: Enhance the productivity of the DoD HPC
    user community
  • Goals:
  • Transfer leading edge HPC (computational and
    computing) technology into DoD from other
    government, industrial, and academic HPC
    communities
  • Develop, train, and support new and existing DoD
    HPC users with the education, knowledge access,
    and tools to maximize productivity

62
PET Success Collaboration with Predator Project
The Predator program required survivability
(vulnerability) analysis through RCS data from
Xpatch. "The HPCMPO and PET involvement are
allowing a much higher fidelity and more timely
RF signature prediction for the Predator.
Without the support of the HPCMPO compute power,
it would take us months, if not years, to provide
such a detailed signature prediction to the
Predator Program Office. These results help the
Predator Program Office ensure that the Predator
will be properly employed in operational support
of the warfighter." (Richard Graeff, ASC/HPMT,
February 19, 2004)
UNCLASSIFIED (PUBLIC DOMAIN) DATA
63
PET Team Members
64
Software Protection
  • Background: With the loss of effective export
    controls on high performance computers, DoD must
    focus on protections for national security
    applications software
  • Approach:
  • Software Protection Center: develop software
    protection technologies; support insertion of
    software protections into applications codes;
    work with Services and Agencies to improve the
    effectiveness of export control regulations for
    software; serve as an information center for best
    practices, guidelines, etc.
  • Red Team: test and validate protection
    technologies and implementations

Objective: Radar signature of a tank
Objective: Analyze air flow over an F-18
65
Opportunity
  • Establish and run a comprehensive Supercomputing
    and Computational Science User Advocacy Group
  • Need:
  • The national high performance computing and
    computational science community needs a forum
    with broad representation from academia,
    government, and industry, including both
    industrial partners using HPC and those
    producing HPC products, for the purpose of freely
    exchanging and debating technical issues in HPC
    and computational science. A major part of the
    discussion and debate is anticipated to focus on
    technical issues in HPC and computational science
    that inhibit the effective use of supercomputers
    by users. Information sharing in specific
    technical fields of importance to multiple
    organizations is a key benefit. Once major
    issues are determined by the user organization,
    one of its prime functions is to make those
    issues known to key decision makers.

66
HPC Modernization Program: HPC is Essential to
Future Technology Development