Transcript and Presenter's Notes

Title: Semi-Automated Simulation Transformation for DDDAS


1
Semi-Automated Simulation Transformation for DDDAS
  • David Brogan, Paul Reynolds, Robert Bartholet,
    Joseph Carnahan, Yannick Loitière
  • Computer Science Department
  • University of Virginia, USA

2
Simulations used for DDDAS
  • Carefully crafted for specific instances
  • Data mining and processing
  • Prediction
  • User insight generation
  • Control
  • Verification and validation

3
Cross-cutting needs
  • DDDAS is advancing in many ways (Darema 04)
  • Applications
  • Mathematical algorithms
  • Systems software
  • Measurements and sensors
  • Simulation software technology must keep pace and
    facilitate reuse

4
MaSTRI at UVa
  • Modeling and Simulation Technology Research
    Initiative
  • Improving the foundations of simulation
    technology for cross-cutting research
  • Aerospace engineering (combustion modeling)
  • High-energy physics (quark-gluon plasma)
  • Neuroscience (multi-scale model of hippocampus)
  • and your DDDAS research projects

5
My favorite simulations
  • Rigid-body simulations

6
So many characters
7
So many applications
8
Simulation reuse for DDDAS requires accommodating
change
  • Design time
  • Capturing and encoding designer insight
  • Composition time
  • What existing simulation components satisfy goals
  • Runtime
  • Steering a simulation while preserving stability
  • Status of our 2004 ITR

9
Addressing DDDAS at Design Time
  • Capturing and encoding the nature of unexpected
    conditions (not specifics)
  • Stipulate when simulation can/cannot be used
  • State when/how it can be changed
  • Critical dependencies
  • Formalization feasible because of common
    simulation features

10
Common simulation features
  • 1) Built upon simplifying assumptions
  • Frequently hidden
  • Small changes can invalidate simulation
  • Examples
  • Bounding the space and time of the simulation
  • Selecting equations to represent phenomena
  • Knowing 4th-order Runge-Kutta is good enough

Spiegel et al. submitted to Winter Simulation
Conference 2005
11
Flexible points
  • A language extension to capture designer's
    insight (sketched below)
  • Identifies simulation features
  • Specifies how each feature can be changed
  • Predicts impact of changing each feature on
    simulation behavior
  • Flexible point alternatives can be explored
    automatically at runtime
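The flexible-point idea can be pictured with a small sketch. The Python below is purely illustrative: FlexiblePoint, its fields, and the integrator example are assumed names, not the actual MaSTRI language extension.

```python
# Hypothetical sketch of a flexible-point declaration; class and field
# names are illustrative, not the actual MaSTRI language extension.
from dataclasses import dataclass, field

@dataclass
class FlexiblePoint:
    name: str                  # simulation feature exposed by the designer
    default: object            # value the simulation shipped with
    alternatives: list = field(default_factory=list)  # admissible substitutions
    impact: str = ""           # designer's prediction of behavioral impact

integrator = FlexiblePoint(
    name="ode_integrator",
    default="rk4",
    alternatives=["euler", "rk2", "rk4", "adaptive_rk45"],
    impact="lower-order integrators trade accuracy for speed; "
           "stability may be lost for stiff systems",
)

# A runtime search can enumerate the designer-approved alternatives
# automatically instead of editing simulation source code.
for alt in integrator.alternatives:
    print(f"candidate binding: {integrator.name} = {alt}")
```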

12
Common simulation features
  • 2) Organized around events
  • Define set of possible events
  • Necessary conditions for event generation
  • Handling an event's effects

13
Event-generating systems
  • Transform code into event model
  • Abstracts events from simulation code
  • Facilitates reasoning about event-based behavior
  • Defining simulation in terms of event generations
    makes opportunities for transformation clear
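As a rough illustration of this separation (a minimal Python sketch; EventModel, Event, and the arrival example are assumed names, not the authors' transformation tool):

```python
# Minimal sketch of an event abstraction layer: the set of possible events,
# their scheduling, and their handlers live apart from the simulation code.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Event:
    time: float                                        # ordering key
    kind: str = field(compare=False)                    # event type
    payload: dict = field(compare=False, default_factory=dict)

class EventModel:
    def __init__(self):
        self.calendar = []       # pending events, ordered by time
        self.handlers = {}       # event kind -> handler function

    def on(self, kind, handler):
        self.handlers[kind] = handler

    def schedule(self, event):
        heapq.heappush(self.calendar, event)

    def run(self, until):
        while self.calendar and self.calendar[0].time <= until:
            ev = heapq.heappop(self.calendar)
            self.handlers[ev.kind](ev)   # handling is decoupled from generation

model = EventModel()
model.on("arrival", lambda ev: print(f"t={ev.time}: arrival {ev.payload}"))
model.schedule(Event(1.0, "arrival", {"id": 7}))
model.run(until=10.0)
```

Reasoning about which events can be generated, reordered, or suppressed then happens against the event model rather than the simulation internals.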

14
Common simulation features
  • 3) Time management
  • Event scheduling and ordering
  • Discrete, continuous, or both
  • Tied to event generation - deciding when the new
    event will occur

15
Interval timelines
  • Emphasize a simulation's temporal aspects
  • Abstracts timing from simulation code
  • Use event calendars from discrete event
    simulation research
  • Temporal inferences and assertions about
    observable behavior
  • Simulation transformation accomplished through
    merges and shifts of intervals and timelines
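A small sketch of the two transformation primitives named above, merging and shifting, assuming a plain list-of-(start, end) representation for a timeline rather than the authors' formalism:

```python
# Illustrative interval-timeline operations; the representation is assumed.
def shift(timeline, dt):
    """Shift every (start, end) interval on a timeline by dt."""
    return [(s + dt, e + dt) for (s, e) in timeline]

def merge(a, b):
    """Merge two timelines, coalescing overlapping intervals."""
    merged = []
    for s, e in sorted(a + b):
        if merged and s <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], e))
        else:
            merged.append((s, e))
    return merged

sensor_updates = [(0.0, 2.0), (5.0, 6.0)]
model_events = [(1.5, 4.0)]
print(merge(sensor_updates, shift(model_events, 0.5)))  # [(0.0, 4.5), (5.0, 6.0)]
```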

16
Application to DDDAS
  • Automatic simulation transformation is difficult
  • Too many choices to search through
  • Information about flexible points can guide
    future searches
  • Focusing on necessary events narrows the search
  • Not enough information about how transformation
    will affect behavior
  • Design-time knowledge and temporal models

Carnahan et al. submitted to Winter Simulation
Conference 2005
17
Addressing DDDAS at Composition Time
Component Selection (CS): Is there a subset of X of
cardinality k or less that covers R?
[Figure: example CS instance with k = 3, showing a component
set X and a requirement set R = {r1, ..., r8}]
CS is NP-complete. Proof: reduction from SAT (Page and Opper
1999) and MSC (Petty et al. 2003). We proved CS can be
approximated using GREEDY.
Fox et al. Winter Simulation Conference 2004
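For reference, GREEDY here is the standard greedy heuristic for set cover; a minimal Python sketch with made-up components and requirements:

```python
# Greedy approximation for Component Selection viewed as set cover:
# repeatedly pick the component that covers the most unsatisfied requirements.
# The component/requirement data below is made up for illustration.
def greedy_select(components, requirements):
    uncovered = set(requirements)
    chosen = []
    while uncovered:
        best = max(components, key=lambda c: len(components[c] & uncovered))
        if not components[best] & uncovered:
            break                       # remaining requirements are uncoverable
        chosen.append(best)
        uncovered -= components[best]
    return chosen, uncovered

components = {
    "c1": {"r1", "r2", "r3"},
    "c2": {"r3", "r4", "r5", "r6"},
    "c3": {"r6", "r7"},
    "c4": {"r7", "r8"},
}
picked, unmet = greedy_select(components, {f"r{i}" for i in range(1, 9)})
print(picked, unmet)   # ['c2', 'c1', 'c4'] set()
```

The standard analysis of greedy set cover gives a logarithmic approximation guarantee, which is what makes this kind of heuristic attractive despite the NP-completeness of CS.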
18
Composability Assumptions
DECREE: Component selection in the context of
simulation composability is inflexible
  • Components are immutable
  • There exists a master set of components from
    which all possible sets of requirements can be
    satisfied
  • Requirements are known a priori and do not change

19
Applied Simulation Component Reuse (ASCR)
In selecting components, we now have to
account for the cost or utility of adapting a
component!
  • Critical New Assumption: Any component can be
    adapted to satisfy any requirement
  • What does this buy us?
  • We no longer have to assume the existence of a
    master set of components.
  • We can more flexibly react to changing
    requirements.
  • But

20
ASCR
  • A formal model and analysis of component
    selection with transformable components
  • What additional requirements could be satisfied?
  • What is the value of approximate satisfaction?
  • What work is required for transformation?
  • What is lost?
  • Conflicts with other requirements
  • Conflicts with other components
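A toy sketch of what changes once adaptation enters the picture: selection weighs a per-pair adaptation cost rather than a fixed covers/does-not-cover relation. The costs and the cheapest_assignment helper below are hypothetical, and the sketch ignores the conflicts and component-count concerns listed above, which the real model must handle.

```python
# Hypothetical ASCR-style selection: every (component, requirement) pair has
# an assumed adaptation cost (0.0 = satisfied as-is); pick the cheapest
# component to adapt for each requirement. Illustration only.
ADAPT_COST = {
    ("c1", "r1"): 0.0,
    ("c1", "r2"): 2.5,
    ("c2", "r1"): 1.0,
    ("c2", "r2"): 0.0,
}

def cheapest_assignment(components, requirements, cost):
    plan, total = {}, 0.0
    for r in requirements:
        best = min(components, key=lambda c: cost.get((c, r), float("inf")))
        plan[r] = best
        total += cost.get((best, r), float("inf"))
    return plan, total

plan, total = cheapest_assignment(["c1", "c2"], ["r1", "r2"], ADAPT_COST)
print(plan, total)   # {'r1': 'c1', 'r2': 'c2'} 0.0
```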

21
Results
  • Proven: ASCR brings flexibility to component
    selection, but the problem remains intractable
  • Ongoing work
  • Discovering scalable methods, algorithms, and
    heuristics for component selection
  • Encoding adaptability into the component

Bartholet et al. submitted to Winter Simulation
Conference 2005
22
Addressing DDDAS at Runtime
  • Data-driven simulation steering
  • How will high-dimensional data be handled?
  • Sparse samples
  • Difficult to characterize
  • How will entire simulation adapt to new data?
  • Must transformation be smooth?
  • Instantaneous injection of new data can disrupt
    simulation (invalidate high-order variables)
  • Are there constraints?

23
Applications
  • Replace (parts of) simulation with lower-order,
    data-driven model
  • Provide insight into the parameter-space structure
  • e.g., tuning chemical reaction models
  • Simulation state interpolation
  • Steer towards desired simulation state by
    traversing between previous state observations

24
The curse of dimensionality
  • High-dimensional state spaces are difficult for
    users
  • Hard to visualize
  • Unintuitive to manipulate
  • Difficult for algorithms
  • Combinatorial explosion
  • Sparse coverage
  • Parameter interdependence

25
Dimensionality reduction
  • Necessary to make problems tractable
  • Build a low-dimensional model
  • Must decide which dimensions to eliminate
  • Ideally, a new vector basis would be defined
  • May require non-linearity

26
Addressing sparsity/non-linearity
  • Rely on local state-space properties
  • Self-Organizing Maps (SOM)
  • Locally Linear Embedding (LLE)
  • Work in map space rather than in state space
  • Transformation simplifies interaction and
    automation
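As a generic illustration of working in map space rather than state space (not the authors' pipeline; the random data is a stand-in for sampled simulation states), scikit-learn's Locally Linear Embedding can project high-dimensional states onto a 2-D map:

```python
# Minimal sketch: embed sampled high-dimensional simulation states into a
# 2-D map with Locally Linear Embedding, then interact in map space.
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(0)
states = rng.normal(size=(500, 20))      # 500 sampled states, 20-D state space

lle = LocallyLinearEmbedding(n_neighbors=10, n_components=2)
map_coords = lle.fit_transform(states)   # low-dimensional "map space" coordinates

print(map_coords.shape)                  # (500, 2)
```

A SOM plays a similar role but yields a discrete grid of prototype vectors, which is the representation used in the following slides.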

27
Using SOMs
  • Shape synthesis simulation
  • f(x) = i, where x = (x1, …, xn)
  • Observe many x-to-image pairs
  • Construct SOM from pairs
  • Example DDDAS challenge
  • Runtime data requires steering current state i_0
    to new state i_f over four discrete state changes
  • Avoid large steps
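A rough sketch of the steering idea, assuming the trained SOM is a grid of prototype vectors: find the best-matching units for the current and target states and take a few small hops between them in map space. The grid, states, and helper names here are all hypothetical.

```python
# Hypothetical sketch of steering over a trained SOM: hop from the current
# state's best-matching unit toward the target's in small map-space steps.
import numpy as np

rng = np.random.default_rng(1)
som = rng.normal(size=(8, 8, 20))          # 8x8 map of 20-D prototype vectors

def best_matching_unit(som, state):
    d = np.linalg.norm(som - state, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

def steering_path(start_cell, goal_cell, hops=4):
    """Return `hops` intermediate map cells stepping gradually toward the goal."""
    path = []
    for k in range(1, hops + 1):
        t = k / hops
        r = int(round(start_cell[0] + t * (goal_cell[0] - start_cell[0])))
        c = int(round(start_cell[1] + t * (goal_cell[1] - start_cell[1])))
        path.append((r, c))
    return path

i_0 = rng.normal(size=20)                  # current simulation state
i_f = rng.normal(size=20)                  # desired state implied by new data
start, goal = best_matching_unit(som, i_0), best_matching_unit(som, i_f)
print(steering_path(start, goal, hops=4))  # four discrete state changes
```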
28
Using SOMs
  • Previously trained SOM provides map of dead zones
    and sweet spots for simulation steering

29
Conclusions
  • Designing DDDAS-ready simulations that adapt to
    change
  • Language-level tools to capture insight
  • Dynamically composing semantically compatible
    simulations
  • Use data-driven models as means to accommodate
    runtime simulation steering

30
Acknowledgements
  • MaSTRI -- www.cs.virginia.edu/MaSTRI

31
Future work
  • Language tools
  • Applicability of model checking / static analysis
  • Automatic sensitivity analysis
  • Component Selection
  • Formalizing transformation utilities and costs
  • Simulation steering
  • Updating a trained SOM
  • Learning a simulation's inverse (output → input) mapping