Discrete Event Simulation Ch. 1
Slides: 24. Provided by: giampier.

Transcript and Presenter's Notes

1
  • Validity, Credibility and Details.

2
There is no way to guarantee that you have
modeled anything, and, if by some chance you
actually have, that your model has anything to do
with the original problem. Rather depressing,
isn't it? Fortunately, not everything is lost:
most model building ends with something useful,
and most efforts do capture some useful aspects
of the original problem. It is just that proving
you have done so is rather tricky, and, unless
you can prove it, there is no way you can feel
safe from nasty surprises (and they abound).
"Good judgement comes from experience. Experience
comes from bad judgement..."
3
Delivering an acceptable simulation model will
require at least three steps:
Verification: the process of determining whether
the conceptual simulation model has been
correctly translated into a properly functioning
computer program.
Validation: the process of determining whether a
simulation model is an accurate representation of
the system, for the particular objectives of the
study.
Establishment of Credibility: the process of
convincing yourself and your customer that you
have the "right" model giving the "right"
results.
4
All three processes are ongoing during the whole
development period, and at different levels (the
numbered steps refer to Sec. 1.7):

[Diagram: Real System -> Conceptual Model
(Analysis and Data, steps 1, 2, 3; checked by
Validation) -> Simulation Program (Programming,
step 4; checked by Verification) -> "Correct"
results available (Make model runs, steps 5-9;
checked by Validation) -> Results used in the
decision-making process (Sell results to
management, step 10; Establishment of
Credibility).]
5
Steps in a Simulation Study - a flowchart from
the text:
1. Formulate problem and plan the study
2. Collect data and define the model
3. Conceptual model valid? (No: return to step 2)
4. Construct a computer program and verify
5. Make pilot runs
6. Programmed model valid? (No: return to step 2)
7. Design experiments
8. Make production runs
9. Analyze output data
10. Document, present and use results
6
  • There might be a final Accreditation phase,
    which involves several actions and the meeting
    of several criteria:
  • The formal completion of the verification and
    validation phases.
  • Model development and use history - who used it,
    who built it, who tested it, who asked for
    similar models.
  • Quality of available data. This is more
    important than one might think, since most
    military applications will not use actual data
    during development: secrecy - in that world - is
    very important, and using actual data would
    require "development in a vault", with only
    cleared personnel involved.
  • Quality of the documentation. Very important if
    this model is expected to undergo further
    development and/or extensive use by inexperienced
    personnel.
  • Known problems and limitations.

7
Levels of Model Detail. Every model ignores a
large number of the aspects of the reality it
attempts to model. There is no way to avoid this,
and even the best efforts may not always succeed
in producing a model which is both sufficiently
accurate and useful. Example: some of the early
weather prediction models were reasonably
accurate, but they required about 48 hours of
computer time for a 24-hour forecast. Not useful,
except to make sure you were on the right track
with your understanding of weather models.
Current models are much more detailed, but the
intervening increase in computational power has
more than offset the increased level of detail.
The major problem (time) has solved itself - not
really, but all you need is an adequate weather
forecast, not a perfect one.
8
1) Define the specific issues to be investigated
and the measures of performance that will be used
for evaluation. Easy to say, but an understanding
of issues and measures may not be available until
several iterations of the model conceptualizing
process have occurred. Sometimes you may actually
need to build it. Sometimes the people whose
input you need are not available, or are actively
involved in sabotaging the project. Example: the
Environmental Protection Agency of the City of
New York (garbage collection, etc.) attempted to
change a number of work procedures and sent a
number of management analysts around the work
sites to study and recommend. They encountered
almost complete resistance and non-cooperation.
Solution: a new batch of young, pretty, female
analysts was sent around. Their approach to the
"garbagemen" was one of awe and helplessness, and
they got the job done in time and with no
problems.
9
2) The conceptual entity going through the model
can be less detailed than the entities going
through the system - a box of crackers instead of
individual crackers, a bottle of olive oil
instead of each individual molecule. If this is
not possible, the computational requirements may
overwhelm the simulation. 3) Use subject-matter
experts and sensitivity analysis. Make sure that
the subject-matter experts are experts; this is
often not easy to determine, especially in
situations where something new or unusual is
being modeled. Sensitivity analysis should allow
you to determine which factors are most
important. 4) The level of detail must be
appropriate: too detailed a model will result in
high costs, loss of interest on the part of
management, and an inability to match the model
with the available data. Too little detail will
result in the customer not recognizing the entity
you are expected to model.
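One-at-a-time sensitivity analysis can be sketched as follows. This is a minimal illustration, assuming a hypothetical single-server queue model in which `simulate` estimates the mean waiting time; all names are mine, not from the text:

```python
import random

def simulate(arrival_rate, service_rate, n=50000, seed=1):
    """Estimate mean waiting time in a single-server FIFO queue
    (Lindley's recursion) with exponential interarrival/service times."""
    rng = random.Random(seed)
    w, total = 0.0, 0.0
    for _ in range(n):
        total += w
        # Next customer's wait: previous wait + service - interarrival.
        w = max(0.0, w + rng.expovariate(service_rate)
                       - rng.expovariate(arrival_rate))
    return total / n

# Perturb each input parameter by +/-10% and watch the output move.
base = simulate(0.5, 1.0)
for name, lo, hi in [("arrival_rate", simulate(0.45, 1.0), simulate(0.55, 1.0)),
                     ("service_rate", simulate(0.5, 0.9), simulate(0.5, 1.1))]:
    print(f"{name}: -10% -> {lo:.2f}, base -> {base:.2f}, +10% -> {hi:.2f}")
```

If a 10% perturbation barely moves the output, that factor can often be modeled coarsely; large swings mark the parameters that deserve the most modeling effort and the best data.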
10
Verification Techniques. How do I determine
whether the program works? By "working", I mean
that it executes the model, not necessarily the
reality. 1) Modularized design, testing and
debugging. Start from a simplified model and add
complexity as the modules are verified. A complex
function can temporarily be replaced with a stub
that returns some acceptable values. 2)
Structured walk-throughs. Or any other debugging
techniques that involve others. Get several
people committed to reviewing the code. 3) Run
the simulator under several settings and
sanity-check the output. Compare behavior to any
actual behavioral data you possess.
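The stub idea in technique 1 can be sketched as follows; a minimal illustration with hypothetical names, not code from the text:

```python
def detailed_service_time(job):
    """The full, complex model - not yet written or not yet verified."""
    raise NotImplementedError

def stub_service_time(job):
    """Temporary stand-in that returns an acceptable constant value,
    so the rest of the program can be exercised and verified."""
    return 1.0

def total_processing_time(service_time, jobs):
    """The surrounding logic we want to verify independently of the stub."""
    return sum(service_time(job) for job in jobs)

# Verify the surrounding logic with the stub before plugging in the
# detailed function: five jobs at 1.0 each should total 5.0.
assert total_processing_time(stub_service_time, range(5)) == 5.0
```

Once the surrounding modules behave correctly with the stub, `detailed_service_time` can be dropped in and any new misbehavior localized to it.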
11
4) Trace the behavior. This is an execution
walk-through (rather than a code walk-through)
and requires either a good debugger or the
preparation of tracing facilities for your
program. One of the difficulties with a standard
debugger is that the amount of information you
have to evaluate (interactively or otherwise) is
much too large. 5) Run the model under
simplifying assumptions. There are situations
where a simple version of the model is
analytically tractable, so you can predict output
parameters. If the output of the simulation
matches the predicted values, you have some
further assurance of program correctness. Another
possibility is that the analytical predictions
are very attractive, but you cannot extend them
to the complex model you need. Build the model,
verify the simple cases and go for empirical
validation of your hope...
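Technique 5 can be illustrated with a single-server queue: under exponential interarrival and service times (an M/M/1 queue), the mean waiting time in queue is known analytically, Wq = lambda / (mu * (mu - lambda)), so a simplified run of the simulator can be checked against it. A minimal sketch with illustrative names:

```python
import random

def simulated_mean_wait(lam, mu, n=200000, seed=2):
    """Mean waiting time in queue via Lindley's recursion:
    W(k+1) = max(0, W(k) + S(k) - A(k+1))."""
    rng = random.Random(seed)
    w, total = 0.0, 0.0
    for _ in range(n):
        total += w
        w = max(0.0, w + rng.expovariate(mu) - rng.expovariate(lam))
    return total / n

lam, mu = 0.5, 1.0
analytic = lam / (mu * (mu - lam))   # = 1.0 for these rates
estimate = simulated_mean_wait(lam, mu)
print(f"analytic {analytic:.3f}  simulated {estimate:.3f}")
```

A close match on the tractable case does not validate the full model, but a mismatch here is a sure sign of a programming error.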
12
6) Use Animation. This could be as simple as
displaying the behavior of time-dependent
solutions of a system of differential equations
(e.g., the Lotka-Volterra ones in a previous
lecture), or as complex as what is presented in a
good interactive "shoot-'em-up" game, or more.
7) Check that your synthetic inputs satisfy known
behavioral requirements. 8) Use a simulation
package - if appropriate. These are all COMMON
SENSE ideas - there is no formal methodology that
will guarantee good results.
13
Some Disadvantages of Simulation. 1) Each run of
a stochastic simulation model produces only one
estimate of the output parameters for a given set
of input parameters. Multiple runs with each
parameter setting will be needed to be able to
claim some confidence about the quantities being
estimated. Analytic models - if possible - can
provide "true" values for the output parameters
with just one "run". 2) Simulation models are
often expensive and time-consuming to develop. 3)
The sheer volume of numbers generated by a
simulation study (and, possibly, the ability to
play "what if" at will) may engender an
unjustifiable reliance on the results of the
model, even when the model is no longer
applicable. (A diagnostic system once diagnosed a
twelve-year-old car with rust spots as a child
with measles.)
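The need for multiple runs in point 1 can be sketched as follows: make independent replications with different seeds, then summarize them with a confidence interval. A minimal illustration (the toy model and all names are mine; the normal critical value is a simplification - with few replications a Student-t value would give a wider interval):

```python
import math
import random
import statistics

def one_replication(seed):
    """One run of a toy stochastic model: the mean of 100
    exponential(1) samples (true long-run value is 1.0)."""
    rng = random.Random(seed)
    return statistics.fmean(rng.expovariate(1.0) for _ in range(100))

def confidence_interval(values, level=0.95):
    n, m = len(values), statistics.fmean(values)
    z = statistics.NormalDist().inv_cdf(0.5 + level / 2)
    h = z * statistics.stdev(values) / math.sqrt(n)
    return m - h, m + h

reps = [one_replication(seed) for seed in range(30)]
low, high = confidence_interval(reps)
print(f"estimate {statistics.fmean(reps):.3f}  95% CI ({low:.3f}, {high:.3f})")
```

One run gives one number with no idea of its spread; thirty runs give both an estimate and an honest statement of its uncertainty.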
14
Validation and the Establishment of Credibility.
1) Collect high-quality information and data.
a) Meet with subject-matter experts.
b) Observe the system.
c) Become familiar with relevant existing theory.
d) Become familiar with similar simulation studies.
e) Use your experience and intuition (see slide 2
for how you gain experience).
15
2) Interact with Management on a Regular Basis.
Whether the manager is capable of contributing to
the technical aspects of the project or not, she
can provide or withhold the resources necessary
to make it successful. Continuing interaction
with management may also permit you to assess
whether agendas other than technical ones are
operative. Your study may have been ordered to
provide justification for prior decisions rather
than to provide the basis for informed decisions.
If the manager IS capable of contributing to the
technical aspects of the project, this is the
person whose opinion will have the greatest
weight in the acceptance or rejection of the
final product, and in the availability of any
resources that become necessary after the initial
specifications are finalized.
16
3) Maintain an Assumptions Document. Perform
Structured Walk-throughs. Make sure that the
assumptions document is regularly shared with
both management and your domain experts. Make
sure that both take part in conceptual
walk-throughs (the assumptions document is not a
code document). This may reveal - among other
things - problems in communication within your
customer's organization (which are more common
than anyone is willing to admit).
17
4) Validate Components of the Model by Using
Quantitative Techniques.
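One quantitative technique for validating a model component is a goodness-of-fit check of a fitted input distribution against observed data. Below is a minimal sketch of the one-sample Kolmogorov-Smirnov statistic against an exponential fit; it illustrates the idea and is not a procedure prescribed by the text:

```python
import math
import random

def ks_statistic_exponential(data, rate):
    """Largest gap between the empirical CDF of `data` and the
    CDF of an Exponential(rate) distribution."""
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        cdf = 1.0 - math.exp(-rate * x)
        d = max(d, abs(cdf - i / n), abs(cdf - (i + 1) / n))
    return d

rng = random.Random(3)
observed = [rng.expovariate(1.0) for _ in range(2000)]
good_fit = ks_statistic_exponential(observed, 1.0)  # small D: plausible fit
bad_fit = ks_statistic_exponential(observed, 2.0)   # large D: wrong rate
print(f"D(rate=1.0) = {good_fit:.3f}   D(rate=2.0) = {bad_fit:.3f}")
```

The statistic D is then compared against a critical value (roughly 1.36/sqrt(n) at the 5% level for large n); if D exceeds it, the fitted component should be revisited.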
18
5) Validate Output from the Overall Simulation
Model. How can you show that the data you will
obtain as output from your model match the data
you would obtain from "reality" - within some
acceptable limits? If your model is an extension
of an existing - validated - one, then make sure
that the output of your model and that of the old
model match along all the parameters the two
share. If your model is that of an existing
reality, then you can compare the output of your
model to that of the existing reality. If your
model is an extension of an existing reality, you
can still compare the output along all the common
parameters. A possible test is a variant of the
Turing test for machine intelligence (or
"intelligence" in general). The original test,
proposed by A. Turing around 1950, tried to
determine whether an unknown entity possessed
some level of intelligence.
19
A communications link was set up (a keyboard and
output device) and an interaction was started
between a human "tester" and the entity being
tested. If, after a certain amount of time and
interaction, the human could not conclude that
the entity was unintelligent, one would, by
default, conclude that it was at least partially
intelligent. Another way of explaining this test:
if it looks like a duck, walks like a duck, swims
like a duck and quacks like a duck, it's a duck.
In this context, the test would be run as
follows: take a number of people knowledgeable
about the system, and make them compare output
data from the system and from the model (this
assumes you have modeled an existing system, to
start with). If they can't tell the difference,
you have succeeded (as far as you can tell); if
they can, their description of how they decided
which was which would help you in the next
iteration of the model. The test should be run
blind, or, better yet, double blind.
20
6) Animation. This may be quite expensive,
especially if you have a complex model that
requires physically faithful 3-D
representation. On the other hand, a good
animation can capture problems that statistics
and a collection of numerical data files cannot.
21
Statistical Procedures. Suppose we have two
sequences of observations, R1, R2, ..., Rn from
the real world and M1, M2, ..., Mn from our
simulation model. How can we compare them so that
we can determine whether the model is an accurate
representation of the real world? The problem is
that almost all real-world systems and
simulations are non-stationary and autocorrelated
- which means that any tests that assume an IID
property cannot be applied - at least not in a
way that could be used for justification.
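The autocorrelation problem is easy to see in simulated output: successive waiting times from a single-server queue are strongly correlated with their predecessors, unlike an IID sample. A minimal sketch (the toy model and all names are mine):

```python
import random

def lag1_autocorrelation(xs):
    """Sample lag-1 autocorrelation of a sequence."""
    n = len(xs)
    mean = sum(xs) / n
    num = sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in xs)
    return num / den

rng = random.Random(7)

# Successive waits in a single-server queue (Lindley's recursion):
# each customer's wait depends directly on the previous customer's.
w, waits = 0.0, []
for _ in range(10000):
    w = max(0.0, w + rng.expovariate(1.0) - rng.expovariate(0.5))
    waits.append(w)

iid = [rng.expovariate(1.0) for _ in range(10000)]
print(f"queue waits: {lag1_autocorrelation(waits):.2f}   "
      f"IID sample: {lag1_autocorrelation(iid):.2f}")
```

Classical tests (e.g., a two-sample t-test on the R's versus the M's) assume observations like the second sequence; applying them to the first overstates the effective sample size and the resulting confidence.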
22
1) Inspection Approach.
23
2) Confidence Interval Approach.
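The confidence-interval approach compares the real-world and model observations by building an interval for the difference of their means; if the interval is tight and close to zero, the difference is both statistically and practically small. A minimal sketch (the data are hypothetical; a normal critical value is used for simplicity, with a Welch-style standard error):

```python
import math
import statistics

def diff_of_means_ci(real, model, level=0.95):
    """Confidence interval for E[R] - E[M] from two independent samples."""
    diff = statistics.fmean(real) - statistics.fmean(model)
    se = math.sqrt(statistics.variance(real) / len(real)
                   + statistics.variance(model) / len(model))
    z = statistics.NormalDist().inv_cdf(0.5 + level / 2)
    return diff - z * se, diff + z * se

# Hypothetical daily throughputs: observed system vs. simulation model.
real = [41.2, 39.8, 43.1, 40.5, 42.0, 38.9, 41.7]
model = [40.6, 42.3, 39.5, 41.1, 40.2, 42.8, 39.9]
low, high = diff_of_means_ci(real, model)
print(f"95% CI for mean difference: ({low:.2f}, {high:.2f})")
```

Containing zero is not proof of validity; the interval's width tells you how large a real difference the data could still be hiding.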