Methods for Developing Input Distributions for Probabilistic Risk Assessments (transcript)
1
Incorporating Risk and Uncertainty into the
Assessment of Impacts of Global Climate Change on
Transportation Systems
H. Christopher Frey, Ph.D. Professor
Department of Civil, Construction, and
Environmental Engineering North Carolina State
University Raleigh, NC 27695 Prepared for
2nd Workshop on Impacts of Global Climate
Change On Hydraulics and Hydrology and
Transportation Center for Transportation and the
Environment Washington, DC March 29, 2006
2
Outline
  • Risk and Uncertainty
  • Overview of impacts of climate change on
    transportation systems
  • Risk assessment methodologies
  • Uncertainty analysis methodologies
  • Qualitative assessments
  • Recommendations

3
Definitions
  • Risk: probability and severity of an adverse
    outcome
  • Uncertainty: lack of knowledge regarding the
    true value of a quantity

4
POSSIBLE IMPACTS OF GLOBAL CLIMATE CHANGE ON
TRANSPORTATION SYSTEMS
  • All modes
  • highway, rail, air, shipping, pipeline,
    pedestrian
  • Passenger and freight
  • Possible climate impacts (natural processes)
  • Sea-level rise
  • Increased frequency and severity of storms
  • Higher average temperatures (location-specific)

5
Implications of Possible Climate Change (Effects
Processes)
  • Loss of coastal land area
  • Damage to infrastructure via storms (e.g., winds,
    flooding)
  • Damage to infrastructure because of temperature
    extremes (e.g., rail kinks, pavement damage)
  • Impede operations and safety
  • Design, construction, operation, maintenance,
    repair, decommissioning

6
METHODOLOGICAL FRAMEWORKS FOR DEALING WITH RISK
  • Vulnerability or hazard assessment
  • Exposure assessment
  • Effects processes
  • Quantification of risk
  • Risk management

7
Vulnerability Assessment
  • Physical, social, political, economic, cultural,
    and psychological harms to which individuals and
    modern societies are susceptible (Slovic, 2002).
  • Identify valuable targets at risk
  • Conceptualize the various ways in which they are
    vulnerable to such hazards by defining various
    scenarios.
  • Clearly state the scale and scope of the
    analysis (e.g., the world, a country, or a
    specific region), noting that the risk
    assessment process becomes easier as the scope
    narrows.
  • Does not include assessment of the likelihood of
    such an event.
  • For example, coastal cities are vulnerable to the
    effects of sea level rise.

8
Paradigm for Human Health Risk Assessment (NRC,
1983)
[Diagram: the NRC (1983) paradigm links Research, Risk Assessment, and Risk Management. Laboratory and field work supports hazard identification; extrapolation methods support dose-response assessment; field measurements and modeling support exposure assessment. These feed risk characterization, which informs regulatory options, evaluation of options, and decisions and actions.]
9
An Alternative View of Human Health Risk
Assessment (PCRARM, 1997)
10
Example of A General Risk Assessment Framework
(Morgan)
[Diagram: natural processes acting on the natural environment, and human activities in the human environment, create exposures of objects and processes in both environments to the possibility of change. Exposure processes lead to effects processes, which lead to human perception processes, and then to human evaluation processes weighing costs and benefits.]
11
Risk Analysis and Risk Management
  • Analysis should be free of policy-motivated
    assumptions
  • Yet, analysis should include scenarios relevant
    to decision-making
  • Some argue for analysts and decision makers to be
    kept apart to avoid biases in the analysis
  • Others argue that they must interact in order to
    define the assessment objective
  • A practical, useful analysis needs to balance
    both concerns

12
Realities of Decision-Making
  • Decision-making regarding response to the impacts
    of climate change will involve
  • multiple parties
  • a local context
  • considerations beyond just the science and
    technology (such as equity, justice, culture, and
    others) and
  • implications for potentially large transfers of
    resources among different societal stakeholders.
  • Such decision-making may not produce an optimal
    outcome when viewed from a particular (e.g.,
    national, analytical) perspective.

Based on Morgan (2003)
13
METHODOLOGICAL FRAMEWORKS FOR DEALING WITH
UNCERTAINTY
  • Role of uncertainty in decision making
  • Scenarios
  • Models
  • Model inputs
  • Empirically-based
  • Expert judgment-based
  • Model outputs
  • Other quantitative approaches
  • Qualitative approaches

14
Uncertainty and Decision Making
  • How well do we know these numbers?
  • What is the precision of the estimates?
  • Is there a systematic error (bias) in the
    estimates?
  • Are the estimates based upon measurements,
    modeling, or expert judgment?
  • How significant are differences between two
    alternatives?
  • How significant are apparent trends over time?
  • How effective are proposed control or management
    strategies?
  • What is the key source of uncertainty in these
    numbers?
  • How can uncertainty be reduced?

15
Implications of Uncertainty in Decision Making
  • Risk preference
  • Risk averse
  • Risk neutral
  • Risk seeking
  • Utility theory
  • Benefits of quantifying uncertainty: Expected
    Value of Including Uncertainty
  • Benefits of reducing uncertainty: Expected Value
    of Perfect Information
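The value-of-information concepts above can be sketched with a toy two-action decision; all names and numbers below are hypothetical, not from the presentation. The Expected Value of Perfect Information (EVPI) is the difference between the expected cost of the best action chosen under uncertainty and the expected cost when the outcome is learned before acting.

```python
# Toy two-action decision (all numbers hypothetical): reinforce a coastal
# road now at a fixed cost, or wait and absorb a large loss only if a
# severe sea-level-rise outcome occurs.
P_SEVERE = 0.3          # assumed probability of the severe outcome
COST_REINFORCE = 40.0   # cost of acting now (arbitrary units)
LOSS_IF_SEVERE = 100.0  # loss if we wait and the severe outcome occurs

def expected_cost(action):
    """Expected cost of an action before the outcome is known."""
    if action == "reinforce":
        return COST_REINFORCE
    return P_SEVERE * LOSS_IF_SEVERE  # the "wait" action

# Decision under uncertainty: choose the action with the lower expected cost.
best_action = min(["reinforce", "wait"], key=expected_cost)
ev_under_uncertainty = expected_cost(best_action)

# With perfect information we learn the outcome first, then act optimally:
# reinforce if severe, do nothing (cost 0) otherwise.
ev_perfect = P_SEVERE * min(COST_REINFORCE, LOSS_IF_SEVERE)

# EVPI: what resolving the uncertainty before deciding is worth.
evpi = ev_under_uncertainty - ev_perfect
print(best_action, ev_under_uncertainty, evpi)
```

Here waiting is cheaper in expectation, but perfect foreknowledge would still be worth a positive amount, which bounds what one should pay to reduce the uncertainty.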

16
Framing the Problem Objectives and Scenarios
  • Need a well-formulated study objective that is
    relevant to decision making
  • A scenario is a set of structural assumptions
    about the situation to be analyzed
  • spatial and temporal dimensions
  • specific hazards, exposures, and adverse outcomes
  • Typical errors: description, aggregation, expert
    judgment, incompleteness
  • Failure to properly specify scenario(s) leads to
    bias in the analysis, even if all other elements
    are perfect.

17
Model Uncertainty
  • A model is a hypothesis regarding how a system
    works.
  • Ideally, the model should be tested by comparing
    its predictions with observations from the real
    world system, under specified conditions.
  • Difficult for unique or future events.
  • In practice, validation is often incomplete.
  • Extrapolation.
  • Other factors: simplifications, aggregation,
    exclusion, structure, resolution, model
    boundaries, boundary conditions, and calibration.

18
Examples of Alternative Models
[Figure: system response vs. an explanatory variable under alternative model forms: linear, sublinear, superlinear, threshold, and state change.]
19
Model Uncertainty Climate Change Impacts
  • Enumeration of a set of plausible or possible
    alternative models,
  • Comparisons of their predictions or development
    of a weighting scheme to combine the predictions
    of multiple models into one estimate
  • It seems inappropriate to increase the complexity
    of the analysis in situations where less is known
    (Casman et al., 1999)

20
Model Uncertainty
[Diagram: outputs of Model 1, Model 2, and Model 3 are combined using weights w1, w2, and w3 into a weighted combination of model outputs.]
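The weighting scheme in the diagram can be sketched as follows; the three model predictions and the skill-based weights are purely illustrative, not values from the presentation.

```python
# Minimal sketch: combine point predictions from three hypothetical models
# using normalized weights w1..w3 (here loosely representing assumed skill).
predictions = {"model1": 2.1, "model2": 2.8, "model3": 1.9}  # e.g., degrees C
raw_weights = {"model1": 0.5, "model2": 0.3, "model3": 0.2}

# Normalize so the weights sum to 1 regardless of how they were assigned.
total = sum(raw_weights.values())
weights = {m: w / total for m, w in raw_weights.items()}

# Weighted combination of the model outputs.
combined = sum(weights[m] * predictions[m] for m in predictions)
print(f"weighted combination: {combined:.3f}")
```

As the slide from Casman et al. cautions, such a combination is only as defensible as the weights; with little basis for weighting, equal weights or simple bounding may be more honest.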
21
The Role of Models When Structural Uncertainties
are Large
  • Assessment of climate change impacts involves
    many component models
  • Some are better than others, and they degrade
    at different rates as one goes farther into the
    future.
  • For problem areas in which there is little
    relevant data, theory, or experience, a simpler
    order-of-magnitude model may be adequate.
  • For problem areas in which little is known, very
    simple bounding analyses may be all that can be
    justified.
  • For poorly supported models, it is no longer
    possible to search for optimal decision
    strategies. Instead, one can attempt to find
    feasible or robust strategies

22
Quantification of Uncertainty in Inputs and
Outputs of Models
[Diagram: input uncertainties are propagated through a model to produce output uncertainty.]
23
Statistical Methods Based Upon Empirical Data
  • Frequentist, classical
  • Statistical inference from sample data
  • Parametric approaches
  • Parameter estimation
  • Goodness-of-fit
  • Nonparametric approaches
  • Mixture distributions
  • Censored data
  • Dependencies, correlations, deconvolution
  • Time series, autocorrelation

24
Statistical Methods Based on Empirical Data
  • Need a random, representative sample
  • Not always available when predicting events into
    the future

25
Example of an Empirical Data Set Regarding
Variability
[Figure: distribution of the empirical quantity]
26
Fitted Lognormal Distribution
[Figure: lognormal distribution fitted to the empirical quantity]
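The slide's data are not reproduced here, so a minimal sketch with synthetic data shows how such a lognormal fit works: for a lognormal, ln(x) is normally distributed, so the maximum-likelihood parameter estimates are simply the mean and standard deviation of the log-transformed sample.

```python
import math
import random
import statistics

random.seed(7)

# Synthetic stand-in for the empirical sample (true mu=1.0, sigma=0.5).
sample = [random.lognormvariate(1.0, 0.5) for _ in range(500)]

# MLE for a lognormal: mean and population standard deviation of ln(x).
logs = [math.log(x) for x in sample]
mu_hat = statistics.fmean(logs)
sigma_hat = statistics.pstdev(logs)

# Arithmetic mean implied by the fitted distribution.
fitted_mean = math.exp(mu_hat + sigma_hat ** 2 / 2)
print(f"mu_hat={mu_hat:.3f} sigma_hat={sigma_hat:.3f} fitted_mean={fitted_mean:.3f}")
```

A goodness-of-fit check (e.g., comparing the fitted CDF against the empirical CDF) would normally follow, as the slide 23 list indicates.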
27
Bootstrap Simulation to Quantify Uncertainty
[Figure: bootstrap replicates of the distribution fitted to the empirical quantity]
28
Results of Bootstrap Simulation Uncertainty in
the Mean
[Figure: sampling distribution of the mean of the empirical quantity; uncertainty in the mean spans approximately -73 to 200.]
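The bootstrap procedure on the preceding slides can be sketched as follows; the data vector is a synthetic stand-in, not the slide's data. The idea is to resample the observed data with replacement many times, recompute the statistic of interest each time, and use the spread of those replicates as the uncertainty estimate.

```python
import random
import statistics

random.seed(42)

# Synthetic stand-in for the empirical sample shown on the slide.
data = [12.0, -5.0, 33.0, 8.0, -20.0, 15.0, 60.0, -2.0, 27.0, 4.0]

# Bootstrap: resample with replacement, record the mean of each replicate.
boot_means = []
for _ in range(5000):
    resample = [random.choice(data) for _ in data]
    boot_means.append(statistics.fmean(resample))

# Percentile interval from the sorted replicate means.
boot_means.sort()
lo = boot_means[int(0.025 * len(boot_means))]
hi = boot_means[int(0.975 * len(boot_means))]
print(f"sample mean: {statistics.fmean(data):.1f}")
print(f"95% bootstrap interval for the mean: ({lo:.1f}, {hi:.1f})")
```

With a small sample, the interval is wide, which mirrors the slide's point that uncertainty in the mean can span a large range relative to the point estimate.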
29
Estimating Uncertainties Based on Expert Judgment
  • Probability can be used to quantify the state of
    knowledge (or ignorance) regarding a quantity.
  • Bayesian methods for statistical inference are
    based upon sample information (e.g., empirical
    data, when available) and a prior distribution.
  • A prior distribution is a quantitative statement
    of the degree of belief a person has that a
    particular outcome will occur.
  • Methods for eliciting subjective probability
    distributions are intended to produce estimates
    that accurately reflect the true state of
    knowledge and that are free of significant
    cognitive and motivational biases
  • Useful when random, representative data, or
    models, are not available, but when there is some
    epistemic status upon which to base a judgment

30
Heuristics and Possible Biases in Expert Judgment
  • Heuristics and Biases
  • Availability
  • Anchoring and Adjustment
  • Representativeness
  • Others (e.g., Motivational, Expert, etc.)
  • Consider motivational bias when choosing experts
  • Deal with cognitive heuristics via an appropriate
    elicitation protocol

31
An Example of an Elicitation Protocol: The
Stanford/SRI Protocol
32
Frequently Asked Questions Regarding Expert
Elicitation
  • How to choose the experts
  • How many experts are needed
  • Whether to perform elicitation individually or
    with groups of experts
  • Elicitation of correlated uncertainties
  • What to do if experts disagree
  • Whether and how to combine judgments from
    multiple experts
  • What resources are needed for expert elicitation

33
Propagating Uncertainties Through Models
  • Analytical solutions: exact, but of limited
    applicability
  • Approximate solutions: more broadly applicable,
    but increase in complexity or error as model and
    inputs become more complex (e.g., Taylor series
    expansion)
  • Numerical methods: flexible and popular (e.g.,
    Monte Carlo simulation)
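A minimal Monte Carlo propagation sketch, assuming a hypothetical damage model and purely illustrative input distributions: draw each uncertain input from its distribution, run the model, and summarize the resulting output distribution.

```python
import random
import statistics

random.seed(0)

# Hypothetical model: damage = exposure * unit_cost, both inputs uncertain.
def model(exposure, unit_cost):
    return exposure * unit_cost

N = 20_000
outputs = []
for _ in range(N):
    # Assumed input distributions (illustrative only).
    exposure = random.triangular(100.0, 300.0, 150.0)  # low, high, mode
    unit_cost = random.normalvariate(2.0, 0.25)        # mean, std. dev.
    outputs.append(model(exposure, unit_cost))

# Summarize the propagated output uncertainty.
mean_out = statistics.fmean(outputs)
s = sorted(outputs)
p05, p95 = s[int(0.05 * N)], s[int(0.95 * N)]
print(f"mean output: {mean_out:.0f}")
print(f"90% interval: ({p05:.0f}, {p95:.0f})")
```

Unlike a Taylor-series approximation, this approach handles nonlinear models and arbitrary input distributions at the cost of computation.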

34
Monte Carlo Simulation and Similar Methods
F(x) = Pr(X ≤ x)
35
Sensitivity Analysis Which Model Inputs
Contribute Most to Uncertainty in Output?
  • Linearized sensitivity coefficients
  • Statistical methods
  • Correlation
  • Regression
  • Advanced methods

Example from Sobol's Method

36
Other Quantitative Methods
  • Interval methods: provide bounds, but are not
    very informative
  • Fuzzy sets: represent vagueness, rather than
    uncertainty

37
Qualitative Methods
  • Principles of Rationality
  • Lines of Reasoning
  • Weight of Evidence

38
Principles of Rationality
  • Conceptual clarity: well-defined terminology
  • Logical consistency: inferences should follow
    from assumptions and data
  • Ontological realism: free of scientific error
  • Epistemological reflection: evidential support
  • Methodological rigor: use of proven techniques
  • Practicality
  • Valuational selection: focus on what matters
    most

39
Lines of Reasoning
  • Direct empirical evidence
  • Semi-empirical evidence (surrogate data)
  • Empirical correlations (relationships between
    known processes and the unknown process of
    interest)
  • Theory-based inference: causal mechanisms
  • Existential insight: expert judgment

40
Judgment of Epistemic Status
  • The result of an analysis of epistemic status is
    a judgment regarding the quality of each premise
or alternative, e.g.:
  • no basis for using a premise in decision-making.
  • partial or high confidence basis for using a
    particular premise as the basis for decision
    making.

41
Weight of Evidence
  • Legal context - whether the proof for one premise
    is greater than for another.
  • Often used when a categorical judgment is needed.
  • However,
  • tends to be less formal than the analysis of
    epistemic status,
  • less transparent than properly documented
    analyses of epistemic status

42
Qualitative Statements Regarding Uncertainty
  • Qualitative approaches for describing uncertainty
    are beset by fundamental problems of ambiguity.
  • The same words mean
  • different things to different people,
  • different things to the same person in different
    contexts
  • Based on Wallsten et al. (1986):
  • "Probable" was associated with quantitative
    probabilities of approximately 0.5 to 1.0
  • "Possible" was associated with probabilities of
    approximately 0.0 to 1.0.
  • Qualitative schemes for dealing with uncertainty
    are typically not useful

43
CONCLUSIONS - 1
  • There is growing recognition that climate change
    has the potential to impact transportation
    systems.
  • The available literature on the impacts of
    climate change on transportation systems appears
    to be a vulnerability assessment, rather than a
    risk analysis.

44
CONCLUSIONS - 2
  • The commitment of large resources should be based
    on, as thoroughly as necessary or possible, a
    well-founded analysis.
  • There are many alternative forms of analysis that
    differ in their epistemic status, depending on
    what type of information is available.
  • Thus, the key question is: what kind of analysis
    is appropriate here?
  • It may be possible to seek feasible, and perhaps
    robust (but not optimal) solutions for dealing
    with climate change impacts.
  • Actual decisions will be based on a complex
    deliberative process, to which analysis is only
    one input

45
CONCLUSIONS - 3
  • There is substantial uncertainty attributable to
    the structure of scenarios and models.
  • Given the lack of directly relevant empirical
    data for making assessments of future impacts,
    there is a strong need for the use of judgments
    regarding uncertainty elicited from experts

46
RECOMMENDATIONS
  • Vulnerability assessment is only a first step.
  • Modeling tools should be used to identify
    feasible and robust solutions
  • Assessment should be done iteratively over time.
  • Expert judgment should be included as a basis for
    quantifying the likelihood and severity of
    various outcomes, as well as uncertainties.
  • Uncertainties should be quantified to the extent
    possible.
  • Sensitivity and uncertainty analysis should be
    used together to identify key knowledge gaps that
    could be prioritized for additional data collection
    or research in order to improve confidence in
    estimates.
  • In order to focus policy debate and inform
    decision making, these analyses are highly
    recommended, despite their limitations

47
ACKNOWLEDGMENTS
  • Hyung-Wook Choi, of the Department of Civil,
    Construction, and Environmental Engineering at NC
    State, provided assistance with the literature
    review.
  • This work was supported by the Center for
    Transportation and the Environment. However, the
    author is solely responsible for the content of
    this material.