Lecture 16 Nonlinear Problems: Simulated Annealing and Bootstrap Confidence Intervals
1
Lecture 16: Nonlinear Problems
Simulated Annealing and Bootstrap Confidence Intervals
2
Syllabus
Lecture 01 Describing Inverse Problems
Lecture 02 Probability and Measurement Error, Part 1
Lecture 03 Probability and Measurement Error, Part 2
Lecture 04 The L2 Norm and Simple Least Squares
Lecture 05 A Priori Information and Weighted Least Squares
Lecture 06 Resolution and Generalized Inverses
Lecture 07 Backus-Gilbert Inverse and the Trade-Off of Resolution and Variance
Lecture 08 The Principle of Maximum Likelihood
Lecture 09 Inexact Theories
Lecture 10 Nonuniqueness and Localized Averages
Lecture 11 Vector Spaces and Singular Value Decomposition
Lecture 12 Equality and Inequality Constraints
Lecture 13 L1, L∞ Norm Problems and Linear Programming
Lecture 14 Nonlinear Problems: Grid and Monte Carlo Searches
Lecture 15 Nonlinear Problems: Newton's Method
Lecture 16 Nonlinear Problems: Simulated Annealing and Bootstrap Confidence Intervals
Lecture 17 Factor Analysis
Lecture 18 Varimax Factors, Empirical Orthogonal Functions
Lecture 19 Backus-Gilbert Theory for Continuous Problems; Radon's Problem
Lecture 20 Linear Operators and Their Adjoints
Lecture 21 Fréchet Derivatives
Lecture 22 Exemplary Inverse Problems, incl. Filter Design
Lecture 23 Exemplary Inverse Problems, incl. Earthquake Location
Lecture 24 Exemplary Inverse Problems, incl. Vibrational Problems
3
Purpose of the Lecture
  • Introduce Simulated Annealing
  • Introduce the Bootstrap Method for computing Confidence Intervals
4
Part 1: Simulated Annealing
5
Monte Carlo Method: completely undirected
Newton's Method: completely directed
6
Monte Carlo Method: completely undirected (slow, but foolproof)
Newton's Method: completely directed (fast, but can fall into a local minimum)
7
compromise
  • partially-directed random walk

8
[Figure: error surface in model space, showing the current solution m(p) and regions of low, medium, and high error]
9
[Figure: the same error surface, with the proposal distribution p(m|m(p)) centered on the current solution m(p)]
10
[Figure: a candidate solution m drawn from p(m|m(p)) near the current solution m(p) on the error surface]
11
acceptance of m as m(p+1):
  • if the error is smaller, always accept
  • if the error is bigger, accept with probability exp( -(E(m)-E(m(p)))/T ), where T is a parameter
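A minimal MATLAB sketch of this acceptance step, assuming Ea and Eg hold the candidate and current errors (the full loop appears on slide 24):

    if( Ea < Eg )
        accept = 1;                  % downhill move: always accept
    else
        p1 = exp( -(Ea-Eg)/T );      % uphill move: accept with
        p2 = random('unif',0,1);     % probability exp(-(Ea-Eg)/T)
        accept = (p1 > p2);
    end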
12
large T
always accept m: an undirected random walk that ignores the error completely
13
small T
accept m only when the error is smaller: a directed random walk that strictly decreases the error
14
intermediate T
most iterations decrease the error but
occasionally allow an m that increases it
15
large T: undirected random walk
[Figure: the walk wanders freely across the error surface, visiting regions of low, medium, and high error]
16
small T: directed random walk
[Figure: the walk moves downhill toward the low-error region of the error surface]
17
strategy
  • start off with large T
  • slowly decrease T during the iterations

undirected: similar to the Monte Carlo method (except more local)
directed: similar to Newton's method (except the precise gradient direction is not used)
18
strategy
  • start off with large T (more random)
  • slowly decrease T during the iterations (more directed)

the claim is that this strategy helps achieve the global minimum
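A minimal sketch of one possible cooling schedule implementing this strategy; the quadratic decay and the 0.1*Eg0 starting scale are the choices made in the code on slide 24, not the only options:

    T0 = 0.1*Eg0;                        % starting temperature, scaled to the initial error
    for k = [1:Niter]
        T = T0 * ((Niter-k+1)/Niter)^2;  % decays smoothly toward zero
        % ... propose and accept/reject a candidate solution at this T ...
    end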
19
analogous to the annealing of metals
  • at high temperatures, atoms move about randomly due to thermal motions
  • as the temperature decreases, atoms slowly settle into a minimum-energy configuration: the orderly arrangement of a crystal
www.sti-laser.com/technology/heat_treatments.html
20
(same analogy as slide 19)
hence "simulated annealing", with T called "temperature"
21
this is just Metropolis-Hastings
  • (a way of producing realizations of a random variable)
  • applied to the p.d.f. p(m) ∝ exp( -E(m)/T )
22
(same as slide 21)
sampling a distribution that starts out wide and blurry but sharpens up as T decreases
23
(No Transcript)
24
for k = [1:Niter]
    % quadratic cooling schedule: T falls from 0.1*Eg0 toward zero
    T = 0.1 * Eg0 * ((Niter-k+1)/Niter)^2;
    % propose a candidate solution near the current one
    ma(1) = random('Normal',mg(1),Dm);
    ma(2) = random('Normal',mg(2),Dm);
    % predicted data and error of the candidate
    da = sin(w0*ma(1)*x) + ma(1)*ma(2);
    Ea = (dobs-da)'*(dobs-da);
    if( Ea < Eg )
        % downhill move: always accept
        mg = ma;
        Eg = Ea;
        p1his(k+1) = 1;
    else
        % uphill move: accept with probability exp(-(Ea-Eg)/T)
        p1 = exp( -(Ea-Eg)/T );
        p2 = random('unif',0,1);
        if( p1 > p2 )
            mg = ma;
            Eg = Ea;
        end
    end
end

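The loop above assumes some prior setup; a hedged sketch of plausible initialization (the specific values of Niter and Dm are assumptions, not from the slides; w0, x, and dobs are the problem's frequency, auxiliary variable, and observed data):

    Niter = 5000;                          % number of iterations (assumed)
    Dm = 0.2;                              % width of the Gaussian proposal (assumed)
    mg = [1; 1];                           % starting guess for the model
    dg = sin(w0*mg(1)*x) + mg(1)*mg(2);    % predicted data at the starting guess
    Eg = (dobs-dg)'*(dobs-dg);             % its error
    Eg0 = Eg;                              % sets the temperature scale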
25
Part 2: Bootstrap Method
26
theory of confidence intervals
  • errors in the data result in errors in the estimated model parameters, via the mapping m(d)
27
(same as slide 26)
[Figure: the distribution of the data maps through m(d) into a distribution of the estimated model parameters; the central 95% of that distribution defines the 95% confidence interval]
28
Gaussian linear theory: d = G m, with estimate m^est = G^-g d
standard error propagation: [cov m] = G^-g [cov d] G^-gT
a univariate Gaussian distribution has 95% of its probability within 2σ of its mean
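A minimal MATLAB sketch of this propagation for simple least squares, where G^-g = [G'G]^-1 G'; uncorrelated data of variance sd2 and data count N are illustrative assumptions:

    GMG = (G'*G)\G';                   % least squares generalized inverse G^-g
    covm = GMG * (sd2*eye(N)) * GMG';  % [cov m] = G^-g [cov d] G^-gT
    sm = sqrt(diag(covm));             % standard error of each model parameter
    % 95% confidence interval: mest(i) +/- 2*sm(i)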
29
What to do with Gaussian nonlinear theory?
One possibility: linearize the theory and use standard error propagation
d = g(m)
m - m^(p) ≈ G^(p)-g [ d - g(m^(p)) ]
[cov m] ≈ G^(p)-g [cov d] G^(p)-gT
30
disadvantages
  • unknown accuracy
  • need to compute the gradient of the theory, G^(p), which is not computed by some solution methods
31
alternative: confidence intervals from repeat datasets
  • do the whole experiment many times
  • use the results of each experiment to compute m^est
  • create histograms from the many m^est values
  • derive empirical 95% confidence intervals from the histograms
32
Bootstrap Method
create approximate repeat datasets by randomly
resampling (with duplications) the one existing
data set
33
example of resampling

index                          1    2    3    4    5    6
original data set              1.4  2.1  3.8  3.1  1.5  1.7
random integers in range 1-6   3    1    3    2    5    1
resampled data set             3.8  1.4  3.8  2.1  1.5  1.4
34
(same table as slide 33, with the resampled data now viewed as a new data set)
35
(same table as slide 33)
note the repeats: 3.8 and 1.4 each appear twice in the resampled data set
36
rowindex = unidrnd(N,N,1);         % N random integers in the range 1..N
xresampled = x( rowindex );        % resampled auxiliary variable
dresampled = dobs( rowindex );     % resampled data (with duplications)

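A sketch of the full bootstrap built around this resampling; estimate() is a hypothetical stand-in for whatever solver (for example, the annealing code of slide 24) produces the model estimate, and Nboot is an assumed repetition count:

    Nboot = 1000;                      % number of bootstrap repetitions (assumed)
    m1save = zeros(Nboot,1);
    for n = [1:Nboot]
        rowindex = unidrnd(N,N,1);     % resample the data, with duplication
        xr = x( rowindex );
        dr = dobs( rowindex );
        mest = estimate( xr, dr );     % hypothetical re-estimation step
        m1save(n) = mest(1);           % save the first model parameter
    end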
37
interpretation of resampling
[Figure: resampling viewed as mixing, sampling, and duplication of draws from the empirical distribution p(d)]
38
(No Transcript)
39
Nbins = 50;
m1hmin = min(m1save);
m1hmax = max(m1save);
Dm1bins = (m1hmax-m1hmin)/(Nbins-1);
m1bins = m1hmin + Dm1bins*[0:Nbins-1]';   % bin centers
m1hist = hist(m1save, m1bins);            % histogram of bootstrap estimates
pm1 = m1hist / (Dm1bins*sum(m1hist));     % normalize to an empirical p.d.f.
Pm1 = Dm1bins*cumsum(pm1);                % integrate to an empirical c.d.f.
m1low  = m1bins(find(Pm1>0.025,1));       % lower 95% confidence bound
m1high = m1bins(find(Pm1>0.975,1));       % upper 95% confidence bound

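As an aside, the MATLAB Statistics Toolbox function prctile computes the same bounds directly (a shortcut, not what the slide uses):

    m1low  = prctile(m1save, 2.5);        % lower 95% confidence bound
    m1high = prctile(m1save, 97.5);       % upper 95% confidence bound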
40
(same code as slide 39; the call to hist builds the histogram)
41
(same code as slide 39; normalizing the histogram gives the empirical p.d.f.)
42
(same code as slide 39; the cumulative sum of the p.d.f. gives the empirical c.d.f.)
43
(same code as slide 39; the points where the c.d.f. crosses 0.025 and 0.975 give the 95% confidence bounds)