BAE 790I / BMME 231 Fundamentals of Image Processing, Class 19
Provided by: davidl69. Learn more at: http://www.bme.unc.edu

Transcript:
1
BAE 790I / BMME 231 Fundamentals of Image Processing, Class 19
  • Statistical Restoration
  • Residual Error
  • Least-Squares Estimation
  • Maximum Likelihood Estimation

2
Image Restoration
  • Objective: to quantitatively estimate the true image
    from its degraded measurement.

[Block diagram: the true image f passes through system H
and noise n is added, giving the measurement g; the
restoration process then produces an estimate of f.]
3
Linear Wiener Filter
  • The filter Q gives the minimum MSE for the linear
    case.
  • The only stipulation we made was that n is
    zero-mean and uncorrelated with f.
  • This is the linear version of the Wiener filter.
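The slide's filter equation did not survive the transcript, so as a hedged illustration: for zero-mean f with covariance R_f and noise covariance R_n, the standard linear MMSE (Wiener) filter is Q = R_f H^T (H R_f H^T + R_n)^{-1}. A minimal numpy sketch (all matrices, sizes, and names here are made up) that compares this Q against a plain inverse filter using the analytic MSE of a linear estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (illustrative values): f is zero-mean with covariance Rf,
# n is zero-mean with covariance Rn and uncorrelated with f.
N = 8
H = 0.5 * rng.standard_normal((N, N)) + np.eye(N)   # made-up system matrix
Rf = 4.0 * np.eye(N)                                # true-image covariance
Rn = 1.0 * np.eye(N)                                # noise covariance

# Linear Wiener filter: Q = Rf H^T (H Rf H^T + Rn)^{-1}
Q = Rf @ H.T @ np.linalg.inv(H @ Rf @ H.T + Rn)

def linear_mse(A):
    """Analytic MSE of the linear estimator f_hat = A g."""
    E = np.eye(N) - A @ H
    return np.trace(E @ Rf @ E.T + A @ Rn @ A.T)

A_inv = np.linalg.pinv(H)        # plain inverse filter, for comparison
print(linear_mse(Q), linear_mse(A_inv))   # the Wiener MSE is never larger
```

The comparison works because, for any linear estimator A, the error f_hat - f = (AH - I)f + An has covariance (I - AH) R_f (I - AH)^T + A R_n A^T when f and n are zero-mean and uncorrelated.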

4
Wiener Filter
  • The Wiener derivation provides the linear system that
    gives the minimum mean squared error, assuming we know
  • The noise statistics
  • The properties of the true image
  • The system matrix H
  • These properties are not all known in practice.
  • The MSE depends on the true image and cannot be
    computed.

6
Recipe: Statistical Restoration
  • Choose a criterion for determining which solution is
    better than another (example: MSE).
  • Express the criterion mathematically (the objective
    function).
  • Solve for the estimate that optimizes the criterion
    (example: minimizing the MSE).

7
Residual Error
  • Consider an error measure that can actually be
    computed:
    r = g - H f_hat
  • This is the difference between the measured image and
    an estimate of the degraded image, H f_hat.
  • r is an image with the same dimensions as g.

8
Residual Error
[Block diagram: the true image f passes through system H
and noise n is added, giving g; the current estimate of f
is passed through the same model H and subtracted from g
to form the residual r.]
9
Residual Error
  • If r is zero, then the estimate f_hat could have
    produced the g we measured, assuming the model H is
    correct.
  • If so, f_hat is said to be consistent with g.
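As a concrete illustration of the residual (toy sizes and values, nothing here is from the slides), a noise-free measurement makes the true image exactly consistent with g:

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up small system; g is generated noise-free for illustration.
N = 6
H = rng.standard_normal((N, N)) + 3.0 * np.eye(N)
f_true = rng.standard_normal(N)
g = H @ f_true

f_hat = f_true.copy()        # a candidate estimate
r = g - H @ f_hat            # residual: same dimensions as g
print(np.linalg.norm(r))     # zero residual -> f_hat is consistent with g
```

Any other candidate generally leaves a nonzero residual, which is what the least-squares criterion on the next slides penalizes.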



10
Criterion for a Solution
  • Instead of minimizing the mean squared error, let us
    find the solution that minimizes the total residual
    squared error (or just the residual error):
    ||r||^2 = ||g - H f_hat||^2
  • We can compute this for any candidate f_hat if we
    have a model of H.
  • Find the f_hat that minimizes it.



11
Solution
  • Setting the derivative of ||g - H f||^2 with respect
    to f to zero gives the normal equations
    H^T H f = H^T g, so
    f_hat = (H^T H)^{-1} H^T g
12
Least Squares Estimate
  • This is called the least-squares (LS) estimate:
    f_hat_LS = (H^T H)^{-1} H^T g
  • Note that it is very different from the MMSE (Wiener)
    estimate.
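A quick numpy check (made-up overdetermined toy system) that the closed-form LS solution from the normal equations matches a library least-squares solver:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative sizes: more measurements than unknowns.
M, N = 12, 8
H = rng.standard_normal((M, N))
f_true = rng.standard_normal(N)
g = H @ f_true + 0.01 * rng.standard_normal(M)   # slightly noisy measurement

f_ls_formula = np.linalg.solve(H.T @ H, H.T @ g)   # (H^T H)^{-1} H^T g
f_ls_lstsq = np.linalg.lstsq(H, g, rcond=None)[0]  # library LS solver
print(np.allclose(f_ls_formula, f_ls_lstsq))       # True
```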

13
Properties of the LS Estimate
  • It depends only on the system matrix, so no
    assumptions are made about the statistics of the
    noise or the true image.
  • Bias:
    E[f_hat] = (H^T H)^{-1} H^T E[g]
             = (H^T H)^{-1} H^T H f = f

It is unbiased.
14
Properties of the LS Estimate
  • Autocovariance:
    K_fhat = (H^T H)^{-1} H^T K_n H (H^T H)^{-1}
    (equal to sigma^2 (H^T H)^{-1} for white noise)
  • If H^T H has small eigenvalues, its inverse has
    large eigenvalues and the noise blows up!
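This blow-up is easy to demonstrate with made-up numbers: build an H whose H^T H has one small eigenvalue, then add a little noise along the corresponding eigenvector:

```python
import numpy as np

rng = np.random.default_rng(3)

# H^T H gets eigenvalues (1, 1, 1, 1, 1e-4); all values are illustrative.
N = 5
U, _ = np.linalg.qr(rng.standard_normal((N, N)))   # random orthonormal basis
eigs = np.array([1.0, 1.0, 1.0, 1.0, 1e-4])
H = U @ np.diag(np.sqrt(eigs)) @ U.T     # symmetric H with those H^T H eigenvalues

f_true = np.ones(N)
n = 1e-3 * U[:, -1]            # tiny noise along the weak eigenvector
g = H @ f_true + n

f_ls = np.linalg.solve(H.T @ H, H.T @ g)   # LS estimate
err = np.linalg.norm(f_ls - f_true)
print(err)   # ~0.1: the 1e-3 noise is amplified 100x by the 1e-4 eigenvalue
```

The amplification factor along that direction is 1/sqrt(1e-4) = 100, so a 1e-3 noise level becomes a 0.1 error in the estimate.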

16
Recipe: Statistical Restoration
  • Choose a criterion for determining which solution is
    better than another (example: MSE).
  • Express the criterion mathematically (the objective
    function).
  • Solve for the estimate that optimizes the criterion
    (example: minimizing the MSE).

17
Maximum Likelihood
  • Define the likelihood as the probability of g given
    f: L(f) = p(g | f)
  • This is the probability distribution over all images
    g given that the input image is f.

18
Maximum Likelihood
  • We know that g = Hf + n, so E[g] = Hf and the
    covariance of g is K_n.
  • That is, we know the first- and second-order
    statistics of g.

Assuming n is zero-mean and uncorrelated with f.
19
Multivariate Normal PDF
  • For an N-dimensional random vector x with mean mu
    and covariance K:
    p(x) = (2 pi)^{-N/2} |K|^{-1/2}
           exp( -(1/2) (x - mu)^T K^{-1} (x - mu) )
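A direct numpy implementation of this density (a generic sketch; the function name and the example values are arbitrary), sanity-checked against the 1-D normal density:

```python
import numpy as np

def mvn_pdf(x, mu, K):
    """Multivariate normal density with mean mu and covariance K."""
    N = len(x)
    d = x - mu
    norm = (2.0 * np.pi) ** (N / 2.0) * np.sqrt(np.linalg.det(K))
    return np.exp(-0.5 * d @ np.linalg.solve(K, d)) / norm

# 1-D check: at x = mu with unit variance, the density is 1/sqrt(2*pi).
p = mvn_pdf(np.array([0.0]), np.array([0.0]), np.array([[1.0]]))
print(p)   # 0.3989...
```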
20
Maximum Likelihood
  • Model the likelihood function as a multivariate
    Gaussian:
    p(g | f) = (2 pi)^{-N/2} |K_n|^{-1/2}
               exp( -(1/2) (g - Hf)^T K_n^{-1} (g - Hf) )
  • This is the relative probability of any image g
    given the true image f.

21
Maximum Likelihood
  • Our criterion will be to find the f that
    maximizes the likelihood function.
  • This is the maximum likelihood (ML) estimate.
  • Properties of ML estimators:
  • Unbiased
  • Asymptotically efficient: smallest estimation error
    of all unbiased estimators in the limit.

22
Recipe: Statistical Restoration
  • Choose a criterion for determining which solution is
    better than another (example: MSE).
  • Express the criterion mathematically (the objective
    function).
  • Solve for the estimate that optimizes the criterion
    (example: minimizing the MSE).

23
Maximum Likelihood
  • Instead of maximizing the likelihood function, we
    will maximize its logarithm, the log-likelihood
    function (the log is monotonic, so the maximizer is
    unchanged):
    ln p(g | f) = -(1/2) (g - Hf)^T K_n^{-1} (g - Hf) + const
  • This reflects the relative probability of the
    measured image g given the estimate of f.

24
Maximum Likelihood Estimate
  • Take the derivative with respect to f and set it to
    zero to obtain
    f_hat = (H^T K_n^{-1} H)^{-1} H^T K_n^{-1} g
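A numpy sketch of this weighted estimate with a made-up diagonal noise covariance, also confirming that with white noise the weights cancel and the result reduces to plain LS:

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative toy system with nonstationary (unequal-variance) noise.
M, N = 10, 6
H = rng.standard_normal((M, N))
Kn = np.diag(rng.uniform(0.5, 2.0, M))          # made-up noise covariance
f_true = rng.standard_normal(N)
g = H @ f_true + rng.multivariate_normal(np.zeros(M), Kn)

Kn_inv = np.linalg.inv(Kn)
# f_ml = (H^T Kn^{-1} H)^{-1} H^T Kn^{-1} g
f_ml = np.linalg.solve(H.T @ Kn_inv @ H, H.T @ Kn_inv @ g)

# With Kn = sigma^2 I the sigma^2 cancels, so the weighted estimate equals LS:
sigma2 = 2.0
f_wls_white = np.linalg.solve(H.T @ H / sigma2, H.T @ g / sigma2)
f_ls = np.linalg.solve(H.T @ H, H.T @ g)
print(np.allclose(f_wls_white, f_ls))   # True
```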

25
Maximum Likelihood Estimate
  • Compare the ML estimate with the LS estimate:
    f_hat_ML = (H^T K_n^{-1} H)^{-1} H^T K_n^{-1} g
    f_hat_LS = (H^T H)^{-1} H^T g
  • They are the same except for the noise covariance
    term K_n^{-1}.
  • We often refer to the ML estimate with a Gaussian
    noise model as the weighted least-squares (WLS)
    estimate.

26
ML, WLS, and LS Estimates
  • The WLS estimate is the same as the ML estimate
    for Gaussian-distributed noise.
  • We have implicitly assumed that noise is
    Gaussian-distributed because we only modeled up
    to second-order statistics in g.
  • If n is stationary and uncorrelated (K_n = sigma^2 I),
    the WLS and LS solutions are the same.
  • The weights become important if n is either
    nonstationary or correlated.

27
ML Estimation
  • We can follow the same recipe for other noise
    models, but we will get a different estimator for
    each.
  • Poisson:
    p(g | f) = prod_i (Hf)_i^{g_i} exp( -(Hf)_i ) / g_i!
  • We may, however, find it hard to solve for some
    distributions.
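For instance, the Poisson model has no closed-form maximizer. One well-known iterative scheme, the multiplicative ML-EM (Richardson-Lucy) update, increases the Poisson likelihood at every step; it is not from the slides, and the system and data below are made up:

```python
import numpy as np

rng = np.random.default_rng(5)

# Made-up nonnegative system and Poisson-distributed measurement.
M, N = 16, 8
H = rng.uniform(0.1, 1.0, (M, N))
f_true = rng.uniform(1.0, 5.0, N)
g = rng.poisson(H @ f_true).astype(float)

def poisson_loglik(f):
    """Poisson log-likelihood of g given mean Hf (dropping the log g! term)."""
    lam = H @ f
    return float(np.sum(g * np.log(lam) - lam))

f = np.ones(N)                 # positive starting image
ll_start = poisson_loglik(f)
for _ in range(50):            # ML-EM update: f <- f * H^T(g / Hf) / (H^T 1)
    f = f * (H.T @ (g / (H @ f))) / H.T.sum(axis=1)
ll_end = poisson_loglik(f)
print(ll_end >= ll_start)      # True: the EM step never decreases the likelihood
```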