Title: BAE 790I / BMME 231 Fundamentals of Image Processing Class 19
Slide 1: BAE 790I / BMME 231 Fundamentals of Image Processing, Class 19
- Statistical Restoration
- Residual Error
- Least-Squares Estimation
- Maximum Likelihood Estimation
Slide 2: Image Restoration
- Objective: to quantitatively estimate the true image from its degraded measurement.
[Block diagram: the true image f passes through the system H and noise n is added to produce the measurement g; the restoration process yields an estimate of f.]
Slide 3: Linear Wiener Filter
- The filter Q gives the minimum MSE for the linear case.
- The only stipulation we made was that n is zero-mean and uncorrelated with f.
- This is the linear version of the Wiener filter.
Slide 4: Wiener Filter
- The Wiener derivation provides a linear system that gives the minimum mean squared error, assuming:
  - Noise statistics
  - Properties of the true image
  - System matrix H
- These properties are not all known.
- The MSE depends on the true image and cannot be computed.
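As a concrete illustration, here is a minimal numpy sketch of the linear (matrix) Wiener filter f̂ = R_f H^T (H R_f H^T + R_n)^(-1) g. The system matrix, covariances, and sizes are illustrative assumptions, not values from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 16                                  # illustrative 1-D "image" length
H = np.eye(N) + 0.5 * np.eye(N, k=1)    # toy blur system matrix (assumed)
R_f = np.eye(N)                         # assumed covariance of the true image
R_n = 0.1 * np.eye(N)                   # assumed noise covariance (white)

f = rng.standard_normal(N)              # toy "true" image
n = np.sqrt(0.1) * rng.standard_normal(N)
g = H @ f + n                           # degraded measurement g = Hf + n

# Linear Wiener filter: Q = R_f H^T (H R_f H^T + R_n)^{-1}
Q = R_f @ H.T @ np.linalg.inv(H @ R_f @ H.T + R_n)
f_hat = Q @ g                           # minimum-MSE linear estimate
```

Note that building Q requires R_f and R_n, exactly the statistics the slide says are not all known in practice.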
Slide 6: Recipe: Statistical Restoration
- Choose a criterion for determining which solution is better than another (example: MSE).
- Express the criterion mathematically (the objective function).
- Solve for the estimate that optimizes the criterion (example: minimizing the MSE).
Slide 7: Residual Error
- Consider an error measure that can be computed: the residual r = g - Hf̂.
- This is the difference between the measured image g and an estimate Hf̂ of the degraded image.
- r is an image with the same dimensions as g.
Slide 8: Residual Error
[Block diagram: f passes through the system H and noise n is added to produce g; the estimate f̂ is passed through the same model H, and the result is subtracted from g to form the residual r.]
Slide 9: Residual Error
- If r is zero, then the estimate f̂ could have resulted in the g we measured, assuming the model H is correct.
- If so, f̂ is said to be consistent with g.
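The consistency check described here can be sketched in a few lines of numpy; the toy 2-pixel system and its values are illustrative assumptions.

```python
import numpy as np

# Toy system matrix and a noise-free measurement, for illustration only
H = np.array([[1.0, 0.5],
              [0.0, 1.0]])
f_true = np.array([2.0, 1.0])
g = H @ f_true                   # measured image (no noise here)

f_hat = np.array([2.0, 1.0])     # a candidate estimate of f
r = g - H @ f_hat                # residual: same dimensions as g

# r == 0 means f_hat could have produced g, i.e. it is consistent with g
consistent = bool(np.allclose(r, 0.0))
```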
Slide 10: Criterion for a Solution
- Instead of minimizing the mean squared error, let us find a solution that minimizes the total residual squared error (or just the residual error), ||r||^2 = ||g - Hf̂||^2.
- We can compute this for any f̂ if we have a model of H.
- Find the f̂ that minimizes this.
Slide 11: Solution
- Expand the residual error: ||g - Hf||^2 = (g - Hf)^T (g - Hf).
- Setting the derivative with respect to f to zero gives the normal equations H^T H f̂ = H^T g, so f̂ = (H^T H)^(-1) H^T g.
Slide 12: Least Squares Estimate
- This is called the least-squares (LS) estimate: f̂_LS = (H^T H)^(-1) H^T g.
- Note that it is very different from the MMSE estimate.
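A small numpy sketch of the LS estimate f̂ = (H^T H)^(-1) H^T g. The toy overdetermined system is an illustrative assumption; it is noise-free, so the estimate recovers f exactly.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy overdetermined problem: 8 measurements, 3 unknowns (illustrative sizes)
H = rng.standard_normal((8, 3))
f_true = np.array([1.0, -2.0, 0.5])
g = H @ f_true                           # noise-free measurement

# Normal-equations form of the LS estimate: (H^T H)^{-1} H^T g
f_ne = np.linalg.solve(H.T @ H, H.T @ g)

# Numerically preferable equivalent using a least-squares solver
f_hat, *_ = np.linalg.lstsq(H, g, rcond=None)
```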
Slide 13: Properties of the LS Estimate
- It depends only on the system matrix, so no assumptions are made about the statistics of the noise or the true image.
- Bias: it is unbiased, since E[f̂_LS] = (H^T H)^(-1) H^T E[g] = f when the noise is zero-mean.
Slide 14: Properties of the LS Estimate
- Autocovariance: for white noise with variance σ^2, Cov(f̂_LS) = σ^2 (H^T H)^(-1).
- If H^T H has small eigenvalues, its inverse has large eigenvalues and the noise blows up!
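The noise blow-up can be demonstrated directly: build an H with one tiny singular value and put a small noise component along the corresponding direction. The construction below is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Build H = U S V^T with one very small singular value (illustrative)
U, _ = np.linalg.qr(rng.standard_normal((5, 5)))
V, _ = np.linalg.qr(rng.standard_normal((5, 5)))
s = np.array([1.0, 0.8, 0.5, 0.3, 1e-4])     # last singular value is tiny
H = U @ np.diag(s) @ V.T

f_true = np.ones(5)
n = 0.01 * U[:, -1]                  # small noise along the weak direction
g = H @ f_true + n

f_ls = np.linalg.solve(H.T @ H, H.T @ g)     # LS estimate

# Error ~ ||n|| / s_min: the tiny eigenvalue of H^T H amplifies the noise
amplification = np.linalg.norm(f_ls - f_true) / np.linalg.norm(n)
```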
Slide 16: Recipe: Statistical Restoration
- Choose a criterion for determining which solution is better than another (example: MSE).
- Express the criterion mathematically (the objective function).
- Solve for the estimate that optimizes the criterion (example: minimizing the MSE).
Slide 17: Maximum Likelihood
- Define the likelihood as the probability of g given f: L(f) = p(g | f).
- This is the probability distribution of all images g given that the input image is f.
Slide 18: Maximum Likelihood
- We know that g = Hf + n, so E[g] = Hf and Cov(g) = R_n, assuming n is zero-mean and uncorrelated with f.
- Thus we know the first- and second-order statistics of g.
Slide 19: Multivariate Normal PDF
- For an N-dimensional vector x with mean μ and covariance R:
  p(x) = (2π)^(-N/2) |R|^(-1/2) exp( -(1/2) (x - μ)^T R^(-1) (x - μ) )
Slide 20: Maximum Likelihood
- Model the likelihood function with a multivariate Gaussian:
  p(g | f) = (2π)^(-N/2) |R_n|^(-1/2) exp( -(1/2) (g - Hf)^T R_n^(-1) (g - Hf) )
- This is the relative probability of any image g given the true image f.
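A sketch of evaluating this Gaussian log-likelihood in numpy; the function name and toy values are assumptions for illustration.

```python
import numpy as np

def gaussian_loglik(g, H, f, R_n):
    """ln p(g|f) = -1/2 (g-Hf)^T R_n^{-1} (g-Hf) - 1/2 ln|2 pi R_n|."""
    r = g - H @ f
    _, logdet = np.linalg.slogdet(2.0 * np.pi * R_n)
    return -0.5 * (r @ np.linalg.solve(R_n, r)) - 0.5 * logdet

H = np.eye(3)                       # trivial system, for illustration
R_n = 0.5 * np.eye(3)               # assumed noise covariance
g = np.array([1.0, 2.0, 3.0])

# The likelihood is largest when Hf reproduces g exactly (zero residual)
ll_match = gaussian_loglik(g, H, g, R_n)
ll_off = gaussian_loglik(g, H, g + 0.1, R_n)
```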
Slide 21: Maximum Likelihood
- Our criterion will be to find the f that maximizes the likelihood function.
- This is the maximum likelihood (ML) estimate.
- Properties of ML estimators:
  - Unbiased
  - Asymptotically efficient: smallest estimation error of all unbiased estimators in the limit.
Slide 22: Recipe: Statistical Restoration
- Choose a criterion for determining which solution is better than another (example: MSE).
- Express the criterion mathematically (the objective function).
- Solve for the estimate that optimizes the criterion (example: minimizing the MSE).
Slide 23: Maximum Likelihood
- Instead of maximizing the likelihood function, we will maximize its logarithm, the log-likelihood function:
  ln p(g | f) = -(1/2) (g - Hf)^T R_n^(-1) (g - Hf) + constant
- This reflects the relative probability of the measured image g given the estimate f.
Slide 24: Maximum Likelihood Estimate
- Take the derivative with respect to f and set it to zero to obtain
  f̂_ML = (H^T R_n^(-1) H)^(-1) H^T R_n^(-1) g
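The resulting estimator can be sketched in numpy; note how the weighting R_n^(-1) down-weights a noisy measurement. The toy system and numbers are illustrative assumptions.

```python
import numpy as np

def wls_estimate(g, H, R_n):
    """ML estimate for Gaussian noise: (H^T R_n^{-1} H)^{-1} H^T R_n^{-1} g."""
    W = np.linalg.inv(R_n)                      # noise weighting
    return np.linalg.solve(H.T @ W @ H, H.T @ W @ g)

# Two measurements of the first unknown, one of the second
H = np.array([[1.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0]])
R_n = np.diag([1.0, 100.0, 1.0])    # the second measurement is very noisy
g = np.array([2.0, 10.0, 3.0])      # ...and it disagrees with the first

f_hat = wls_estimate(g, H, R_n)     # f_hat[0] stays near 2, not (2+10)/2
```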
Slide 25: Maximum Likelihood Estimate
- Compare the ML estimate with the LS estimate.
- They are the same except for the noise-covariance weighting R_n^(-1).
- We often refer to the ML estimate with a Gaussian noise model as the weighted least-squares (WLS) estimate.
Slide 26: ML, WLS, and LS Estimates
- The WLS estimate is the same as the ML estimate for Gaussian-distributed noise.
- We have implicitly assumed that the noise is Gaussian-distributed because we only modeled up to second-order statistics in g.
- If n is stationary and uncorrelated, the WLS and LS solutions are the same.
- The weights become important if n is either nonstationary or correlated.
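These claims can be checked numerically. A small sketch, with a toy H and g and illustrative noise variances:

```python
import numpy as np

def wls(g, H, R_n):
    W = np.linalg.inv(R_n)
    return np.linalg.solve(H.T @ W @ H, H.T @ W @ g)

def ls(g, H):
    return np.linalg.solve(H.T @ H, H.T @ g)

H = np.array([[1.0, 0.5],
              [0.2, 1.0],
              [1.0, 1.0]])
g = np.array([1.0, 2.0, 3.0])

# Stationary, uncorrelated noise (R_n = sigma^2 I): WLS equals LS,
# because the constant weight cancels out of the normal equations.
same = bool(np.allclose(wls(g, H, 0.3 * np.eye(3)), ls(g, H)))

# Nonstationary noise (different variance per pixel): the weights matter.
different = bool(not np.allclose(wls(g, H, np.diag([0.1, 1.0, 10.0])),
                                 ls(g, H)))
```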
Slide 27: ML Estimation
- We can follow the same recipe for other noise models, but we will get a different estimator for each.
- Poisson example: for counting noise, p(g | f) = Π_i (Hf)_i^(g_i) e^(-(Hf)_i) / g_i!
- We may, however, find it hard to solve for some distributions.
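For example, the Poisson log-likelihood (dropping the constant ln g_i! term) can be evaluated directly. This is only a sketch of the objective function, with illustrative toy values, not of a solver.

```python
import numpy as np

def poisson_loglik(g, H, f):
    """ln p(g|f) = sum_i [ g_i ln (Hf)_i - (Hf)_i ], constants dropped."""
    mean = H @ f                    # expected counts; must be positive
    return float(np.sum(g * np.log(mean) - mean))

H = np.array([[1.0, 0.5],
              [0.5, 1.0]])
f_true = np.array([4.0, 2.0])
g = H @ f_true                      # use the noise-free mean as toy "counts"

# The objective peaks where Hf matches the data mean; in general there is
# no closed-form maximizer, which is why iterative algorithms are used.
ll_true = poisson_loglik(g, H, f_true)
ll_off = poisson_loglik(g, H, 1.5 * f_true)
```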