Title: Nonlinear Approximation Based Image Recovery Using Adaptive Sparse Reconstructions
1 Nonlinear Approximation Based Image Recovery Using Adaptive Sparse Reconstructions
- Onur G. Guleryuz
- oguleryuz_at_erd.epson.com
-
- Epson Palo Alto Laboratory
- Palo Alto, CA
(Full-screen mode recommended. Please see the movies.zip file for some movies, or email me. Audio of the presentation will be uploaded soon.)
2 Overview
- Difficulties with nonstationary statistics.
- Five examples and movies to discuss transform properties.
- Many more (20) simulation examples, movies, etc. Please stay after questions.
- Working paper: http://eeweb.poly.edu/onur/online_pub.html (Google: onur guleryuz)
3 Problem Statement
Use surrounding spatial information to recover the lost block via adaptive sparse reconstructions.
[Figure: an image with a lost block]
Applications: error concealment, damaged images, ...
Generalizations: irregularly shaped blocks, partial information, ...
4 Notation: Transforms
transform coefficient (scalar)
Assume orthonormal transforms
5 Notation: Approximation
Linear approximation: a priori ordering.
Nonlinear approximation: signal-dependent ordering.
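The two orderings can be contrasted with a toy 1-D DCT sketch (my own illustrative code; the helper name `keep_K` is made up, and the talk itself works with 2-D image transforms):

```python
import numpy as np
from scipy.fft import dct, idct

def keep_K(x, K, nonlinear=True):
    """K-term approximation in an orthonormal DCT.

    Linear approximation keeps the first K coefficients (a priori
    ordering); nonlinear approximation keeps the K largest in
    magnitude (signal-dependent ordering)."""
    c = dct(x, norm='ortho')
    if nonlinear:
        keep = np.argsort(-np.abs(c))[:K]   # largest-magnitude indices
    else:
        keep = np.arange(K)                 # fixed, a priori indices
    ck = np.zeros_like(c)
    ck[keep] = c[keep]
    return idct(ck, norm='ortho')
```

For a signal whose energy sits in a single high-frequency coefficient, the nonlinear ordering recovers it exactly with K = 1, while the linear ordering misses it entirely.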
6 Notation: Sparse Classes
Sparse classes for linear approximation; sparse classes for nonlinear approximation.
[Figure: linear vs. nonlinear approximation classes]
7 Main Idea
1. Original image
2. Lost block
3. Predicted
- Given T > 0 and the observed signal
8 Sparse Classes
[Figure: pixel coordinates for a two-pixel image vs. transform coordinates. The linear-approximation class is a convex set; the nonlinear-approximation class, class(K,T), is a non-convex, star-shaped set (the band of insignificant coefficients, |c| < T, has width 2T).]
Rolf Schneider, Convex Bodies: The Brunn-Minkowski Theory, Cambridge University Press, March 2003.
Onur G. Guleryuz, E. Lutwak, D. Yang, and G. Zhang, "Information-Theoretic Inequalities for Contoured Probability Distributions," IEEE Transactions on Information Theory, vol. 48, no. 8, pp. 2377-2383, August 2002.
9 Examples
1. Interested in edges, textures, ..., and combinations (not handled well in the literature).
[Figure: recovery examples at 9.37 dB, 8.02 dB, 11.10 dB, and 3.65 dB]
2. MSE improving.
3. Image prediction has a tough audience!
10 Difficulties with Nonstationary Data
- Estimation is a well-studied topic: one needs to infer statistics, then build estimators.
- With nonstationary data, inferring statistics is very difficult.
- Higher-order methods, better edge detection?
11 Important Properties
- This technique does not know anything about
images.
No non-robust edge detection, segmentation,
training, learning, etc., required.
- Applicable to general nonstationary signals. Use it on speech, audio, seismic data, ...
- Just pick a transform that provides sparse decompositions under nonlinear approximation; the rest is automated. (DCTs, wavelets, complex wavelets, etc.)
12 Main Algorithm
- G: orthonormal linear transformation; the coefficients are the linear transform of y.
- Start with an initial value.
- Threshold coefficients to determine V(x,T): the sparsity constraint.
- Solve (via equations or iterations).
- Reduce the threshold (the found solution becomes the new initial value).
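A minimal 1-D sketch of these steps (my own illustrative code, not the paper's implementation; the talk uses 2-D block DCTs over images, and the function and parameter names here are made up):

```python
import numpy as np
from scipy.fft import dct, idct

def recover(y, known, T0=0.8, T_min=0.05, decay=0.7, iters=30):
    """Adaptive sparse recovery by iterated hard-thresholding.

    y     : observed signal (missing samples may hold anything)
    known : boolean mask of available samples
    Each pass: transform, zero the insignificant coefficients
    (|c| < T, i.e. the set V(x, T)), invert, re-impose the known
    samples; then reduce T so the sparsity class grows."""
    x = np.where(known, y, 0.0)          # initial value
    T = T0
    while T >= T_min:
        for _ in range(iters):
            c = dct(x, norm='ortho')     # orthonormal transform
            c[np.abs(c) < T] = 0.0       # sparsity constraint V(x, T)
            x = idct(c, norm='ortho')
            x[known] = y[known]          # available-sample constraint
        T *= decay                       # larger class, new initial value
    return x
```

The threshold schedule (T0, decay, T_min) is an assumption of this sketch; the talk only specifies that each found solution seeds the search at the next, lower threshold.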
13 Progression of Solutions
[Figure: pixel coordinates for a two-pixel image. The available-pixel constraint fixes one coordinate; the nonlinear-approximation class class(K,T) is a non-convex, star-shaped set containing the missing-pixel solutions. The search proceeds over successively larger classes: as T decreases, the class size increases.]
14 Estimation Theory
Sparsity Constraint <-> Linear Estimation
Proposition 1: The solution subject to the sparsity constraint results in a linear estimate.
Proposition 2: Conversely, suppose we start with a linear estimate restricted to a lower-dimensional subspace; this corresponds to sparsity constraints.
15 Required Statistics?
None. The statistics required in the estimation
are implicitly determined by the utilized
transform and V(x).
(V(x) is the index set of insignificant
coefficients)
I will fix G and adaptively determine V(x) (by hard-thresholding transform coefficients).
16 A Priori vs. Adaptive
Method 1: optimality?
- Can at best be ensemble-optimal for second-order statistics.
- Does not capture nonstationary signals with edges.
Method 2:
- Can at best be THE optimal!
J. P. D'Ales and A. Cohen, "Non-linear Approximation of Random Functions," SIAM J. of Applied Math, 57-2, 518-540, 1997.
Albert Cohen, Ingrid Daubechies, Onur G. Guleryuz, and Michael T. Orchard, "On the importance of combining wavelet-based nonlinear approximation with coding strategies," IEEE Transactions on Information Theory, July 2002.
17 Conclusion
- Simple, robust technique.
- Very good and promising performance.
- Estimation of statistics not required (one has to pick G, though).
- Applicable to other domains.
- Q: Over which classes of signals is it optimal? A: The nonlinear approximation classes of the transform.
- Signal-dependent basis expands the classes over which it is optimal.
- Helps design better signal representations (intuitive).
18 Periodic Example
PSNR: 11.10 dB (DCT 9x9)
Lower thresholds, larger classes.
19 Properties of Desired Transforms
Want lots of small coefficients, wherever they may be.
- Periodic, approximately periodic regions: the transform should see the period.
Example: a minimum period of 8 requires at least an 8x8 DCT (3-level wavelet packets).
[Figure: a periodic signal s(n) and its spectrum S(w), which is mostly zeroes within [-M, M]]
20 Periodic Example
(period 8)
Perfect reconstruction (DCT 8x8).
(Easy base signal, fast-decaying envelope.)
21 Periodic Example
5.91 dB (DCT 24x24)
(Harder base signal.)
22 Edge Example
25.51 dB (DCT 8x8)
(Separable; small DCT coefficients except for the first row.)
23 Edge Example
9.18 dB (DCT 24x24)
(Similar to a vertical edge, but tilted.)
24 Properties of Desired Transforms
- Periodic, approximately periodic regions: frequency selectivity.
- Edges: the transform should have the frequency selectivity to see the slope of the edge.
25 Overcomplete Transforms
[Figure: overlapping DCT blocks. A DCT block over an edge is not very sparse; a DCT block over a smooth region is sparse.]
Only the insignificant coefficients contribute.
This can be generalized to denoising:
Onur G. Guleryuz, "Weighted Overcomplete Denoising," Proc. Asilomar Conference on Signals and Systems, Pacific Grove, CA, Nov. 2003.
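A rough 1-D sketch of the overcomplete idea (an unweighted "cycle spinning" variant of my own; the cited paper derives proper per-estimate weights, which this sketch omits):

```python
import numpy as np
from scipy.fft import dct, idct

def overcomplete_denoise(y, T, block=8):
    """Hard-threshold block DCTs at every cyclic shift and average.

    Blocks straddling an edge are not very sparse, while blocks over
    smooth regions are; averaging the shifted estimates lets the sparse
    blocks dominate.  Requires len(y) divisible by block."""
    n = len(y)
    acc = np.zeros(n)
    for s in range(block):                    # every shift of the block grid
        ys = np.roll(y, -s)
        est = np.empty(n)
        for b in range(0, n, block):
            c = dct(ys[b:b + block], norm='ortho')
            c[np.abs(c) < T] = 0.0            # keep significant coeffs only
            est[b:b + block] = idct(c, norm='ortho')
        acc += np.roll(est, s)
    return acc / block
```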
26 Properties of Desired Transforms
Nonlinear approximation does not work for non-localized Fourier transforms!
- Frequency selectivity for periodic and edge regions.
(Overcomplete DCTs have more mileage since, for a given frequency selectivity, they have the smallest spatial support.)
J. P. D'Ales and A. Cohen, "Non-linear Approximation of Random Functions," SIAM J. of Applied Math, 57-2, 518-540, 1997.
27 Periodic Example
3.65 dB
DCT 16x16
28 Periodic Example
7.2 dB
DCT 16x16
29 Periodic Example
10.97 dB
DCT 24x24
30 Edge Example
12.22 dB
DCT 16x16
31 Edge Example
4.04 dB
DCT 24x24
32 Combination Example
9.26 dB
DCT 24x24
33 Combination Example
8.01 dB
DCT 16x16
34 Combination Example
6.73 dB
DCT 24x24
(not enough to see the period)
35 Unsuccessful Recovery Example
-1.00 dB
DCT 16x16
36 Partially Successful Recovery Example
4.11 dB
DCT 16x16
37 Combination Example
3.77 dB
DCT 24x24
38 Periodic Example
3.22 dB
DCT 32x32
39 Edge Example
14.14 dB
DCT 16x16
40 Edge Example
0.77 dB
DCT 24x24
41 Robustness
remains the same but changes.
42 Determination
- Start by layering the lost block; estimate one layer at a time (the lost block is potentially large).
- Recover layer P by using information from layers 0, ..., P-1.
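The layering can be sketched as repeated peeling of the missing region's border (a hypothetical 4-connected version of my own; the talk does not specify the neighborhood used):

```python
import numpy as np

def label_layers(missing):
    """Label each missing pixel with its layer number: layer 1 borders
    the available data, layer 2 borders layer 1, and so on (so layer P
    can be recovered from layers 0, ..., P-1)."""
    missing = missing.copy()
    labels = np.zeros(missing.shape, dtype=int)
    known = ~missing
    layer = 0
    while missing.any():
        layer += 1
        nb = np.zeros_like(known)          # pixels with a known 4-neighbor
        nb[1:, :] |= known[:-1, :]
        nb[:-1, :] |= known[1:, :]
        nb[:, 1:] |= known[:, :-1]
        nb[:, :-1] |= known[:, 1:]
        ring = missing & nb
        labels[ring] = layer
        known |= ring                      # peeled ring becomes available
        missing &= ~ring
    return labels
```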
43 Determination II
- Fix T. Look at DCTs that have limited spatial overlap with the missing data.
- Establish sparsity constraints by thresholding these DCT coefficients with T. (If |c(i)| < T, add it to the sparsity constraints.)
[Figure: image showing the lost block and the outer border of layer 1]
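The thresholding step for one block can be sketched as follows (illustrative; `scipy.fft.dctn` and the helper name stand in for whatever 2-D DCT the implementation uses):

```python
import numpy as np
from scipy.fft import dctn

def insignificant_set(patch, T):
    """Return the index set V for one DCT block: positions of the
    coefficients whose magnitude falls below T.  These indices become
    sparsity constraints (coefficients forced toward zero) when the
    next layer is recovered."""
    c = dctn(patch, norm='ortho')
    return np.argwhere(np.abs(c) < T)
```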