1
Total variation minimization: Numerical Analysis,
Error Estimation, and Extensions
Westfälische Wilhelms-Universität Münster
  • Martin Burger

Johannes Kepler University Linz, SFB
Numerical-Symbolic-Geometric Scientific Computing,
Radon Institute for Computational and
Applied Mathematics
2
Collaborations
  • Stan Osher, Jinjun Xu, Guy Gilboa (UCLA)
  • Lin He (Linz / UCLA)
  • Klaus Frick, Otmar Scherzer (Innsbruck)
  • Carola Schönlieb (Vienna)
  • Don Goldfarb, Wotao Yin (Columbia)

3
Introduction
  • Total variation methods are popular in imaging
    (and inverse problems), since they
  • keep sharp edges
  • eliminate oscillations (noise)
  • create nice new mathematics
  • Many related approaches have appeared in recent
    years, e.g. ℓ1 penalization / sparsity techniques

4
Introduction
  • Total variation and related methods have some
    shortcomings
  • difficult to analyze and to obtain error
    estimates
  • systematic errors (clean images not
    reconstructed perfectly)
  • computational challenges
  • some extensions to other imaging tasks are not
    well understood (e.g. inpainting)

5
ROF Model
  • Starting point of the analysis is the ROF
    model for denoising
  • Rudin-Osher-Fatemi 89/92, Acar-Vogel 93,
    Chambolle-Lions 96, Vogel 95/96,
    Scherzer-Dobson 96, Chavent-Kunisch 98,
    Meyer 01, ...
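In standard notation (the exact scaling used on the slide is an assumption), the ROF denoising model reads

$$ \min_{u \in BV(\Omega)} \; |u|_{BV} + \frac{\lambda}{2} \int_\Omega (u - f)^2 \, dx, $$

with f the noisy image, |u|_{BV} the total variation of u, and λ > 0 the fidelity weight.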

6
ROF Model
Reconstruction (code by Jinjun Xu)
[Images: clean / noisy / ROF]
7
Error Estimation
  • First question for error estimation: estimate the
    difference of u (minimizer of ROF) and f in terms
    of λ
  • An estimate in the L2 norm is standard, but does
    not yield information about edges
  • An estimate in the BV-norm is too ambitious: even an
    arbitrarily small difference in edge location can
    yield a BV-norm difference of order one!

8
Error Estimation
  • We need a better error measure, stronger than
    L2, weaker than BV
  • Possible choice: the Bregman distance (Bregman 67)
  • A real distance for a strictly convex,
    differentiable functional, but not symmetric
  • Symmetric version:
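For a differentiable, strictly convex functional J the standard definitions (Bregman 67) are

$$ D_J(u, v) = J(u) - J(v) - \langle \nabla J(v), u - v \rangle, $$
$$ D_J^{sym}(u, v) = D_J(u, v) + D_J(v, u) = \langle \nabla J(u) - \nabla J(v), u - v \rangle. $$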

9
Error Estimation
  • Total variation is neither strictly convex nor
    differentiable
  • Define a generalized Bregman distance for each
    subgradient
  • Symmetric version:
  • Kiwiel 97, Chen-Teboulle 97
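A standard way to write this (the notation is assumed): for a subgradient p ∈ ∂J(v),

$$ D_J^{p}(u, v) = J(u) - J(v) - \langle p, u - v \rangle, $$

and the symmetric version, for p_u ∈ ∂J(u) and p_v ∈ ∂J(v),

$$ D_J^{sym}(u, v) = \langle p_u - p_v, \, u - v \rangle. $$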

10
Error Estimation
  • Since TV seminorm is homogeneous of degree one,
    we have
  • Bregman distance becomes
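For one-homogeneous J every subgradient p ∈ ∂J(u) satisfies J(u) = ⟨p, u⟩, so a short computation gives

$$ D_J^{q}(u, v) = J(u) - \langle q, u \rangle \ge 0, \qquad q \in \partial J(v). $$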

11
Error Estimation
  • The Bregman distance for TV is not a strict
    distance; it can be zero for u ≠ v
  • In particular, dTV is zero for a contrast change
  • Resmerita-Scherzer 06
  • The Bregman distance is still nonnegative (TV is
    convex)
  • The Bregman distance can provide information about
    edges

12
Error Estimation
  • Let v be piecewise constant with white
    background and color values on regions
  • Then we obtain subgradients of the form p = div q,
    with q constructed from the signed distance function
    of the region boundaries

13
Error Estimation
  • The Bregman distances are then given explicitly
  • In the limit we obtain the analogous expression for
    piecewise continuous functions

14
Error Estimation
  • For an estimate in terms of λ we need a smoothness
    (source) condition on the data
  • Optimality condition for ROF:
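A plausible reconstruction of the two displayed formulas, consistent with mb-Osher 04: the smoothness (source) condition is the existence of a square-integrable subgradient,

$$ \exists\, q \in \partial |f|_{BV} \cap L^2(\Omega), $$

and the optimality condition for the ROF minimizer u is

$$ p + \lambda (u - f) = 0, \qquad p \in \partial |u|_{BV}. $$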

15
Error Estimation
  • Subtract q from the optimality condition
  • Estimate for the Bregman distance, mb-Osher 04:
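Pairing the subtracted optimality condition with u - f and applying Young's inequality gives (constants as in my reconstruction of mb-Osher 04)

$$ d_{TV}(u, f) \le \frac{\|q\|_{L^2}^2}{2\lambda}, \qquad \|u - f\|_{L^2} \le \frac{\|q\|_{L^2}}{\lambda}. $$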

16
Error Estimation
  • In practice we have to deal with noisy data f
    (a perturbation of some exact data g)
  • Estimate for the Bregman distance:
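With noise level ||f - g||_{L²} ≤ δ and a subgradient q ∈ ∂|g|_{BV} ∩ L², the same argument yields an estimate of the form

$$ d_{TV}(u, g) \le \frac{\lambda}{2}\,\delta^2 + \frac{\|q\|_{L^2}^2}{2\lambda} + \delta\, \|q\|_{L^2} $$

(my reconstruction; the constants on the slide may differ).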

17
Error Estimation
  • Optimal choice of the penalization parameter,
    i.e. of the order of the noise variance:
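Balancing the two λ-dependent terms in the estimate above gives, in the scaling assumed here,

$$ \lambda \sim \frac{\|q\|_{L^2}}{\delta}, \qquad d_{TV}(u, g) = O\big( \delta\, \|q\|_{L^2} \big). $$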

18
Error Estimation
  • Direct extension to deconvolution / linear
    inverse problems under a standard source condition,
    mb-Osher 04
  • Extension: stronger estimates under stronger
    conditions, Resmerita 05
  • Nonlinear inverse problems: Resmerita-Scherzer 06
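For a linear forward operator A the fidelity becomes ||Au - f||² and the standard source condition referred to above is

$$ \exists\, w: \quad q = A^* w \in \partial |g|_{BV}, $$

with the error estimates then obtained along the same lines (mb-Osher 04).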

19
Discretization
  • Natural choice: primal discretization with
    piecewise constant functions on a grid
  • Problem 1: numerical analysis (characterization
    of discrete subgradients)
  • Problem 2: the discrete problems are the same for
    any anisotropic version of the total variation
    (made explicit below)
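For smooth u in 2D, the isotropic (p = 2) and anisotropic (p = 1) total variations are

$$ TV_2(u) = \int_\Omega \sqrt{u_x^2 + u_y^2} \, dx\,dy, \qquad TV_1(u) = \int_\Omega \big( |u_x| + |u_y| \big) \, dx\,dy. $$

For piecewise constant functions on a rectangular grid the gradient is concentrated on axis-aligned pixel edges, where all these norms coincide; this is why the discrete problems in Problem 2 are identical.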

20
Discretization
  • In multiple dimensions, nonconvergence of the
    primal discretization for the isotropic TV (p = 2)
    can be shown
  • Convergence of the anisotropic TV (p = 1) on
    rectangular aligned grids
  • Fitzpatrick-Keeling 1997

21
Primal-Dual Discretization
  • Alternative: perform a primal-dual discretization
    of the optimality system (variational
    inequality) with a convex set (specified below)
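Presumably (the standard dual formulation of TV) the convex set is

$$ K = \{ g \in C_0^1(\Omega; \mathbb{R}^d) : |g(x)| \le 1 \text{ for all } x \}, \qquad |u|_{BV} = \sup_{g \in K} \int_\Omega u \, (\nabla \cdot g) \, dx, $$

and the optimality system is a variational inequality for the pair (u, p) with the dual variable p constrained to K.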

22
Primal-Dual Discretization
  • Discretization: discretize the convex set with
    appropriate elements (piecewise linear in 1D,
    Raviart-Thomas in multi-D)

23
Primal / Primal-Dual Discretization
  • In 1D the primal, primal-dual, and dual
    discretizations are equivalent
  • Error estimate for the Bregman distance by analogous
    techniques
  • Note that only the natural condition is needed to
    show the estimate

24
Primal / Primal-Dual Discretization
  • In multi-D: similar estimates, with additional work
    since the projection of a subgradient is not a
    discrete subgradient
  • Primal-dual discretization is equivalent to
    discretized dual minimization (Chambolle 03,
    Kunisch-Hintermüller 04). Can be used for
    existence of a discrete solution and stability of p
  • mb 06/07 ?

25
Cartesian Grids
  • For most imaging applications Cartesian grids
    are used. The primal-dual discretization can be
    reinterpreted as a finite difference scheme in
    this setup.
  • The value of the image intensity corresponds to the
    color in a pixel of width h around the grid point.
  • Raviart-Thomas elements are particularly easy on
    Cartesian grids: the first component is piecewise
    linear in x and piecewise constant in y, z, etc.
  • This leads to a simple finite difference scheme on a
    staggered grid, as in the sketch below
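As an illustration of such a scheme (a minimal NumPy sketch, not the presenter's code: the dual projection algorithm of Chambolle 03, cited on the previous slide, with forward/backward differences standing in for the staggered grid):

import numpy as np

def grad(u):
    # forward differences, homogeneous Neumann boundary
    gx = np.zeros_like(u)
    gy = np.zeros_like(u)
    gx[:-1, :] = u[1:, :] - u[:-1, :]
    gy[:, :-1] = u[:, 1:] - u[:, :-1]
    return gx, gy

def div(px, py):
    # discrete divergence, the negative adjoint of grad
    d = np.zeros_like(px)
    d[0, :] += px[0, :]
    d[1:-1, :] += px[1:-1, :] - px[:-2, :]
    d[-1, :] -= px[-2, :]
    d[:, 0] += py[:, 0]
    d[:, 1:-1] += py[:, 1:-1] - py[:, :-2]
    d[:, -1] -= py[:, -2]
    return d

def rof_chambolle(f, weight, n_iter=200, tau=0.125):
    # solves min_u TV(u) + 1/(2*weight) ||u - f||^2;
    # weight plays the role of 1/lambda in the scaling of these slides
    px = np.zeros_like(f)
    py = np.zeros_like(f)
    for _ in range(n_iter):
        gx, gy = grad(div(px, py) - f / weight)
        denom = 1.0 + tau * np.sqrt(gx**2 + gy**2)
        px = (px + tau * gx) / denom
        py = (py + tau * gy) / denom
    return f - weight * div(px, py)

The step size bound tau <= 1/8 is the one proved by Chambolle; larger steps often work in practice.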

26
Extension I: Iterative Refinement / ISS
  • ROF minimization has a systematic error: the total
    variation of the reconstruction is smaller than the
    total variation of the clean image. Image features are
    left in the residual f - u.
    [Images: g clean, f noisy, u ROF, f - u residual]

27
Extension I: Iterative Refinement / ISS
  • Idea: add the residual (noise) back to the
    image to restore the features that were decreased too
    much. Then do ROF again. Iterative procedure:
  • Osher-mb-Goldfarb-Xu-Yin 04
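Written out (my reconstruction following Osher-mb-Goldfarb-Xu-Yin 04), with v_0 = 0:

$$ u_{k+1} = \arg\min_u \; |u|_{BV} + \frac{\lambda}{2} \| u - f - v_k \|_{L^2}^2, \qquad v_{k+1} = v_k + f - u_{k+1}. $$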

28
Extension I: Iterative Refinement / ISS
  • Improves reconstructions significantly

29
Extension I: Iterative Refinement / ISS
30
Extension I: Iterative Refinement / ISS
  • Simple observation from the optimality condition:
  • Consequently, iterative refinement is equivalent to a
    Bregman iteration, as spelled out below
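Indeed, the optimality condition of each step gives p_{k+1} = p_k + λ(f - u_{k+1}), i.e. v_k = p_k / λ, so the iteration can be rewritten as

$$ u_{k+1} = \arg\min_u \; D^{p_k}(u, u_k) + \frac{\lambda}{2} \| u - f \|_{L^2}^2, \qquad p_{k+1} = p_k + \lambda ( f - u_{k+1} ). $$

A minimal sketch of the loop in code (reusing the rof_chambolle sketch after slide 25; recall weight = 1/λ):

import numpy as np

def bregman_iteration(f, lam, n_outer=5, **solver_kw):
    # iterative refinement: add the residual back and denoise again
    v = np.zeros_like(f)
    u = f
    for _ in range(n_outer):
        u = rof_chambolle(f + v, weight=1.0 / lam, **solver_kw)
        v += f - u  # accumulated residual, equals p_k / lambda
    return u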

31
Extension I: Iterative Refinement / ISS
  • Choice of the parameter λ is less important; it can be
    kept small (oversmoothing). The regularizing effect
    comes from appropriate stopping.
  • Quantitative stopping rules are available, or "stop
    when you are happy" (S.O.)
  • The limit λ → 0 can be studied. It yields a gradient
    flow for the dual variable (inverse scale
    space, see below): mb-Gilboa-Osher-Xu 06,
    mb-Frick-Osher-Scherzer 06
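In the limit the iteration becomes the inverse scale space flow (as in mb-Gilboa-Osher-Xu 06)

$$ \partial_t p(t) = f - u(t), \qquad p(t) \in \partial |u(t)|_{BV}, \qquad p(0) = 0, $$

in which coarse scales of f enter the reconstruction u(t) first and fine scales later.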

32
Extension I: Iterative Refinement / ISS
  • Non-quadratic fidelity is possible; some caution is
    needed for L1 fidelity
  • He-mb-Osher 05, mb-Frick-Osher-Scherzer 06
  • Error estimation in the Bregman distance:
    mb-Resmerita 06, in prep.
  • For further details see the talk of Klaus Frick

33
Extension I: Inverse Scale Space
  • Movie by M. Bachmayr, Master Thesis 06

34
Extension I: Iterative Refinement / ISS
  • Application to other regularization techniques,
    e.g. wavelet thresholding, is straightforward
  • Starting from soft shrinkage (rule below), iterated
    refinement yields firm shrinkage, and inverse scale
    space becomes hard shrinkage: Osher-Xu 06
  • The Bregman distance is a natural sparsity measure;
    the source condition just requires a sparse signal,
    and the number of nonzero components is the smoothness
    measure in the error estimates
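For an orthonormal wavelet basis, the ROF-type problem decouples into scalar problems for the coefficients, solved by soft shrinkage (a standard fact; the threshold 1/λ matches the scaling assumed here):

$$ S_{1/\lambda}(c) = \operatorname{sign}(c) \, \max\big( |c| - 1/\lambda, \, 0 \big). $$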

35
Extension I: Iterative Refinement / ISS
  • Total variation, inverse scale space, and
    shrinkage techniques can be combined nicely
  • See talk by Lin He

36
Extension II: Anisotropy
  • Total variation will prefer isotropic structures
    (circles, spheres) or special anisotropies
  • In many applications one wants sharp corners in
    different directions. Adaptive anisotropy is
    needed
  • Can be incorporated in ROF and ISS. See talk by
    Benjamin Berkels

37
Extension III: Inpainting
  • It is difficult to construct total variation
    techniques for inpainting
  • Original extensions of ROF failed to obtain
    natural connectivity (see the book by Chan-Shen 05)
  • Inpainting region D, image f (noisy) given on the
    complement of D
  • Try to minimize:
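A natural candidate functional of this type (assuming an L² fidelity; the slide may use a different norm) is

$$ \min_u \; |u|_{BV} + \frac{\lambda}{2} \int_{\Omega \setminus D} (u - f)^2 \, dx, $$

with D the inpainting region and f given on Ω \ D.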

38
Extension III: Inpainting
  • The optimality condition will have the form
    p + λ χ_{Ω\D} A(u - f) = 0,
  • with A being a linear operator defining the
    norm of the fidelity term
  • In particular p = 0 in D!

39
Extension III: Inpainting
  • A different, iterated approach (motivated by
    Cahn-Hilliard inpainting, Bertozzi et al 05)
  • Minimize in each step:
  • First term for damping, second for fidelity (fit
    to f where given, and to the old iterate in the
    inpainting region), third term for smoothing

40
Extension III: Inpainting
  • Continuous flow as the damping parameter tends to zero
  • Fourth-order flow for the H^{-1} norm
  • A stationary solution (existence?) satisfies the
    stationarity condition, as sketched below
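A hedged reconstruction of the flow, following the TV-H^{-1} inpainting line of work (mb, Schönlieb, et al.):

$$ \partial_t u = \Delta p + \lambda\, \chi_{\Omega \setminus D} ( f - u ), \qquad p \in \partial |u|_{BV}; $$

the equation is of fourth order since p already contains second derivatives of u.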

41
Extension III: Inpainting
  • Result: penguins [image]

42
Extension IV: Manifolds
  • Original motivation: Osher-Marquina 01 used a
    preconditioned gradient flow for ROF (sketched below)
  • The stationary state was assumed to be the ROF
    minimizer
  • Computational observation: not always true!
  • Trivial observation: for initial value u(0) = 0
    the flow remains zero for all time!
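The preconditioned flow in question (as I recall it from Marquina-Osher; the scaling is assumed) is

$$ \partial_t u = |\nabla u| \left( \nabla \cdot \frac{\nabla u}{|\nabla u|} + \lambda ( f - u ) \right), $$

which explains the trivial observation: for u ≡ 0 the prefactor |∇u| vanishes, so the zero initial value is stationary.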

43
Extension IV: Manifolds
  • Embarrassing observation: the flow is always created by
    transport from the initial value
  • Important observation: the stationary state
    minimizes ROF on the manifold

44
Extension IV: Manifolds
  • Surprising observation: for f being the
    indicator function of a convex set, the flow is
    equivalent to the gradient flow of the L1 version
    of ROF
  • No loss of contrast!
  • A more detailed analysis for general images is needed
  • Possible extension to ROF minimization on other
    manifolds by metric gradient flows

45
Download and Contact
  • Papers and talks:
  • www.indmath.uni-linz.ac.at/people/burger
  • from October: wwwmath1.uni-muenster.de/num
  • e-mail: martin.burger@jku.at