Multiscale Geometric Analysis - PowerPoint PPT Presentation
(Author: Richard Baraniuk; last modified by Dror Baron; created 1/21/2000; 21 slides)

Transcript and Presenter's Notes

1
A Single-letter Characterization of Optimal Noisy Compressed Sensing
Dongning Guo, Dror Baron, Shlomo Shamai
2
Setting
  • Replace samples by more general measurements
    based on a few linear projections (inner products)

[Figure: sparse signal, measurements, non-zeros]
3
Signal Model
  • Signal entry Xn = Bn Un
  • iid Bn ~ Bernoulli(γ) ⇒ sparse
  • iid Un ~ PU

[Diagram: PX formed as Bernoulli(γ) multiplied by PU]
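The signal model above can be sketched in a few lines of NumPy (the symbol γ for the Bernoulli parameter and the Gaussian choice for PU are illustrative assumptions; the slides do not fix PU):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000          # signal length
gamma = 0.1       # Bernoulli sparsity rate (symbol assumed)

# Signal entry X_n = B_n * U_n: a Bernoulli(gamma) mask times an iid amplitude
B = rng.random(N) < gamma          # iid B_n ~ Bernoulli(gamma)
U = rng.standard_normal(N)         # iid U_n ~ P_U (Gaussian assumed here)
X = B * U                          # sparse: about gamma*N nonzero entries
```

On average only about γN of the N entries are nonzero, which is what makes the signal compressible.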
4
Measurement Noise
  • Measurement process is typically analog
  • Analog systems add noise, non-linearities, etc.
  • Assume Gaussian noise for ease of analysis
  • Can be generalized to non-Gaussian noise

5
Noise Model
  • Noiseless measurements denoted y0 = Φx
  • Noise z is additive white Gaussian
  • Noisy measurements y = y0 + z = Φx + z
  • Unit-norm columns of Φ ⇒ fixed per-measurement SNR

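Putting the measurement and noise models together, a minimal sketch (the dense Gaussian Φ, the sparsity rate 0.1, and the noise level σ are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
M, N = 250, 500

# Measurement matrix with unit-norm columns (Gaussian entries assumed)
Phi = rng.standard_normal((M, N))
Phi /= np.linalg.norm(Phi, axis=0)          # unit-norm columns => fixed SNR

# Sparse signal as on the previous slide
x = (rng.random(N) < 0.1) * rng.standard_normal(N)

y0 = Phi @ x                                # noiseless measurements y0
sigma = 0.1                                 # noise std (assumed; sets the SNR)
z = sigma * rng.standard_normal(M)          # additive white Gaussian noise
y = y0 + z                                  # noisy measurements y = Phi x + z
```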
6
Allerton 2006 [Sarvotham, Baron, Baraniuk]
  • Model the measurement process as a communication channel
  • Measurements provide information!

7
Single-Letter Bounds
  • Theorem [Sarvotham, Baron, Baraniuk 2006]: For a sparse signal with rate-distortion function R(D), a lower bound on the measurement rate at a given SNR and distortion D
  • Numerous single-letter bounds:
  • Aeron, Zhao, Saligrama
  • Akcakaya and Tarokh
  • Rangan, Fletcher, Goyal
  • Gastpar and Reeves
  • Wang, Wainwright, Ramchandran
  • Tune, Bhaskaran, Hanly

8
Goal: Precise Single-letter Characterization of Optimal CS
9
What Single-letter Characterization?

[Diagram: measurement channel and posterior]

  • Ultimately, what can one say about Xn given (Y, Φ)?
  • The posterior (a sufficient statistic) is very complicated
  • Want a simple characterization of its quality
  • Work in the large-system limit

10
Main Result: Single-letter Characterization
  • Result 1: Conditioned on Xn = xn, the observations (Y, Φ) are statistically equivalent to a degraded scalar observation of xn
  • The degradation is easy to compute
  • Estimation quality from (Y, Φ) is just as good as from the noisier scalar observation

[Diagram: measurement channel, posterior, degradation]
11
Details
  • The degradation η ∈ (0,1) is the fixed point of a scalar equation
  • Take-home point: degraded scalar channel
  • Non-rigorous, owing to the replica method with its symmetry assumption
  • used in CDMA detection [Tanaka 2002; Guo and Verdú 2005]
  • Related analysis [Rangan, Fletcher, Goyal 2009]:
  • MMSE estimate (not posterior) using Guo and Verdú 2005
  • extended to several CS algorithms, particularly LASSO
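The fixed-point equation itself was a displayed formula that did not survive transcription. As a hedged sketch only, the replica analysis of Guo and Verdú (2005) yields a decoupled scalar channel and a fixed-point condition of the following common form, where β = N/M and mmse(·) (the scalar MMSE of X from a Gaussian observation) are notation assumed here, not taken from the slides:

```latex
% Each input X_n is effectively observed through a degraded scalar Gaussian channel
\[
  Z_n \;=\; X_n \;+\; \frac{1}{\sqrt{\eta\,\mathrm{snr}}}\, W_n,
  \qquad W_n \sim \mathcal{N}(0,1),
\]
% where the degradation eta in (0,1) solves a scalar fixed-point equation of the form
\[
  \frac{1}{\eta} \;=\; 1 \;+\; \beta\,\mathrm{snr}\cdot
  \mathrm{mmse}\!\left(\eta\,\mathrm{snr}\right).
\]
```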

12
Decoupling
13
Decoupling Result
  • Result 2: In the large-system limit, any arbitrary (constant-size) set of L input elements decouples
  • Take-home point: the interference from each individual signal entry vanishes

14
Sparse Measurement Matrices
15
Sparse Measurement Matrices [Baron, Sarvotham, Baraniuk]
  • LDPC-like measurement matrix (sparse)
  • Mostly zeros in Φ; nonzero entries drawn from PΦ
  • Each row contains ≈ Nq randomly placed nonzeros
  • Fast matrix-vector multiplication ⇒ fast encoding / decoding

[Figure: sparse measurement matrix]
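A minimal sketch of such a sparse measurement matrix (SciPy CSR format assumed; ±1 nonzeros are one illustrative choice of PΦ, and the scaling targets approximately unit-norm columns in expectation):

```python
import numpy as np
from scipy import sparse

def sparse_measurement_matrix(M, N, q, rng):
    """M x N LDPC-like matrix: each row has ~N*q randomly placed nonzeros."""
    k = max(1, int(round(N * q)))            # ~Nq nonzeros per row
    rows, cols, vals = [], [], []
    for m in range(M):
        idx = rng.choice(N, size=k, replace=False)        # random positions in row m
        rows += [m] * k
        cols += idx.tolist()
        vals += rng.choice([-1.0, 1.0], size=k).tolist()  # +/-1 nonzeros (assumed P_Phi)
    Phi = sparse.csr_matrix((vals, (rows, cols)), shape=(M, N))
    return Phi / np.sqrt(M * q)              # ~unit-norm columns in expectation

rng = np.random.default_rng(3)
Phi = sparse_measurement_matrix(M=250, N=500, q=0.02, rng=rng)
y = Phi @ np.ones(500)                       # sparse matvec: fast encoding
```

Each matrix-vector product costs O(MNq) rather than O(MN), which is what makes encoding (and iterative decoding) fast.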
16
CS Decoding Using BP [Baron, Sarvotham, Baraniuk]
  • Measurement matrix represented by a bipartite graph
  • Estimate the input iteratively
  • Implemented via nonparametric BP [Bickson, Sommer,

[Figure: bipartite graph between signal x and measurements y]
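The slides' decoder is nonparametric BP, which is too involved to reproduce here. As a stand-in that shows the same iterative-estimation flavor, here is a minimal iterative soft-thresholding (IST) decoder — a different, simpler algorithm than the slides' BP, with all parameters illustrative:

```python
import numpy as np

def ist_decode(Phi, y, lam=0.01, iters=500):
    """Iterative soft-thresholding (NOT the slides' nonparametric BP):
    x <- soft_threshold(x + c * Phi^T (y - Phi x), lam)."""
    M, N = Phi.shape
    c = 1.0 / np.linalg.norm(Phi, 2) ** 2    # step size for a convergent gradient step
    x = np.zeros(N)
    for _ in range(iters):
        r = y - Phi @ x                      # residual against the measurements
        x = x + c * (Phi.T @ r)              # gradient step toward the data
        x = np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)   # promote sparsity
    return x

# Tiny demo: estimate a 5-sparse signal from 50 noiseless measurements
rng = np.random.default_rng(2)
M, N = 50, 100
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
x_true = np.zeros(N)
x_true[rng.choice(N, 5, replace=False)] = rng.standard_normal(5)
x_hat = ist_decode(Phi, Phi @ x_true)
```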
17
Identical Single-letter Characterization with BP
  • Result 3: Conditioned on Xn = xn, the observations (Y, Φ) are statistically equivalent to the same degraded scalar observation
  • Sparse matrices are just as good
  • BP is asymptotically optimal!

[Figure: identical degradation]
18
Decoupling Between Two Input Entries (N = 500, M = 250, γ = 0.1, SNR = 10)

[Figure: joint density of two input estimates]
19
CS-BP vs Other CS Methods (N = 1000, γ = 0.1, q = 0.02)

[Figure: MMSE and CS-BP error as M increases]
20
Conclusion
  • Single-letter characterization of CS
  • Decoupling
  • Sparse matrices just as good
  • Asymptotically optimal CS-BP algorithm