Transcript and Presenter's Notes

Title: Error-Correcting Codes: Progress & Challenges


1
Error-Correcting Codes: Progress & Challenges
Madhu Sudan MIT CSAIL
2
Communication in presence of noise
[Diagram: Sender → Noisy Channel → Receiver; "We are now ready" is sent, "We are not ready" is received]
If information is digital, reliability is critical
3
Shannon's Model: Probabilistic Noise
[Diagram: Sender → Encode (expand) → Noisy Channel → Decode (compress?) → Receiver]
Probabilistic noise: e.g., every letter is flipped to a random other letter of the alphabet with probability p.
Focus: design good Encode/Decode algorithms (toy example below).
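As a toy instance of the encode/decode design problem (my illustration, not from the talk): a 3-fold repetition code on a binary symmetric channel, decoded by majority vote.

import random

def encode(bits, r=3):
    # Repetition encoding: send each bit r times (expansion).
    return [b for b in bits for _ in range(r)]

def channel(bits, p=0.1):
    # Binary symmetric channel: flip each bit independently w.p. p.
    return [b ^ (random.random() < p) for b in bits]

def decode(received, r=3):
    # Majority vote over each block of r copies (compression).
    return [int(sum(received[i:i+r]) > r // 2)
            for i in range(0, len(received), r)]

msg = [1, 0, 1, 1]
print(decode(channel(encode(msg))))  # equals msg unless >= 2 of 3 copies flip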
4
Hamming Model: Worst-case errors
  • Errors: up to a bounded number of worst-case errors.
  • Focus: the code itself.
  • (Note: not the encoding/decoding algorithms.)
  • Goal: design the code so as to correct any such pattern of errors (see the bound below).

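The standard relation between distance and worst-case error correction (background; the slide's formula did not survive the transcript): a code of minimum distance $d$ corrects any pattern of $e$ errors if and only if
\[ e \le \left\lfloor \frac{d - 1}{2} \right\rfloor. \]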
5
Problems in Coding Theory, Broadly
  • Combinatorics: design the best possible
    error-correcting codes.
  • Probability/Algorithms: design algorithms
    correcting random/worst-case errors.

6
Part I (of III): Combinatorial Results
7
Hamming Notions
  • Hamming Distance: the number of coordinates on which two words differ.
  • Distance of Code: the minimum Hamming distance between two distinct codewords.
  • Main question: the best trade-off between the number of codewords and the distance.
  • Asymptotically: rate vs. relative distance (in symbols below).

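In symbols (standard definitions, supplied since the slide's formulas did not survive the transcript):
\[ \Delta(x, y) = \bigl|\{\, i : x_i \ne y_i \,\}\bigr|, \qquad d(C) = \min_{c_1 \ne c_2 \in C} \Delta(c_1, c_2), \]
\[ R = \frac{\log_q |C|}{n}, \qquad \delta = \frac{d(C)}{n}, \qquad \text{and the main question is the best trade-off } R(\delta) \text{ as } n \to \infty. \]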
8
Simple results
  • Ball: B(x, r), the set of words within Hamming distance r of x.
  • Volume of Ball: Vol(n, r).
  • Entropy function: H(p).
  • Hamming (Packing) Bound (formulas below):
  • (No code can have too many codewords.)

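The standard formulas behind these bullets (background, supplied since the slide's math did not survive):
\[ \mathrm{Vol}_q(n, r) = \sum_{i=0}^{r} \binom{n}{i} (q-1)^i, \qquad H(p) = p \log_2 \frac{1}{p} + (1-p) \log_2 \frac{1}{1-p}, \]
\[ \text{Hamming bound: } |C| \cdot \mathrm{Vol}_2\Bigl(n, \Bigl\lfloor \frac{d-1}{2} \Bigr\rfloor\Bigr) \le 2^n, \quad \text{so asymptotically } R \le 1 - H(\delta/2). \]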
9
Simple results (contd.)
  • Gilbert-Varshamov (Greedy) Bound: greedily keep words at pairwise distance ≥ d; the process cannot stop early, so good codes exist (sketch below).

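A minimal sketch of the greedy argument in code (my illustration; brute force, so only for tiny parameters):

from itertools import product

def hamming_distance(x, y):
    return sum(a != b for a, b in zip(x, y))

def greedy_code(n, d):
    # Gilbert's greedy argument: scan all of {0,1}^n and keep any word at
    # distance >= d from everything kept so far. The process cannot stop
    # before |C| >= 2^n / Vol(n, d-1) words have been kept.
    code = []
    for w in product((0, 1), repeat=n):
        if all(hamming_distance(w, c) >= d for c in code):
            code.append(w)
    return code

print(len(greedy_code(7, 3)))  # 16 codewords, matching the [7,4,3] Hamming code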
10
Simple results (Summary)
  • For the best code: the rate sits somewhere between the known lower and upper bounds (below).
  • After fifty years of research: we still don't know.

Which is right?
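For reference, the two bounds being weighed, in standard notation (the upper bound is the first linear-programming bound of McEliece-Rodemich-Rumsey-Welch; background, not recovered slide text):
\[ 1 - H(\delta) \;\le\; R(\delta) \;\le\; H\!\left(\frac{1}{2} - \sqrt{\delta(1-\delta)}\right). \]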
11
Binary case
  • Case of large distance.
  • Case of small (relative) distance.
  • Case of constant distance: BCH codes.

12
Binary case (Closer look)
  • For general distances:
  • Can we do better? Twice as many codewords?
  • (Won't change the asymptotics of the rate.)
  • Recent progress: Jiang-Vardy.

13
Proof idea of Jiang-Vardy
View a distance-d code as an independent set in the graph on {0,1}^n that joins words at distance < d; this graph is locally sparse, which yields independent sets a factor Ω(n) larger than the greedy argument guarantees.

14
Major questions in binary codes
  • Give an explicit construction meeting the GV bound.
  • Is the Hamming bound tight when the relative distance is small?
  • Is the LP bound tight?

15
Combinatorics (contd.): q-ary case
  • Fix the alphabet size q and let the block length grow.
  • Surprising result ('80s): algebraic-geometry codes beat the GV bound for q ≥ 49.
  • (Also a negative surprise: BCH codes only yield much weaker parameters.)

[Figure: rate vs. relative distance in the q-ary case, comparing the Plotkin bound and the GV bound — not the Hamming bound]
16
Major questions: q-ary case
17
Part II (of III): Correcting Random Errors
18
Recall Shannon
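The theorem being recalled, in its standard form (background; the slide body did not survive the transcript): for the binary symmetric channel with flip probability $p < 1/2$, reliable communication is possible at every rate $R < 1 - H(p)$ and at no rate above; i.e.,
\[ \text{capacity} = 1 - H(p). \]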
19
Constructive versions
20
What is satisfaction?
  • Articulated by Luby, Mitzenmacher, Shokrollahi & Spielman '96.

21
Current state of the art
  • Luby et al.: propose the study of codes based on
    irregular graphs (irregular LDPC codes).

22
LDPC Codes
23
LDPC Codes
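These two slides are figures (Tanner graphs) in the original deck. For intuition, here is a minimal sketch of a bit-flipping decoder on a toy parity-check matrix (the matrix, and the choice of Gallager's bit-flipping rule, are my illustration, not from the talk):

import numpy as np

# Toy parity-check matrix H (rows = checks, columns = bits); illustrative only.
H = np.array([
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 0, 0, 1, 1],
    [0, 0, 1, 1, 0, 1],
])

def bit_flip_decode(H, y, max_iters=20):
    # Bit-flipping: repeatedly flip the bit involved in the most
    # unsatisfied parity checks until all checks are satisfied.
    y = y.copy()
    for _ in range(max_iters):
        syndrome = H @ y % 2            # 1 marks an unsatisfied check
        if not syndrome.any():
            return y                    # valid codeword reached
        counts = H.T @ syndrome         # per bit: #unsatisfied checks touching it
        y[np.argmax(counts)] ^= 1       # flip the worst offender
    return y                            # may not have converged

codeword = np.zeros(6, dtype=int)       # the all-zeros word is always a codeword
received = codeword.copy(); received[2] ^= 1  # one bit flipped by the channel
print(bit_flip_decode(H, received))     # recovers the all-zeros codeword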
24
Current state of the art
  • Luby et al.: propose the study of codes based on
    irregular graphs (irregular LDPC codes).
  • No theorems so far for erroneous channels.
  • Strong analysis for the (much) simpler case of
    erasure channels (symbols are erased): decoding
    time O(n log(1/ε)).
  • (Easy to get composition-based algorithms with
    decoding time poly(1/ε) · n.)
  • Do have some proposals for errors as well (with
    analysis by Luby et al. and Richardson & Urbanke),
    but none known to converge to the Shannon limit.

25
Still open
  • Articulated by Luby, Mitzenmacher, Shokrollahi & Spielman '96.

26
Part III (of III): Correcting Adversarial Errors
27
Motivation
  • As notions of communication/storage get more
    complex, modeling error as oblivious (to the
    message/encoding/decoding) may be too simplistic.
  • Need more general models of error, and
    encoding/decoding schemes for such models.
  • Most pessimistic model: errors are worst-case.

28
Gap between worst-case & random errors
  • In the Shannon model, with the binary channel:
  • Can correct up to 50% (random) errors.
  • In the Hamming model, for the binary channel:
  • A code with more than n codewords has distance at
    most 50%.
  • So it corrects at most 25% worst-case errors (arithmetic below).
  • Need new approaches to bridge the gap.

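The arithmetic behind the 25% figure, spelled out (standard unique-decoding radius):
\[ d \le \frac{n}{2} \;\Longrightarrow\; \left\lfloor \frac{d-1}{2} \right\rfloor \le \frac{n}{4}. \]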
29
Approach: List-decoding
  • Main reason for the gap between Shannon & Hamming:
    the insistence on uniquely recovering the message.
  • List-decoding: a relaxed notion of recovery from
    error. The decoder produces a small list (of L
    codewords) that includes the message.
  • A code is (p, L)-list-decodable if it corrects a p
    fraction of errors with lists of size L.

30
List-decoding
  • Main reason for the gap between Shannon & Hamming:
    the insistence on uniquely recovering the message.
  • List-decoding [Elias '57, Wozencraft '58]: a
    relaxed notion of recovery from error. The decoder
    produces a small list (of L codewords) that
    includes the message.
  • A code is (p, L)-list-decodable if it corrects a p
    fraction of errors with lists of size L (formalized below).

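The standard formalization of the notion above (supplied for precision): a code $C \subseteq \Sigma^n$ is $(p, L)$-list-decodable if
\[ \forall y \in \Sigma^n : \; \bigl|\{\, c \in C : \Delta(c, y) \le pn \,\}\bigr| \le L. \]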
31
What to do with the list?
  • Probabilistic error: the list has size one with
    probability nearly 1.
  • General channel: need side information of only
    O(log n) bits to disambiguate [Guruswami '03].
  • (Alternatively, if sender and receiver share O(log n)
    bits, then they can disambiguate [Langberg '04].)
  • Computationally bounded error:
  • Model introduced by Lipton; Ding, Gopalan & Lipton.
  • List-decoding results can be extended (assuming a
    PKI and some memory at the sender) [Micali et al.].

32
List-decoding: State of the art
  • Zyablov-Pinsker / Blinovskii (late '80s):
  • Matches Shannon's converse perfectly! (So we can't
    do better even for random error!)
  • But ZP/B is non-constructive!

33
Algorithms for List-decoding
  • Not examined until 1988.
  • First results: Goldreich-Levin, for Hadamard
    codes (non-trivial in their setting).
  • More recent work:
  • S. '96, Shokrollahi-Wasserman '98,
    Guruswami-S. '99, Parvaresh-Vardy '05,
    Guruswami-Rudra '06: decode algebraic codes.
  • Guruswami-Indyk '00-'02: decode
    graph-theoretic codes.
  • Ta-Shma-Zuckerman '02, Trevisan '03: propose
    new codes for list-decoding.

34
Results in List-decoding
  • q-ary case (statements below):
  • Binary case:

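For reference, the headline results the slide refers to (standard statements, supplied since the slide's formulas did not survive the transcript): $q$-ary (large alphabet): rate-$R$ codes list-decodable from a $1 - R - \epsilon$ fraction of errors [Guruswami-Rudra '06]. Binary (existential): $(p, L)$-list-decodable codes of rate $1 - H(p) - O(1/L)$ exist [Zyablov-Pinsker, Blinovskii].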
35
Few lines about Guruswami-Rudra
  • Code: Collated Reed-Solomon code + concatenation.

36
Few lines about Guruswami-Rudra
  • Special properties:
  • Is this code combinatorially good?
  • Algorithmically good!! (Uses ideas from
    S. '96, GS '98, PV '05, plus new ones.)
  • Can concatenate to reduce the alphabet size.

37
Few lines about Guruswami-Rudra
  • Warning: K, N, and the partition are all very special.

Encoding: First partition $\mathbb{F}_q$ into special sets $S_0, S_1, \ldots, S_N$, with $|S_1| = \cdots = |S_N| = C$. Say $S_1 = \{\alpha_1, \ldots, \alpha_C\}$, $S_2 = \{\alpha_{C+1}, \ldots, \alpha_{2C}\}$, etc. Encoding of $P$:
\[ \bigl\langle \langle P(\alpha_1), \ldots, P(\alpha_C) \rangle, \langle P(\alpha_{C+1}), \ldots, P(\alpha_{2C}) \rangle, \cdots \bigr\rangle \]
38
Major open question
39
Conclusions
  • Many mysteries in combinatorial setting.
  • Significant progress in algorithmic setting, but
    many important open questions as well.

40
LDPC Codes
41
LDPC Codes