Sparse Random Linear Codes are Locally Decodable and Testable
1
Sparse Random Linear Codes are Locally Decodable and Testable
Tali Kaufman (MIT)
Joint work with Madhu Sudan (MIT)
2
Error-Correcting Codes
  • Code C ⊆ {0,1}^n - a collection of vectors (codewords) of length n.
  • Linear code - the codewords form a linear subspace.
  • Codeword weight: for c ∈ C, w(c) is the number of non-zero coordinates of c.
  • C is
  • n^t-sparse if |C| ≤ n^t
  • n^{-γ}-biased if n/2 - n^{1-γ} ≤ w(c) ≤ n/2 + n^{1-γ} (for every c ∈ C)
  • distance d if for every c ∈ C, w(c) ≥ d
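To make these three parameters concrete, here is a small illustrative check (the function name and toy parameters are mine, not from the slides), for a code given as an explicit list of codewords:

```python
# Illustrative check of the three parameters defined above (sparsity, bias, distance)
# for a code C given as a |C| x n 0/1 matrix with one codeword per row.
import itertools

import numpy as np


def code_parameters(codewords, t, gamma):
    m, n = codewords.shape
    weights = codewords.sum(axis=1)            # w(c) for every c in C
    nonzero = weights[weights > 0]             # ignore the zero codeword
    is_sparse = m <= n ** t                    # n^t-sparse: |C| <= n^t
    dev = n ** (1 - gamma)                     # allowed deviation from n/2
    is_unbiased = bool(np.all((nonzero >= n / 2 - dev) & (nonzero <= n / 2 + dev)))
    distance = int(nonzero.min())              # minimum weight = distance (linear code)
    return is_sparse, is_unbiased, distance


# Toy usage: the length-8 Hadamard code (truth tables of all linear functions on 3 bits).
xs = np.array(list(itertools.product([0, 1], repeat=3)))
C = np.array([[np.dot(a, x) % 2 for x in xs] for a in xs])
print(code_parameters(C, t=1.0, gamma=0.5))    # -> (True, True, 4)
```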

3
Local Testing / Correcting / Decoding
Given C ⊆ {0,1}^n and a vector v, make k queries into v.
k-local testing - decide whether v is in C or far from every c ∈ C.
k-local correcting - if v is close to some c ∈ C, recover c(i) w.h.p.
k-local decoding - if v is close to some c ∈ C and c encodes a message m, recover m(i) w.h.p. Here C = {E(m) | m ∈ {0,1}^s}, E: {0,1}^s → {0,1}^n, s < n.
Example: the Hadamard code (linear functions).
a ∈ {0,1}^{log n}, f(x) = Σ_i a_i x_i.
(k=3) - testing: f(x) + f(y) + f(x+y) = 0? for random x, y.
(k=2) - correcting: correct f(x) by f(x+y) + f(y) for a random y.
(k=2) - decoding: recover a(i) by f(e_i + y) + f(y) for a random y.
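The three Hadamard procedures above translate directly into code. The sketch below is illustrative only (the received word is stored as a full truth table f, and all names are mine); it is meant to make the query patterns concrete:

```python
# Hadamard code over {0,1}^m: a codeword is the truth table of f(x) = sum_i a_i x_i (mod 2).
# The three few-query procedures from the slide, run against a received table f.
import itertools
import random


def rand_point(m):
    return tuple(random.randint(0, 1) for _ in range(m))


def xor(x, y):
    return tuple(a ^ b for a, b in zip(x, y))


def blr_test(f, m):
    """k = 3 queries: accept iff f(x) + f(y) + f(x+y) = 0 for random x, y."""
    x, y = rand_point(m), rand_point(m)
    return (f[x] + f[y] + f[xor(x, y)]) % 2 == 0


def local_correct(f, x, m):
    """k = 2 queries: recover the value at x as f(x+y) + f(y) for a random y."""
    y = rand_point(m)
    return (f[xor(x, y)] + f[y]) % 2


def local_decode(f, i, m):
    """k = 2 queries: recover message bit a_i as f(e_i + y) + f(y) for a random y."""
    e_i = tuple(1 if j == i else 0 for j in range(m))
    y = rand_point(m)
    return (f[xor(e_i, y)] + f[y]) % 2


# Toy usage: encode a = (1, 0, 1), corrupt one table entry, then test / correct / decode.
m, a = 3, (1, 0, 1)
f = {x: sum(ai * xi for ai, xi in zip(a, x)) % 2
     for x in itertools.product((0, 1), repeat=m)}
f[(0, 1, 1)] ^= 1
print(blr_test(f, m), local_correct(f, (0, 1, 1), m), local_decode(f, 0, m))
```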
4
Brief History
Local correction: [Blum, Luby, Rubinfeld], in the context of program checking.
Local testability: [Blum, Luby, Rubinfeld], [Rubinfeld, Sudan], [Goldreich, Sudan], the core hardness of PCP.
Local decoding: [Katz, Trevisan], [Yekhanin], in the context of Private Information Retrieval (PIR) schemes.
Most previous results (apart from [Kaufman, Litsyn]) focus on specific codes with nice algebraic structure.
This work: results for general codes, based only on their density and distance.
5
Our Results
Theorem (local correction): For every constant t and γ > 0, if C ⊆ {0,1}^n is n^t-sparse and n^{-γ}-biased, then it is k = k(t, γ) locally correctable.
Corollary (local decoding): For every constant t and γ > 0, if E: {0,1}^{t log n} → {0,1}^n is a linear map such that C = {E(m) | m ∈ {0,1}^{t log n}} is n^t-sparse and n^{-γ}-biased, then E is k = k(t, γ) locally decodable.
Proof: C' = {(m, E(m)) | m ∈ {0,1}^{t log n}} is k-locally correctable.
Theorem (local testing): For every constant t and γ > 0, if C ⊆ {0,1}^n is n^t-sparse with distance n/2 - n^{1-γ}, then it is k = k(t, γ) locally testable.
  • Recall, C is
  • n^t-sparse if |C| ≤ n^t
  • n^{-γ}-biased if n/2 - n^{1-γ} ≤ w(c) ≤ n/2 + n^{1-γ} (for every c ∈ C)
  • distance d if for every c ∈ C, w(c) ≥ d

6
Corollaries
Reproduces the testability of the Hadamard and dual-BCH codes.
Random code: a random code C ⊆ {0,1}^n obtained as the linear span of a random (t log n) × n matrix is n^t-sparse and O(log n/√n)-biased, i.e. it is k = Θ(t) locally correctable, locally decodable and locally testable.
Cannot get a denser random code: a similar random code obtained from a random (log n)^2 × n matrix does not have these properties.
There are linear subspaces of high-degree polynomials that are sparse and unbiased, so we can locally correct, decode and test them. Example: {Tr(a·x^{2^{log n/4}+1} + b·x^{2^{log n/8}+1}) | a, b ∈ F_{2^{log n}}}.
Nice closure properties: subcodes, addition of new coordinates, removal of a few coordinates.
7
Main Idea
  • Study weight distribution of dual code and some
    related codes.
  • Weight distribution ?
  • Dual code ?
  • Which related codes?
  • How? MacWilliams identities and Johnson bounds

8
Weight Distribution, Duals
  • Weight distribution: (B_0^C, ..., B_n^C)
  • B_k^C - the number of codewords of weight k in the code C, 0 ≤ k ≤ n.
  • Dual code:
  • C⊥ ⊆ {0,1}^n - the vectors orthogonal to all codewords of C ⊆ {0,1}^n.
  • Codeword membership: v ∈ C iff v ⊥ C⊥, i.e. for every c ∈ C⊥, ⟨v, c⟩ = 0.
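For a small code both objects can be computed by brute force; the helpers below are an illustrative sketch (the names are mine, and the enumeration is only feasible for small n):

```python
# Brute-force weight distribution (B_0, ..., B_n) and dual code of a small binary
# linear code C, given as an array containing all of its codewords.
import itertools

import numpy as np


def weight_distribution(C):
    """B_k = number of codewords of weight k, for k = 0, ..., n."""
    n = C.shape[1]
    return np.bincount(C.sum(axis=1), minlength=n + 1)


def dual_code(C):
    """All v in {0,1}^n orthogonal (mod 2) to every codeword of C."""
    n = C.shape[1]
    vs = np.array(list(itertools.product([0, 1], repeat=n)))
    return vs[np.all(vs @ C.T % 2 == 0, axis=1)]


def in_code(v, dual):
    """Membership test from the slide: v is in C iff v is orthogonal to every dual codeword."""
    return bool(np.all(dual @ np.asarray(v) % 2 == 0))
```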

9
Which Related Codes?
  • Local correction: duals of C, C_{-i}, C_{-i,j} (C with coordinate i, resp. coordinates i and j, removed).
  • Local decoding: the same, applied to C' = {(m, E(m))}, where E: {0,1}^s → {0,1}^n, s < n.
  • Local testing: duals of C and of C ∪ {v} (the code spanned by C and v).

10
Duals of Sparse Unbiased Codes have Many k-Weight
Codewords
C is n^t-sparse and n^{-γ}-biased. B_k^{C⊥} = ?
MacWilliams transform: B_k^{C⊥} = (1/|C|) Σ_i B_i^C P_k(i), where P_k is the (binary) Krawtchouk polynomial of degree k.
B_k^{C⊥} ≥ [P_k(0) - n^{(1-γ)k} · n^t] / |C|.
If k ≥ Ω(t/γ): B_k^{C⊥} ≈ P_k(0) / |C|.
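The MacWilliams transform can be checked numerically on a toy code using the binary Krawtchouk polynomials P_k. The sketch below is illustrative and not from the slides; it uses the [7,4] Hamming code, whose dual (the simplex code) has all seven nonzero codewords of weight 4.

```python
# MacWilliams transform for a binary code: B_k of the dual equals
# (1/|C|) * sum_i B_i(C) * P_k(i), with P_k the binary Krawtchouk polynomial.
import itertools
from math import comb

import numpy as np


def krawtchouk(n, k, i):
    """P_k(i) = sum_j (-1)^j * C(i, j) * C(n - i, k - j)."""
    return sum((-1) ** j * comb(i, j) * comb(n - i, k - j) for j in range(k + 1))


def macwilliams_dual_distribution(B, size_C):
    """Dual weight distribution computed from the primal one B = (B_0, ..., B_n)."""
    n = len(B) - 1
    return [sum(B[i] * krawtchouk(n, k, i) for i in range(n + 1)) // size_C
            for k in range(n + 1)]


# Toy check on the [7,4] Hamming code.
G = np.array([[1, 0, 0, 0, 0, 1, 1],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1, 0],
              [0, 0, 0, 1, 1, 1, 1]])
msgs = np.array(list(itertools.product([0, 1], repeat=4)))
C = msgs @ G % 2
B = [int(x) for x in np.bincount(C.sum(axis=1), minlength=8)]
print(macwilliams_dual_distribution(B, len(C)))    # -> [1, 0, 0, 0, 7, 0, 0, 0]
```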
11
Canonical k-Tester
Goal: decide whether v is in C or far from every c ∈ C.
Tester: pick a random c ∈ C⊥_k (the set of weight-k codewords of C⊥); if ⟨v, c⟩ = 0, accept; else reject.
Total number of possible tests: |C⊥_k| = B_k^{C⊥}.
For v ∉ C, the bad tests (those that still accept v) are (C ∪ {v})⊥_k, i.e. there are B_k^{(C ∪ {v})⊥} of them, where C ∪ {v} denotes the code spanned by C and v.
Works if the number of bad tests is bounded.
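A brute-force rendering of the canonical tester for a toy code might look as follows; sampling C⊥_k by full enumeration is purely for illustration (the names are mine):

```python
# Canonical k-tester: pick a random weight-k dual codeword c and accept iff <v, c> = 0.
import itertools
import random

import numpy as np


def dual_codewords_of_weight(C, k):
    """Brute-force list of the weight-k dual codewords of C (C given as all its codewords)."""
    n = C.shape[1]
    vs = np.array(list(itertools.product([0, 1], repeat=n)))
    duals = vs[np.all(vs @ C.T % 2 == 0, axis=1)]
    return duals[duals.sum(axis=1) == k]


def canonical_test(v, dual_k):
    """One test: a uniformly random weight-k dual codeword c; accept iff <v, c> = 0 (mod 2)."""
    c = dual_k[random.randrange(len(dual_k))]
    return int(np.dot(v, c)) % 2 == 0
```

Repeating the test a constant number of times and accepting only if every run accepts gives the k-local tester; codewords always pass, and the next slide bounds the number of bad tests for words that are far from C.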
12
Proof of Local Testing Theorem (unbiased case)
Reduces to showing a gap: for v at distance δ from C,
B_k^{(C ∪ {v})⊥} ≤ (1 - ε) B_k^{C⊥}.
Using MacWilliams and the estimate above:
B_k^{(C ∪ {v})⊥} = (1/(2|C|)) Σ_i B_i^{C ∪ {v}} P_k(i) ≤ (1 - ε) P_k(0) / |C| (≈ (1 - ε) B_k^{C⊥} by the previous estimate).
Good (a ½ loss): the contribution of C itself, since C ∪ {v} = C ∪ (C + v).
Bad (a gain): the contribution of the coset C + v, bounded using that C is n^t-sparse and n^{-γ}-biased.
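In LaTeX, a sketch of the single MacWilliams step behind the "good: ½ loss / bad: gain" annotation, assuming (as on the slide) that C ∪ {v} denotes the span of C and v, so that C ∪ {v} = C ∪ (C + v) and |C ∪ {v}| = 2|C|:

```latex
% One step of the gap argument: apply MacWilliams to the span of C and v and
% split the sum over C \cup {v} = C \cup (C + v) into its two halves.
\begin{align*}
B_k^{(C \cup \{v\})^\perp}
  &= \frac{1}{2|C|} \sum_{i=0}^{n} B_i^{\,C \cup (C+v)} \, P_k(i) \\
  &= \underbrace{\frac{1}{2|C|} \sum_{c \in C} P_k\bigl(w(c)\bigr)}_{\text{good: equals } \frac{1}{2} B_k^{C^\perp}}
   \; + \;
     \underbrace{\frac{1}{2|C|} \sum_{c \in C+v} P_k\bigl(w(c)\bigr)}_{\text{bad: the gain to be bounded}}
\end{align*}
```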
13
Canonical k-Corrector
Goal: given v that is δ-close to some c ∈ C, recover c(i) w.h.p.
Corrector: pick a random c⊥ ∈ C⊥_{k,i}, the set of weight-k dual codewords with a 1 in the i-th coordinate. Return Σ_{s≠i : c⊥_s = 1} v_s. (Since ⟨c, c⊥⟩ = 0 and c⊥_i = 1, we have c_i = Σ_{s≠i : c⊥_s = 1} c_s.)
A random location in v is corrupted w.p. δ. If, for every i, every other coordinate that the corrector reads is individually random, then the probability of error is < δ·k.
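A brute-force rendering of the corrector (again sampling C⊥_{k,i} by enumeration, which is only feasible for toy codes; all names are mine):

```python
# Canonical k-corrector: to recover c(i) from a corrupted word v, pick a random weight-k
# dual codeword with a 1 in coordinate i and return the parity of v over the rest of
# its support.
import itertools
import random

import numpy as np


def dual_codewords_weight_k_with_i(C, k, i):
    """Weight-k dual codewords of C whose i-th coordinate is 1 (brute force)."""
    n = C.shape[1]
    vs = np.array(list(itertools.product([0, 1], repeat=n)))
    duals = vs[np.all(vs @ C.T % 2 == 0, axis=1)]
    return duals[(duals.sum(axis=1) == k) & (duals[:, i] == 1)]


def canonical_correct(v, i, dual_k_i):
    """Since <c, c_perp> = 0 and c_perp[i] = 1, c(i) equals the parity of c over the rest
    of the support of c_perp; read that parity off v instead."""
    c_perp = dual_k_i[random.randrange(len(dual_k_i))]
    support = [s for s in range(len(v)) if c_perp[s] == 1 and s != i]
    return sum(int(v[s]) for s in support) % 2
```

The δ·k error bound on the slide is just a union bound over the k - 1 positions of v (other than i) that a single invocation reads.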
14
Proof of Self Correction Theorem
Reduces to showing a 2-wise independence property in C⊥_k: for every i, j,
|C⊥_{k,i,j}| / |C⊥_{k,i}| ≈ k/n (as if the code were random).
Here C⊥_{k,i} (resp. C⊥_{k,i,j}) denotes the set of weight-k codewords of C⊥ with a 1 in the i-th (resp. i-th and j-th) coordinate(s).
|C⊥_{k,i}| = |C⊥_k| - |(C_{-i})⊥_k|
|C⊥_{k,i,j}| = |C⊥_k| - |(C_{-i})⊥_k| - |(C_{-j})⊥_k| + |(C_{-i,j})⊥_k|
(C_{-i} and C_{-i,j} denote C with coordinate i, resp. coordinates i and j, removed.)
All the codes involved are sparse and unbiased, so the estimate on dual weight distributions applies to each of them.
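The 2-wise independence property can be checked empirically on a small random code by brute-force enumeration of its dual; the parameters below (n = 14, k = 6, i = 0, j = 1) are purely illustrative:

```python
# Empirical check of the 2-wise independence property: among the weight-k dual codewords
# with a 1 in coordinate i, roughly a k/n fraction should also have a 1 in coordinate j.
import itertools

import numpy as np

rng = np.random.default_rng(1)
n, rows, k, i, j = 14, 4, 6, 0, 1
G = rng.integers(0, 2, size=(rows, n))              # random generator matrix
msgs = np.array(list(itertools.product([0, 1], repeat=rows)))
C = msgs @ G % 2

vs = np.array(list(itertools.product([0, 1], repeat=n)))
duals = vs[np.all(vs @ C.T % 2 == 0, axis=1)]       # brute-force dual of C
w_k = duals[duals.sum(axis=1) == k]                 # weight-k dual codewords
with_i = w_k[w_k[:, i] == 1]                        # ... with a 1 at i
with_ij = with_i[with_i[:, j] == 1]                 # ... with a 1 at i and j
print(len(with_ij), len(with_i), k / n)             # ratio of the first two counts ~ k/n
```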
15
Open Issues
Local correction based on distance alone.
Obtain general k-local correction, local decoding and local testing results for denser codes. Which denser codes?
16
Thank You!!!