Transcript and Presenter's Notes

Title: The uniqueSVP World


1
The unique-SVP World
Shai Halevi, IBM, July 2009
  • Ajtai-Dwork '97/'07, Regev '03
  • PKE from worst-case uSVP
  • Lyubashevsky-Micciancio '09
  • Relations between worst-case uSVP, BDD, GapSVP
  • Many slides stolen from Oded Regev, denoted by

?
2
f(n)-unique-SVP
?
  • Promise: the shortest vector u is shorter than all
    other non-parallel lattice vectors by a factor of f(n)
    (restated below)
  • Algorithms for 2^n-unique SVP [LLL82, Schnorr87]
  • Believed to be hard for any polynomial factor n^c

[Figure: hardness of f(n)-unique-SVP as a function of the gap, from 1 to 2^n: easy at f(n) = 2^n (LLL), believed hard for any polynomial f(n) = n^c.]
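In the λ-notation used later (slide 36), the promise can be written as the following one-line condition; the example gap values are the ones from the bullets above:

    \[
      \lambda_2(L) \;>\; f(n)\cdot\lambda_1(L),
      \qquad\text{e.g. } f(n)=2^n \text{ (easy via LLL)}, \quad f(n)=n^c \text{ (believed hard)}.
    \]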
3
Ajtai-Dwork Regev03 PKEs
Nearly-trivial worst-case/average-case reductions
4
n-dimensional distributions
?
  • Distinguish between the distributions

?
Wavy
Uniform
(In a random direction)
5
Dual Lattice
?
  • Given a lattice L, the dual lattice is
  • L* = { x : for all y ∈ L, ⟨x,y⟩ ∈ Z }

[Figure: a one-dimensional example: a lattice L with spacing 5 and its dual L* with spacing 1/5.]
6
L* - the dual of L
?
[Figure: L and its dual L*. Case 1: λ1(L) ≤ 1/n, and L* lies on hyperplanes spaced n apart. Case 2: λ1(L) > √n.]
7
Reduction
  • Input: a basis B for L
  • Produce a distribution that is
  • Wavy if L has a unique shortest vector (‖u‖ ≤ 1/n)
  • Uniform (on P(B)) if λ1(L) > √n
  • Choose a point from a Gaussian of radius √n, and
    reduce mod P(B) (see the sketch below)
  • Conceptually, a random L point with a
    Gaussian(√n) perturbation
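A minimal numpy sketch of the sampling step above: draw a spherical Gaussian point and reduce it modulo the fundamental parallelepiped P(B). The basis is a toy stand-in, and Gaussian(s) is taken to mean standard deviation s per coordinate (the convention stated on slide 15).

    import numpy as np

    def reduce_mod_parallelepiped(x, B):
        # Reduce x modulo P(B): keep only the fractional part of x's coordinates in basis B.
        coords = np.linalg.solve(B, x)            # x = B @ coords
        return B @ (coords - np.floor(coords))

    def sample_point(B, radius, rng):
        # One sample: a spherical Gaussian of the given radius, reduced mod P(B).
        n = B.shape[0]
        g = rng.normal(scale=radius, size=n)
        return reduce_mod_parallelepiped(g, B)

    rng = np.random.default_rng(0)
    B = np.array([[1.0, 0.3],
                  [0.0, 5.0]])                    # toy 2-dimensional basis (columns are b1, b2)
    samples = [sample_point(B, radius=np.sqrt(2), rng=rng) for _ in range(1000)]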

8
Creating the Distribution
?
[Figure: lattice points with the Gaussian perturbation. Case 1: the perturbed points concentrate near hyperplanes spaced n apart (wavy). Case 2: they cover P(B) essentially uniformly.]
9
Analyzing the Distribution
?
  • Theorem (using Banaszczyk93)
  • The distribution obtained above depends only on
    the points in L within distance √n of the origin
  • (up to an exponentially small error)
  • Therefore,
  • Case 1: determined by the multiples of u ⇒
    wavy on hyperplanes orthogonal to u
  • Case 2: determined by the origin ⇒
    uniform

10
Proof of Theorem
?
  • For a set A in R^n, define
  • The Poisson Summation Formula implies
  • Banaszczyk's theorem (see below):
  • For any lattice L,
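For reference, the standard statements these bullets appeal to, in the usual Gaussian-mass notation (a reconstruction; the exact formulas on the original slide may have differed):

    \[
      \rho(A) \;=\; \sum_{x\in A} e^{-\pi\|x\|^2},
      \qquad
      \sum_{x\in L} f(x) \;=\; \det(L^*)\sum_{y\in L^*}\hat f(y) \quad\text{(Poisson summation)},
    \]
    \[
      \rho\bigl(L\setminus\sqrt{n}\,\mathcal{B}\bigr) \;<\; 2^{-n}\,\rho(L)
      \quad\text{for every $n$-dimensional lattice $L$ (Banaszczyk).}
    \]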

11
Proof of Theorem (cont.)
?
  • In Case 2, the distribution obtained is very
    close to uniform
  • Because

12
Ajtai-Dwork Regev03 PKEs
next
13
Distinguish ⇒ Search, AD97
  • Reminder: L lives in hyperplanes
  • We want to identify u
  • Using an oracle that distinguishes wavy
    distributions from uniform in P(B)

[Figure: the hyperplanes H-1, H0, H1 orthogonal to u.]
14
The plan
  • Use the oracle to distinguish points close to H0
    from points close to H±1
  • Then grow very long vectors that are rather close
    to H0
  • This gives a very good approximation for u, which
    we then use to find u exactly

15
Distinguishing H0 from H?1
  • Input: basis B for L, the length ‖u‖, a point x
  • And access to a wavy/uniform distinguisher
  • Decision: Is x 1/poly(n)-close to H0 or to H±1?
  • Choose y from a wavy distribution near L
  • y = Gaussian(s) with s < 1/(2‖u‖)
  • Pick a ∈R [0,1], set z = a·x + y mod P(B)
  • Ask the oracle if z is drawn from the wavy or the
    uniform distribution (sketched below)

Gaussian(s): variance s² in each coordinate
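A sketch of this test in numpy, with the wavy/uniform distinguisher as an assumed black-box callable oracle(z) that returns True when z looks wavy; the majority vote of slide 18 is folded in.

    import numpy as np

    def near_H0(x, B, u_norm, oracle, rng, trials=200):
        # True if x looks 1/poly(n)-close to H0, False if it looks close to H+-1.
        n = B.shape[0]
        votes_wavy = 0
        for _ in range(trials):
            y = rng.normal(scale=1.0 / (2.0 * u_norm), size=n)  # wavy sample: Gaussian(s), s < 1/(2||u||)
            a = rng.uniform(0.0, 1.0)                           # random scaling of x
            coords = np.linalg.solve(B, a * x + y)
            z = B @ (coords - np.floor(coords))                 # z = a*x + y mod P(B)
            votes_wavy += int(oracle(z))
        # x close to H0   -> a*x stays close to H0, so z keeps looking wavy
        # x close to H+-1 -> the u-component of a*x spreads out, so z looks uniform
        return votes_wavy > trials // 2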
16
Distinguishing H0 from H±1 (cont.)
  • Case 1: x close to H0
  • a·x is also close to H0
  • a·x + y mod P(B) is close to L, i.e. wavy

x
H0
17
Distinguishing H0 from H±1 (cont.)
  • Case 2: x close to H±1
  • a·x lands anywhere between H0 and H±1
  • Nearly uniform component in the u direction
  • a·x + y mod P(B) is nearly uniform in P(B)

x
H1
H0
18
Distinguishing H0 from H±1 (cont.)
  • Repeat poly(n) times, take majority
  • Boosts the advantage to near-certainty
  • Below we assume a perfect distinguisher
  • Close to H0 ⇒ always says NO
  • Close to H±1 ⇒ always says YES
  • Otherwise, there are no guarantees
  • Except halting in polynomial time

19
Growing Large Vectors
  • Start from some x0 between H-1 and H1
  • e.g. a random vector of length 1/‖u‖
  • In each step, choose xi s.t.
  • xi ≈ 2·xi-1
  • xi is somewhere between H-1 and H1
  • Keep going for poly(n) steps
  • Result is x between H±1 with ‖x‖ ≈ N/‖u‖
  • Very large N, e.g., N = 2^(n²)

(we'll see how in a minute)
20
From xi-1 to xi
  • Try poly(n) many candidates (sketched below)
  • Candidate w = 2·xi-1 + Gaussian(1/‖u‖)
  • For j = 1,…,m = poly(n)
  • wj = (j/m)·w
  • Check if wj is near H0 or near H±1
  • If none of the wj's is near H±1 then accept w and
    set xi = w
  • Else try another candidate
[Figure: the test points w1, w2, …, wm = w along the segment from 0 to w.]
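One growth step of slides 19-20 as a numpy sketch, with the boosted H0-test of slides 15-18 as an assumed black box looks_close_to_H0(w) (for instance the near_H0 routine sketched after slide 15); for simplicity a candidate is accepted only if every test point passes.

    import numpy as np

    def grow_step(x_prev, u_norm, looks_close_to_H0, rng, m=100, max_candidates=1000):
        # Find x_i ~ 2*x_{i-1} that is still somewhere between H-1 and H1.
        n = x_prev.shape[0]
        for _ in range(max_candidates):
            w = 2.0 * x_prev + rng.normal(scale=1.0 / u_norm, size=n)   # candidate w
            if all(looks_close_to_H0((j / m) * w) for j in range(1, m + 1)):
                return w                                                # accept: x_i = w
        raise RuntimeError("no candidate accepted")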
21
From xi-1 to xi Analysis
  • xi-1 between H±1 ⇒ w is between H±√n
  • Except with exponentially small probability
  • w is NOT between H±1 ⇒ some wj is near H±1
  • So w will be rejected
  • So if we make progress, we know that we are on
    the right track

22
From xi-1 to xi Analysis (cont.)
  • With probability 1/poly(n), w is close to H0
  • The component in the u direction is Gaussian with
    mean < 2/‖u‖ and variance 1/‖u‖²

noise
2xi-1
H1
H0
23
From xi-1 to xi Analysis (cont.)
  • With probability 1/poly(n), w is close to H0
  • The component in the u direction is Gaussian with
    mean < 2/‖u‖ and standard deviation 1/‖u‖
  • w close to H0 ⇒ all the wj's are close to H0
  • So w will be accepted
  • After polynomially many candidates, we will make
    progress whp

24
Finding u
  • Find n-1 x's
  • x_{t+1} is chosen orthogonal to x1,…,xt
  • By choosing the Gaussians in that subspace
  • Compute ũ ⊥ x1,…,xn-1, with ‖ũ‖ = 1
  • ũ is exponentially close to u/‖u‖
  • u/‖u‖ = ũ + e, ‖e‖ ≈ 1/N
  • Can make N ≥ 2^n (e.g., N = 2^(n²))
  • Diophantine approximation to solve for u

(slide 71)
25
Ajtai-Dwork Regev03 PKEs
(slide 47)
next
26
Average-case Distinguisher
  • Intuition: the lattice only matters via the direction
    of u
  • Security parameter n, another parameter N
  • A random u in the n-dim. unit sphere defines Du(N):
  • c ← discrete-Gaussian(N) in one dimension
  • Defines a vector x = c·u/⟨u,u⟩, namely x ∥ u and
    ⟨x,u⟩ = c
  • y ← Gaussian(N) in the other n-1 dimensions
  • e ← Gaussian(n^-4) in all n dimensions
  • Output x + y + e (sampling sketched below)
  • The average-case problem:
  • Distinguish Du(N) from G(N) = Gaussian(N) + Gaussian(n^-3)
  • For a noticeable fraction of u's
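A rough numpy sketch of sampling Du(N) as defined above; rounding a continuous Gaussian is only a crude stand-in for a proper one-dimensional discrete-Gaussian sampler, and the parameters are illustrative.

    import numpy as np

    def sample_Du(u, N, rng):
        n = u.shape[0]
        c = np.rint(rng.normal(scale=N))              # c ~ "discrete-Gaussian(N)", one dimension
        x = c * u / np.dot(u, u)                      # x parallel to u, with <x, u> = c
        y = rng.normal(scale=N, size=n)
        y -= np.dot(y, u) * u / np.dot(u, u)          # keep only the part orthogonal to u
        e = rng.normal(scale=float(n) ** -4, size=n)  # small Gaussian(n^-4) noise in all n dimensions
        return x + y + e

    rng = np.random.default_rng(1)
    u = rng.normal(size=8)
    u /= np.linalg.norm(u)                            # random direction on the unit sphere
    sample = sample_Du(u, N=2.0 ** 16, rng=rng)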

27
Worst-case/average-case (cont.)
  • Thm: Distinguishing Du(N) from Uniform ⇒
    Distinguishing Wavy_B from Uniform_B for all B
  • When you know λ1(L(B)) up to a (1+1/poly(n)) factor
  • For a parameter N = 2^Ω(n)
  • Pf: Given B, scale it s.t. λ1(L(B)) ∈ [1, 1+1/poly)
  • Also apply a random rotation
  • Given samples x (from Uniform_B / Wavy_B)
  • Sample y ← discrete-Gaussian_B(N)
  • Can do this for a large enough N
  • Output z = x + y
  • Clearly z is close to G(N) / Du(N), respectively

28
The AD97 Cryptosystem
  • Secret key: a random u ∈ unit sphere
  • Public key: n+m+1 vectors (m = 8n log n)
  • b1,…,bn ← Du(2^n), v0,v1,…,vm ← Du(n·2^n)
  • So ⟨bi,u⟩, ⟨vi,u⟩ ≈ integer
  • We insist on ⟨v0,u⟩ ≈ odd integer
  • Will use P(b1,…,bn) for encryption
  • Need P(b1,…,bn) with width > 2^n/n

29
The AD97 Cryptosystem (cont.)
  • Encryption(s):
  • c' ← random-subset-sum(v1,…,vm) + s·v0/2
  • output c = (c' + Gaussian(n^-4)) mod P(B)
    (sketched below)
  • Decryption(c):
  • If ⟨u,c⟩ is closer than ¼ to an integer, say 0; else
    say 1
  • Correctness due to ⟨bi,u⟩, ⟨vj,u⟩ ≈ integer
  • and the width of P(B)
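A toy numpy sketch of just the encryption/decryption mechanics above, taking the public vectors as given (key generation, i.e. sampling them from Du, is omitted; the noise scale is illustrative).

    import numpy as np

    def reduce_mod_P(x, B):
        # Reduce x modulo the parallelepiped P(B) spanned by the columns of B = (b1,...,bn).
        t = np.linalg.solve(B, x)
        return B @ (t - np.floor(t))

    def encrypt(bit, B, v, rng, noise=1e-4):
        # v is an array of shape (m+1, n) holding v0, v1, ..., vm as rows.
        m = len(v) - 1
        subset = rng.integers(0, 2, size=m).astype(bool)   # the random subset of v1..vm
        c = v[1:][subset].sum(axis=0) + bit * v[0] / 2
        c = c + rng.normal(scale=noise, size=B.shape[0])   # the Gaussian(n^-4) term
        return reduce_mod_P(c, B)

    def decrypt(c, u):
        # 0 if <u, c> is within 1/4 of an integer, else 1.
        frac = np.dot(u, c) % 1.0
        return 0 if min(frac, 1.0 - frac) < 0.25 else 1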

30
AD97 Security
  • The bi's, vi's are chosen from Du(something)
  • By the hardness assumption, one can't distinguish
    them from Gu(something)
  • Claim: if they were from Gu(something), c would
    have no information on the bit s
  • Proven by leftover hash lemma + smoothing
  • Note: the vi's have variance n² larger than the bi's
  • ⇒ In the Gu case, vi mod P(B) is nearly uniform

31
AD97 Security (cont.)
  • Partition P(B) into q^n cells, q = n^7
  • For each point vi, consider the cell where it
    lies
  • ri is the corner of that cell
  • Σ_{i∈S} vi mod P(B) = Σ_{i∈S} ri mod P(B) + (error of
    size n^-5)
  • S is our random subset
  • Σ_{i∈S} ri mod P(B) is a nearly-random cell
  • We'll show this using leftover hash
  • The Gaussian(n^-4) in c drowns the error term

32
Leftover Hashing
  • Consider the hash function H_R: {0,1}^m → Z_q^n
  • The key is R = (r1,…,rm) ∈ Z_q^{n×m}
  • The input is a bit vector b = (s1,…,sm)^T ∈ {0,1}^m
  • H_R(b) = R·b mod q (see the snippet below)
  • H is pairwise independent (well, almost…)
  • Yay, let's use the leftover hash lemma
  • (R, H_R(b)) and (R, U) are statistically close
  • For random R ∈ Z_q^{n×m}, b ∈ {0,1}^m, U ∈ Z_q^n
  • Assuming m ≥ n log q
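A tiny numpy illustration of this hash family; the parameters are shrunk for readability (the scheme itself uses q = n^7 and m = 8n log n, as on slide 31).

    import numpy as np

    rng = np.random.default_rng(2)
    n, q = 4, 11                                   # toy parameters
    m = 2 * n * int(np.ceil(np.log2(q)))

    R = rng.integers(0, q, size=(n, m))            # the key R in Z_q^{n x m}
    b = rng.integers(0, 2, size=m)                 # the input: a 0/1 vector (a random subset)
    digest = (R @ b) % q                           # H_R(b) = R*b mod q, an element of Z_q^n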

33
AD97 Security (cont.)
  • We proved Σ_{i∈S} ri mod P(B) is nearly random
  • Recall
  • c0 = Σ_{i∈S} ri + error(n^-5) + Gaussian(n^-4) mod P(B)
  • For any x and error e with ‖e‖ ≤ n^-5, the distributions
    x + e + Gaussian(n^-5) and x + Gaussian(n^-4) are
    statistically close
  • So c0 ≈ Σ_{i∈S} ri + Gaussian(n^-3) mod P(B)
  • Which is close to uniform in P(B)
  • Also c1 = c0 + v0/2 mod P(B) is close to uniform

34
Ajtai-Dwork Regev03 PKEs
Worst-case Search u-SVP
Regev03 Hensel lifting
AD97 Geometric
(slide 60)
35
u-SVP vs. BDD vs. GAP-SVP
  • Lyubashevsky-Micciancio, CRYPTO 2009
  • Good old-fashioned worst-case reductions
  • Mostly Cook reductions (one Karp reduction)

Worst-case Search u-SVP
Worst-case Search BDD
BDD1/g ≤ uSVPg/2
uSVPg ≤ BDD1/g
GapSVPg ≤ uSVPg
Worst-case Decision GAP-SVP
36
Reminder uSVP and BDD
  • uSVPg: the g-unique shortest vector problem
  • Input: a basis B = (b1,…,bn)
  • Promise: g·λ1(L(B)) < λ2(L(B))
  • Task: find the shortest nonzero vector in L(B)
  • BDD1/g: 1/g-bounded distance decoding
  • Input: a basis B = (b1,…,bn), a point t
  • Promise: dist(t, L(B)) < λ1(L(B))/g
  • Task: find the closest vector to t in L(B)

37
BDD1/g ≤ uSVPg/2
  • Input: a basis B = (b1,…,bn), a point t
  • Assume that we know μ = dist(t, L(B))
  • Let B' = [ b1 … bn t ; 0 … 0 μ ]
  • Let v ∈ L(B) be the point closest to t, ‖t-v‖ = μ
  • Will show that the vector v' = ((t-v), μ)^T is the
    g/2-unique shortest vector in L(B')
  • So uSVPg/2(B') will return it
  • The size of v' = ((t-v), μ)^T is (μ²+μ²)^(1/2) = √2·μ

Can get by with a good approximation for μ (see the sketch below)
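A numpy sketch of this embedding, with the uSVP solver as an assumed black box usvp_oracle(basis) that returns the promised unique shortest vector, and with μ known exactly.

    import numpy as np

    def bdd_via_usvp(B, t, mu, usvp_oracle):
        # Solve BDD on (B, t), given dist(t, L(B)) = mu and a uSVP_{g/2} oracle.
        n = B.shape[0]
        B_embed = np.zeros((n + 1, n + 1))
        B_embed[:n, :n] = B                   # B' = [ B   t  ]
        B_embed[:n, n] = t                    #      [ 0   mu ]
        B_embed[n, n] = mu
        s = usvp_oracle(B_embed)              # expected: +-((t - v), mu)^T for the closest v
        if s[n] < 0:                          # orient s so its last coordinate is +mu
            s = -s
        return t - s[:n]                      # recover v = t - (t - v)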
38
BDD1/g ≤ uSVPg/2 (cont.)
  • Every w' ∈ L(B') looks like w' = (b·t - w, b·μ)^T
  • For some integer b and some w ∈ L(B)
  • Write b·t - w = (b·v - w) - b·(v - t)
  • b·v - w ∈ L(B), nonzero if w' isn't a multiple of v'
  • So ‖b·v - w‖ ≥ λ1; also recall ‖v - t‖ = μ (≤ λ1/g)
  • ⇒ ‖b·t - w‖ ≥ ‖b·v - w‖ - b·‖v - t‖ ≥ λ1 - b·μ
  • ⇒ ‖w'‖² ≥ (λ1 - b·μ)² + (b·μ)² ≥ inf_{b∈R} [(λ1 - b·μ)² +
    (b·μ)²] = λ1²/2 ≥ (g·μ)²/2
  • So for any w' ∈ L(B') that is not a multiple of v',
    we have ‖w'‖ ≥ g·μ/√2 = ‖v'‖·g/2

39
uSVPg ≤ BDD1/g
  • Input: a basis B = (b1,b2,…,bn)
  • Let r be a prime, r ≥ g
  • For i = 1,2,…,n, j = 1,2,…,r-1:
  • Bi = (b1,b2,…,r·bi,…,bn), tij = j·bi
  • Let vij = BDD1/g(Bi, tij), wij = vij - tij
  • Output the smallest nonzero wij in L(B) (sketched below)
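The same loop in numpy, with the BDD solver as an assumed black box bdd_oracle(basis, target) that returns the closest lattice vector whenever the BDD promise holds.

    import numpy as np

    def usvp_via_bdd(B, r, bdd_oracle):
        # Recover the unique shortest vector of L(B); B has columns b1..bn, r is a prime >= g.
        n = B.shape[1]
        best = None
        for i in range(n):
            Bi = B.copy()
            Bi[:, i] = r * B[:, i]                 # replace b_i by r*b_i
            for j in range(1, r):
                t_ij = j * B[:, i]
                v_ij = bdd_oracle(Bi, t_ij)
                w_ij = v_ij - t_ij                 # lies in L(B); equals u for the right (i, j)
                if np.linalg.norm(w_ij) > 1e-9:
                    if best is None or np.linalg.norm(w_ij) < np.linalg.norm(best):
                        best = w_ij
        return best    # the smallest nonzero w_ij (membership in L(B) is not re-checked here)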

40
uSVPg ≤ BDD1/g (cont.)
  • Let u be the shortest nonzero vector in L(B)
  • u = Σ xi·bi, and at least one xi isn't divisible by r
    (otherwise u/r would also be in L(B))
  • Let j = -xi mod r, j ∈ {1,2,…,r-1}
  • We will prove that for these i, j:
  • λ1(L(Bi)) > g·λ1(L(B))
  • dist(tij, L(Bi)) ≤ λ1(L(B))

41
  • The smallest multiple of u in L(Bi) is r·u
  • ‖r·u‖ = r·λ1(L(B)) ≥ g·λ1(L(B))
  • Any other vector in L(Bi) ⊆ L(B) is longer than
    g·λ1(L(B)) (since L(B) is g-unique)
  • ⇒ λ1(L(Bi)) ≥ g·λ1(L(B))
  • tij + u = j·bi + Σ xm·bm = (j+xi)·bi + Σ_{m≠i} xm·bm ∈ L(Bi)
  • ⇒ dist(tij, L(Bi)) ≤ ‖u‖ = λ1(L(B)) ≤ λ1(L(Bi))/g
  • ⇒ (Bi, tij) satisfies the promise of BDD1/g
  • ⇒ vij = BDD1/g(Bi, tij) is the closest vector to tij in L(Bi)
  • wij = vij - tij ∈ L(B), since tij ∈ L(B) and
    vij ∈ L(Bi) ⊆ L(B)
  • ‖wij‖ = λ1(L(B))

(j + xi is divisible by r)
42
Reminder GapSVP
  • GapSVPg: the decision version of approx-g-SVP
  • Input: a basis B, a number d
  • Promise: either λ1(L(B)) ≤ d or λ1(L(B)) > g·d
  • Task: decide which is the case
  • The reduction uSVPg ≤ GapSVPg is the same as
    Regev's Decision-to-Search uSVP reduction

(slide 47)
43
GapSVPg·√(n log n) ≤ BDD1/g
  • Input: a basis B = (b1,…,bn), a number d
  • Repeat poly(n) times (sketched below):
  • Choose a random si of length ≤ d·√(n log n)
  • Set ti = si mod B, run vi = BDD1/g(B, ti)
  • Answer YES if ∃i s.t. vi ≠ ti - si, else NO
  • Need / will show:
  • λ1(L(B)) > g·d·√(n log n) ⇒ vi = ti - si always
  • λ1(L(B)) ≤ d ⇒ vi ≠ ti - si with probability ≥ 1/2
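The loop above in numpy, with bdd_oracle(B, t) as an assumed black box and d·√(n log n) as the sampling radius.

    import numpy as np

    def gapsvp_via_bdd(B, d, bdd_oracle, rng, repetitions=100):
        # Returns True for YES (a short vector was detected), False for NO.
        n = B.shape[0]
        radius = d * np.sqrt(n * np.log(n))
        for _ in range(repetitions):
            direction = rng.normal(size=n)
            direction /= np.linalg.norm(direction)
            s = radius * rng.uniform() ** (1.0 / n) * direction   # uniform in the ball of that radius
            coords = np.linalg.solve(B, s)
            t = B @ (coords - np.floor(coords))                   # t = s mod B
            v = bdd_oracle(B, t)
            if not np.allclose(v, t - s):                         # the oracle failed to point at t - s
                return True
        return False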

44
Case 1: λ1(L(B)) > g·√(n log n)·d
  • Recall ‖si‖ ≤ d·√(n log n), ti = si mod B
  • ⇒ ti is ≤ d·√(n log n) away from vi = ti - si ∈ L(B)
  • ⇒ (B, ti) satisfies the promise of BDD1/g
  • ⇒ BDD1/g(B, ti) will return some vector in L(B)
  • Any other L(B) point has distance from ti at
    least λ1(L(B)) - d·√(n log n) > (g-1)·d·√(n log n)
  • ⇒ vi is the only answer that BDD1/g(B, ti) can return

45
Case 2: λ1(L(B)) ≤ d
  • Let u be the shortest nonzero vector in L(B), ‖u‖ = λ1
  • si is random in Ball(d·√(n log n))
  • With high probability si+u is also in the ball
  • ti = si mod B could just as well be chosen as
    ti = (si+u) mod B
  • Whatever BDD1/g(B, ti) returns, it differs from
    ti - si w.p. ≥ 1/2

[Figure: the ball of radius d·√(n log n), containing both s and s+u.]
46
Backup Slides
  • Regev's Decision-to-Search uSVP
  • Regev's dimension reduction
  • Diophantine Approximation

47
uSVP Decision ⇒ Search
?
Search-uSVP
Decision mod-p problem
Decision-uSVP
48
Reduction from Decision mod-p
?
  • Given a basis (v1,…,vn) for an n^1.5-unique lattice,
    and a prime p > n^1.5
  • Assume the shortest vector is
  • u = a1·v1 + a2·v2 + … + an·vn
  • Decide whether a1 is divisible by p

49
Reduction to Decision uSVP
?
  • Given a lattice, distinguish between:
  • Case 1. The shortest vector is of length 1/n and all
    non-parallel vectors are of length more than √n
  • Case 2. The shortest vector is of length more than √n

50
The reduction
?
  • Input: a basis (v1,…,vn) of an n^1.5-unique lattice
  • Scale the lattice so that the shortest vector is
    of length 1/n
  • Replace v1 by p·v1. Let M be the resulting lattice
  • If p | a1 then M has a shortest vector of length 1/n
    and all non-parallel vectors longer than √n
  • If p ∤ a1 then M has no vector shorter than √n

51
The input lattice L
?
[Figure: the input lattice L: the multiples of u are 1/n apart, and all non-parallel vectors are longer than √n.]
52
The lattice M
?
  • The lattice M is spanned by p·v1, v2,…,vn
  • If p | a1, then u = (a1/p)·(p·v1) + a2·v2 + … + an·vn ∈ M

[Figure: the lattice M when p | a1: u ∈ M, so λ1(M) = 1/n and all non-parallel vectors are longer than √n.]
53
The lattice M
?
  • The lattice M is spanned by p·v1, v2,…,vn
  • If p ∤ a1, then u ∉ M

[Figure: the lattice M when p ∤ a1: u ∉ M (only the multiples ±p·u survive), so λ1(M) > √n.]
54
uSVP Decision ⇒ Search
?
Search-uSVP
Decision mod-p problem
?
Decision-uSVP
55
Reduction from Decision mod-p
?
  • Given a basis (v1,…,vn) for an n^1.5-unique lattice,
    and a prime p > n^1.5
  • Assume the shortest vector is
  • u = a1·v1 + a2·v2 + … + an·vn
  • Decide whether a1 is divisible by p

56
The Reduction
?
  • Idea: decrease the coefficients of the shortest
    vector
  • If we find out that p | a1 then we can replace the
    basis with p·v1, v2,…,vn
  • u is still in the new lattice:
  • u = (a1/p)·(p·v1) + a2·v2 + … + an·vn
  • The same can be done whenever p | ai for some i

57
The Reduction
?
  • But what if p ∤ ai for all i?
  • Consider the basis v1, v2-v1, v3,…,vn
  • The shortest vector is
  • u = (a1+a2)·v1 + a2·(v2-v1) + a3·v3 + … + an·vn
  • The first coefficient is a1+a2
  • Similarly, we can set it to
  • a1-⌊p/2⌋·a2, …, a1-a2, a1, a1+a2, …, a1+⌊p/2⌋·a2
  • One of them is divisible by p, so we choose it
    and continue (one step is sketched below)
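One step of slides 56-57 as a numpy sketch (the next slide describes how the step is repeated); the Decision mod-p oracle is an assumed black box that, given a basis, says whether the coefficient of the first basis vector in the shortest vector is divisible by p.

    import numpy as np

    def reduce_first_coefficient(V, p, first_coeff_divisible_by_p):
        # V is an n x n basis with columns v1..vn.
        for k in range(-(p // 2), p // 2 + 1):
            W = V.copy()
            W[:, 1] = V[:, 1] - k * V[:, 0]      # basis (v1, v2 - k*v1, v3, ...):
            #                                      the first coefficient of u becomes a1 + k*a2
            if first_coeff_divisible_by_p(W):
                W[:, 0] = p * W[:, 0]            # absorb the factor p; u is still in the new lattice
                return W
        raise RuntimeError("no shift worked (cannot happen when p does not divide a2)")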

58
The Reduction
  • Repeating this process decreases the coefficients
    of u by a factor of p at a time
  • The basis that we started from had coefficients ≤ 2^(2n)
  • The coefficients are integers
  • ⇒ After ≤ 2n² steps, all the coefficients but one
    must be zero
  • The last vector standing must be ±u

59
Regev's dimension reduction
60
Reducing from n to 1-dimension
?
  • Distinguish between the 1-dimensional
    distributions

[Figure: the two 1-dimensional distributions on the interval [0, R): Uniform vs. Wavy.]
61
Reducing from n to 1-dimension
?
  • First attempt: sample and project to a line

62
Reducing from n to 1-dimension
?
  • But then we lose the wavy structure!
  • We should project only from points very close to
    the line

63
The solution
?
  • Use the periodicity of the distribution
  • Project on a dense line

64
The solution
?
65
The solution
?
  • We choose the line that connects the origin to
    e1 + K·e2 + K²·e3 + … + K^(n-1)·en, where K is large enough
  • The distance between hyperplanes is n
  • The sides are of length 2^n
  • Therefore, we choose K = 2^O(n)
  • Hence, d < O(K^n) = 2^O(n²)

66
Worst-case vs. Average-case
?
  • So far: a problem that is hard in the worst case:
    distinguish between the uniform and the d,?-wavy
    distributions for all integers d < 2^(n²)
  • For cryptographic applications, we would like to
    have a problem that is hard on the average:
    distinguish between the uniform and the d,?-wavy
    distributions for a non-negligible fraction of d
    in [2^(n²), 2·2^(n²)]

67
Compressing
?
  • The following procedure transforms the d,?-wavy
    distribution into the 2d,?-wavy one, for all integers d
  • Sample a from the distribution
  • Return either a/2 or (a+R)/2, each with probability ½
    (sketched below)
  • In general, for any real α ≥ 1, we can compress
    d,?-wavy into αd,?-wavy
  • Notice that compressing preserves the uniform
    distribution
  • We show a reduction from worst-case to
    average-case
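The factor-2 compression step as a tiny numpy sketch (R is the length of the interval the 1-dimensional samples live on; the generalization to an arbitrary real factor α is not shown).

    import numpy as np

    def compress_by_two(sample_dist, R, rng):
        # Turn a sampler for a d-wavy distribution on [0, R) into a 2d-wavy one.
        a = sample_dist()                               # one sample from the given distribution
        return a / 2 if rng.random() < 0.5 else (a + R) / 2

    # sanity check: compressing the uniform distribution leaves it uniform
    rng = np.random.default_rng(3)
    R = 1.0
    uniform = lambda: rng.uniform(0.0, R)
    samples = [compress_by_two(uniform, R, rng) for _ in range(10000)]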

68
Reduction
?
  • Assume there exists a distinguisher between the uniform
    and the d,?-wavy distributions for some
    non-negligible fraction of d in [2^(n²), 2·2^(n²)]
  • Given either a uniform or a d,?-wavy distribution
    for some integer d < 2^(n²), repeat the following:
  • Choose α in [1, 2·2^(n²)] according to a certain
    distribution
  • Compress the distribution by α
  • Check the distinguisher's acceptance probability
  • If for some α the acceptance probability differs
    from that of uniform sequences, return "wavy";
    otherwise, return "uniform"

69
Reduction
?
  • If the distribution is uniform:
  • After compression it is still uniform
  • Hence, the distinguisher's acceptance probability
    equals that of uniform sequences, for all α
  • If the distribution is d,?-wavy:
  • After compression it is in the good range with
    some probability
  • Hence, for some α, the distinguisher's
    acceptance probability differs from that of
    uniform sequences

[Figure: compression maps d into the target range [2^(n²), 2·2^(n²)].]
70
Diophantine Approximation
71
Solving for u (from slide 24)
  • Recall: we have B = (b1,…,bn) and ũ
  • The shortest vector u ∈ L(B) is u = Σ mi·bi, |mi| < 2^n
  • Because the basis B is LLL-reduced
  • ũ is very very close to u/‖u‖
  • u/‖u‖ = ũ + e, ‖e‖ ≈ 1/N, N ≥ 2^n (e.g., N = 2^(n²))
  • Express ũ = Σ xi·bi (the xi's are reals)
  • Set ηi = xi/xn for i = 1,…,n-1
  • ηi is very very close to mi/mn ( ηi·mn - mi = O(2^n/N) )

72
Diophantine Approximation
  • Look for mn < 2^n s.t. for all i, ηi·mn is within 2^n/N
    of an integer (for N = 2^(n²))
  • z is the unique shortest vector in L(M), by a factor
    of about N/2^n
  • Use LLL to find it
  • Compute the mi's and u

      ( 1            η1   )
      (    1         η2   )     integer vector: (m1, …, mn) (up to signs)
  M = (      …            )     short lattice point z: entries ηi·mn - mi (i < n)
      (         1    ηn-1 )     and mn/N, all of size O(2^n/N)
      (              1/N  )
73
Why is z unique-shortest?
  • Assume we have another short vector y ∈ L(M)
  • Its coefficient m'n is not much larger than 2^n, and
    neither are the other m'i's
  • Every small y ∈ L(M) corresponds to v ∈ L(B) such
    that v/‖v‖ is very very close to ũ
  • So v/‖v‖ is also very very close to u/‖u‖ (within ≈ 2^n/N)
  • Smallish coefficients ⇒ v is not too long (≤ 2^(2n))
  • ⇒ v is very close to its projection on u (within ≈ 2^(3n)/N)
  • ⇒ ∃ c s.t. (v - c·u) ∈ L(B) is short
  • Of length ≤ 2^(3n)/N + λ1/2 < λ1
  • ⇒ v must be a multiple of u

[Figure: v makes an angle of ≈ 2^n/N with u; since ‖v‖ ≤ 2^(2n), v lies within 2^(3n)/N of the line through u.]