Cryptography and Privacy Preserving Operations Lecture 1 - PowerPoint PPT Presentation

1
Cryptography and Privacy Preserving Operations
Lecture 1
  • Lecturer: Moni Naor
  • Weizmann Institute of Science

2
What is Cryptography?
Traditionally: how to maintain secrecy in
communication
Alice and Bob talk while Eve tries to listen
Bob
Alice
Eve
3
History of Cryptography
  • Very ancient occupation
  • Many interesting books and sources, especially
    about the Enigma
  • David Kahn, The Codebreakers, 1967
  • Gaj and Orlowski, Facts and Myths of Enigma
    Breaking Stereotypes, Eurocrypt 2003
  • Not the subject of this course

4
Modern Times
  • Up to the mid 70s - mostly classified military
    work
  • Exceptions: Shannon, Turing
  • Since then - explosive growth
  • Commercial applications
  • Scientific work: tight relationship with
    Computational Complexity Theory
  • Major works: Diffie-Hellman; Rivest, Shamir and
    Adleman (RSA)
  • Recently - more involved models for more diverse
    tasks.
  • How to maintain secrecy, integrity and
    functionality in computer and communication
    systems.

5
Cryptography and Complexity
  • Complexity Theory -
  • Study the resources needed to solve computational
    problems
  • computer time, memory
  • Identify problems that are infeasible to
    compute.
  • Cryptography -
  • Find ways to specify security requirements of
    systems
  • Use the computational infeasibility of problems
    in order to obtain security.

The development of these two areas is tightly
connected! The interplay between these areas is
the subject of the course
6
Key Idea of Cryptography
  • Use the intractability of some problems to
    advantage in constructing secure systems

7
Short Course Outline
  • First part of this short course
  • Alice and Bob want to cooperate
  • Eve wants to interfere
  • One-way functions,
  • pseudo-random generators
  • pseudo-random functions
  • Encryption
  • Second part
  • Alice and Bob do not quite trust each other
  • Zero-knowledge protocols
  • Secure function evaluation

8
Three Basic Issues in Cryptography
  • Identification
  • Authentication
  • Encryption

9
Example Identification
  • When the time is right, Alice wants to send an
    approve message to Bob.
  • They want to prevent Eve from interfering
  • Bob should be sure that Alice indeed approves

Alice
Bob
Eve
10
Rigorous Specification of Security
  • To define the security of a system one must specify
  • What constitutes a failure of the system
  • The power of the adversary
  • computational power
  • access to the system
  • what it means to break the system

11
Specification of the Problem
  • Alice and Bob communicate through a channel
  • Bob has two external states N,Y
  • Eve completely controls the channel
  • Requirements
  • If Alice wants to approve and Eve does not
    interfere Bob moves to state Y
  • If Alice does not approve, then for any behavior
    from Eve, Bob stays in N
  • If Alice wants to approve and Eve does interfere
    - no requirements from the external state

12
Can we guarantee the requirements?
  • No: when Alice wants to approve she sends (and
    receives) a finite set of bits on the channel.
    Eve can guess them.
  • To the rescue - probability.
  • Want Eve to succeed with only low probability.
  • How low? Related to the length of the string that
    Alice sends

13
Example Identification
X
X
Alice
Bob
??
Eve
14
Suppose there is a setup period
  • There is a setup phase where Alice and Bob can
    agree on a common secret
  • Eve only controls the channel; she does not see
    the internal state of Alice and Bob (only the
    external state of Bob)
  • Simple solution
  • Alice and Bob choose a random string X ∈_R {0,1}^n
  • When Alice wants to approve she sends X
  • If Bob gets any symbols on the channel, he
    compares them to X
  • If equal, moves to Y
  • If not equal, moves permanently to N
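The simple solution above can be sketched in a few lines of Python (the names `setup` and `bob_verify` are ours, not from the slides):

```python
import secrets

def setup(n_bits=128):
    """Setup phase: Alice and Bob agree on a uniformly random secret X."""
    return secrets.token_bytes(n_bits // 8)

def bob_verify(x, received):
    """Bob compares whatever arrives on the channel to X.

    Returns his external state: Y on a match, N otherwise."""
    return "Y" if received == x else "N"
```

Alice approves by sending X itself; Eve, who never saw X, must guess all n bits at once.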

15
Eve's probability of success
  • If Alice did not send X and Eve put some string
    X′ on the channel, then
  • Bob moves to Y only if X′ = X
  • Prob[X′ = X] = 2^{-n}
  • Good news: can make this as small as we wish
  • What to do if Alice and Bob cannot agree on a
    uniformly generated string X?

16
Less than perfect random variables
  • Suppose X is chosen according to some
    distribution P_X over some set of symbols Γ
  • What is Eve's best strategy?
  • What is her probability of success?

17
(Shannon) Entropy
  • Let X be a random variable over alphabet Γ with
    distribution P_X
  • The (Shannon) entropy of X is
  • H(X) = - ∑_{x∈Γ} P_X(x) log P_X(x)
  • where we take 0 log 0 to be 0.
  • Represents how much we can compress X
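The definition translates directly into a small Python helper (the function name is ours):

```python
from math import log2

def shannon_entropy(dist):
    """H(X) = -sum over x of P(x) * log2(P(x)), taking 0 log 0 = 0.

    dist maps each symbol of the alphabet to its probability."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)
```

For a fair coin this gives 1 bit; for a constant it gives 0, matching the examples on the next slide.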

18
Examples
  • If X ≡ 0 (constant) then H(X) = 0
  • The only case where H(X) = 0 is when X is constant
  • In all other cases H(X) > 0
  • If X ∈ {0,1} with Prob[X=0] = p and
    Prob[X=1] = 1-p, then
  • H(X) = -p log p - (1-p) log (1-p) =: H(p)
  • If X ∈ {0,1}^n is uniformly distributed,
    then
  • H(X) = - ∑_{x∈{0,1}^n} (1/2^n) log (1/2^n)
    = 2^n · (n/2^n) = n

19
Properties of Entropy
  • Entropy is bounded: H(X) ≤ log |Γ|, with equality
    only if X is uniform over Γ

20
Does High Entropy Suffice for Identification?
  • If Alice and Bob agree on X ∈ {0,1}^n where X has
    high entropy (say H(X) ≥ n/2), what are Eve's
    chances of cheating?
  • Can be high, say
  • Prob[X = 0^n] = 1/2
  • For any x ∈ 1{0,1}^{n-1}: Prob[X = x] = 1/2^n
  • Then H(X) = n/2 + 1/2
  • But Eve can cheat with probability at least 1/2 by
    guessing that X = 0^n
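The gap on this slide can be checked numerically; a sketch (the helper names are ours):

```python
from math import log2

def shannon_entropy(dist):
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def min_entropy(dist):
    # H_min(X) = -log2 of the largest point probability
    return -log2(max(dist.values()))

n = 10
# X = 0^n with probability 1/2; each x in 1{0,1}^(n-1) with probability 2^-n
dist = {"0" * n: 0.5}
for i in range(2 ** (n - 1)):
    dist["1" + format(i, f"0{n-1}b")] = 2.0 ** (-n)
```

Shannon entropy comes out to n/2 + 1/2 = 5.5 bits, yet the min entropy (next slide) is only 1 bit: Eve guesses 0^n and cheats with probability 1/2.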

21
Another Notion Min Entropy
  • Let X be a random variable over alphabet Γ with
    distribution P_X
  • The min entropy of X is
  • H_min(X) = - log max_{x∈Γ} P_X(x)
  • The min entropy corresponds to the probability of
    the most likely value of X
  • Property: H_min(X) ≤ H(X)
  • Why?

22
High Min Entropy and Passwords
  • Claim: if Alice and Bob agree on X such that
  • H_min(X) ≥ m, then the probability that Eve
    succeeds in cheating is at most 2^{-m}
  • Proof: make Eve deterministic by picking her
    best choice, X′ = x.
  • Prob[X = x] = P_X(x) ≤ max_{x∈Γ} P_X(x)
    = 2^{-H_min(X)} ≤ 2^{-m}
  • Conclusion: passwords should be chosen to have
    high min-entropy!

23
  • Good source on Information Theory
  • T. Cover and J. A. Thomas, Elements of
    Information Theory

24
One-time vs. many times
  • This was good for a single identification. What
    about many identifications?
  • Later

25
A different scenario - now Charlie is involved
  • Bob has no proof that Alice indeed identified
  • If there are two possible verifiers, Bob and
    Charlie, they can each pretend to the other to
    be Alice
  • They can each have their own string
  • But assume that they share the setup phase
  • Whatever Bob knows, Charlie knows
  • Relevant when there are many possible verifiers!

26
The new requirement
  • If Alice wants to approve and Eve does not
    interfere Bob moves to state Y
  • If Alice does not approve, then for any behavior
    from Eve and Charlie, Bob stays in N
  • Similarly if Bob and Charlie are switched

Charlie
Alice
Bob
Eve
27
Can we achieve the requirements?
  • Observation: what Bob and Charlie received in
    the setup phase might as well be public
  • Therefore we can reduce to the previous scenario
    (with no setup)
  • To the rescue - complexity
  • Alice should be able to perform something that
    neither Bob nor Charlie (nor Eve) can do
  • Must assume that the parties are not
    computationally all powerful!

28
Functions and inversion
  • We say that a function f is hard to invert if,
    given y = f(x), it is hard to find x′ such that
    y = f(x′)
  • x′ need not be equal to x
  • We will use f^{-1}(y) to denote the set of
    preimages of y
  • To discuss "hard" we must specify a computational
    model
  • We use two flavors
  • Concrete
  • Asymptotic

29
One-way functions - asymptotic
  • A function f: {0,1}* → {0,1}* is called a
    one-way function, if
  • f is a polynomial-time computable function
  • Also: polynomial relationship between input and
    output length
  • for every probabilistic polynomial-time algorithm
    A, every positive polynomial p(·), and all
    sufficiently large n
  • Prob[A(f(x)) ∈ f^{-1}(f(x))] < 1/p(n)
  • where x is chosen uniformly in {0,1}^n and the
    probability is also over the internal coin flips
    of A

30
One-way functions concrete version
  • A function f: {0,1}^n → {0,1}^n is called a (t,ε)
    one-way function, if
  • f is a polynomial-time computable function
    (independent of t)
  • for every t-time algorithm A,
  • Prob[A(f(x)) ∈ f^{-1}(f(x))] ≤ ε
  • where x is chosen uniformly in {0,1}^n and the
    probability is also over the internal coin flips
    of A
  • Can either think of t and ε as being fixed, or as
    functions t(n), ε(n)

31
Complexity Theory and One-way Functions
  • Claim: if P = NP then there are no one-way
    functions
  • Proof: for any one-way function
  • f: {0,1}^n → {0,1}^n
  • consider the language L_f
  • consisting of strings of the form (y, b_1, b_2, …, b_k) where
  • there is an x ∈ {0,1}^n such that y = f(x) and
  • the first k bits of x are b_1, b_2, …, b_k
  • L_f is in NP: guess x and check
  • If L_f is in P, then f is invertible in polynomial
    time
  • Self reducibility

32
A few properties and questions concerning one-way
functions
  • Major open problem: connect the existence of
    one-way functions and the P = NP? question
  • If f is one-to-one it is called a one-way
    permutation. In what complexity class does the
    problem of inverting one-way permutations reside?
  • good exercise!
  • If f is a one-way function, is f′, where f′(x)
    is f(x) with the last bit chopped, a one-way
    function?
  • If f is a one-way function, is f_L, where f_L(x)
    consists of the first half of the bits of f(x), a
    one-way function?
  • good exercise!
  • If f is a one-way function, is g(x) = f(f(x))
    necessarily a one-way function?
  • good exercise!

33
Solution to the password problem
  • Assume that
  • f: {0,1}^n → {0,1}^n is a (t,ε) one-way function
  • The adversary's running time is bounded by t
  • Setup phase
  • Alice chooses x ∈_R {0,1}^n
  • computes y = f(x)
  • gives y to Bob and Charlie
  • When Alice wants to approve she sends x
  • If Bob gets any symbols on the channel - call them
    z - he computes f(z) and compares it to y
  • If equal, moves to state Y
  • If not equal, moves permanently to state N
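A sketch of this scheme in Python, using SHA-256 as a stand-in for the one-way function f (an assumption: SHA-256 is only conjectured hard to invert; the slides mention DES in the Unix scheme):

```python
import hashlib
import secrets

def f(x: bytes) -> bytes:
    """Stand-in one-way function (assumption: SHA-256 is hard to invert)."""
    return hashlib.sha256(x).digest()

# Setup phase: Alice chooses x at random, gives y = f(x) to Bob and Charlie.
x = secrets.token_bytes(16)
y = f(x)

def bob_verify(z: bytes) -> str:
    """Bob computes f(z) on whatever arrives and compares to the stored y."""
    return "Y" if f(z) == y else "N"
```

Bob and Charlie only ever hold y, so neither can impersonate Alice to the other without inverting f.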

34
Eve's and Charlie's probability of success
  • If Alice did not send x and Eve (or Charlie) put
    some string x′ on the channel to Bob, then
  • Bob moves to state Y only if f(x′) = y = f(x)
  • But we know that
  • Prob[A(f(x)) ∈ f^{-1}(f(x))] ≤ ε
  • or else we can use Eve to break the
    one-way function
  • Good news: if ε can be made as small as we wish,
    then we have a good scheme.
  • Can be used for monitoring
  • Similar to the Unix password scheme
  • f(x) stored in the login file
  • DES used as the one-way function.

35
Reductions
  • This is a simple example of a reduction
  • Simulate Eve's algorithm in order to break the
    one-way function
  • Most reductions are much more involved

36
Cryptographic Reductions
  • Show how to use an adversary for breaking
    primitive 1 in order to break primitive 2
  • Important parameters:
  • Run time: how does T_1 relate to T_2
  • Probability of success: how does ε_1 relate to ε_2
  • Access to the system: 1 vs. 2

37
Are one-way functions essential to the two guards
password problem?
  • Precise definition
  • for every probabilistic polynomial-time algorithm
    A controlling Eve and Charlie
  • every polynomial p(·),
  • and all sufficiently large n
  • Prob[Bob moves to Y | Alice does not approve]
    < 1/p(n)
  • Recall the observation: what Bob and Charlie
    received in the setup phase might as well be
    public
  • Claim: can get rid of interaction
  • given an interactive identification protocol,
    possible to construct a noninteractive one. In the
    new protocol
  • Alice sends Bob the random bits Alice used to
    generate the setup information
  • Bob simulates the conversation between Alice
    and Bob in the original protocol and accepts only
    if the simulated Bob accepts.
  • Probability of cheating is the same

38
One-way functions are essential to the two guards
password problem
  • Are we done? Given a noninteractive
    identification protocol, we want to define a
    one-way function
  • Define the function f(r) as the mapping that Alice
    performs in the setup phase between her random bits
    r and the information y given to Bob and Charlie
  • Problem: the function f(r) is not necessarily
    one-way
  • There can be unlikely ways to generate it, which
    can be exploited to invert.
  • Example: Alice chooses x, x′ ∈ {0,1}^n; if
    x′ = 0^n set y = x, otherwise set y = f(x)
  • The protocol is still secure, but with
    probability 1/2^n not complete
  • The resulting function f′(x,x′) is easy to invert:
  • given y ∈ {0,1}^n, set the inverse as (y, 0^n)

39
One-way functions are essential to the two guards
password problem
  • However: it is possible to estimate the
    probability that Bob accepts on a given string
    from Alice
  • Second attempt: define the function f(r) as
  • the mapping that Alice performs in the setup phase
    between her random bits r and the information
    given to Bob and Charlie,
  • plus a bit indicating whether the probability that
    Bob accepts given r is greater than 2/3
  • Theorem: the two guards password problem has a
    solution if and only if one-way functions exist

40
Examples of One-way functions
  • Examples of hard problems
  • Subset sum
  • Discrete log
  • Factoring (numbers, polynomials) into prime
    components
  • How do we get a one-way function out of them?
41
Subset Sum
  • Subset sum problem: given
  • n numbers 0 ≤ a_1, a_2, …, a_n < 2^m
  • target sum T
  • find a subset S ⊆ {1,...,n} with ∑_{i∈S} a_i = T
  • (n,m)-subset sum assumption: for uniformly chosen
  • a_1, a_2, …, a_n ∈_R {0,…,2^m - 1} and S ⊆ {1,...,n}
  • for any probabilistic polynomial time algorithm,
    the probability of finding S′ ⊆ {1,...,n} such
    that
  • ∑_{i∈S′} a_i = ∑_{i∈S} a_i
  • is negligible, where the probability is over the
    random choice of the a_i's, S and the inner coin
    flips of the algorithm
  • Subset sum one-way function
    f: {0,1}^{mn+n} → {0,1}^{mn+m}
  • f(a_1, a_2, …, a_n, b_1, b_2, …, b_n) =
  • (a_1, a_2, …, a_n, ∑_{i=1}^n b_i a_i mod 2^m)
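The candidate one-way function above can be sketched directly (parameter names are ours):

```python
def subset_sum_f(a, b, m):
    """f(a_1..a_n, b_1..b_n) = (a_1..a_n, sum of the a_i with b_i = 1, mod 2^m).

    a: list of n numbers in [0, 2^m); b: list of n selection bits."""
    t = sum(ai for ai, bi in zip(a, b) if bi) % (2 ** m)
    return tuple(a), t
```

Inverting f on a random output means finding some bit vector b′ whose selected subset hits the same sum, which is exactly the (n,m)-subset sum problem.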

42
Exercise
  • Show a function f such that
  • if f is polynomial-time invertible on all
    inputs, then P = NP
  • yet f is not one-way

43
Discrete Log Problem
  • Let G be a group and g an element in G.
  • Let y = g^z and let x be the minimal non-negative
    integer satisfying y = g^x.
  • x is called the discrete log of y to base g.
  • Example: y = g^x mod p in the multiplicative group
    of Z_p
  • In general: easy to exponentiate via repeated
    squaring
  • Consider the binary representation of x
  • What about discrete log?
  • If difficult, f(g,x) = (g, g^x) is a one-way
    function
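The easy direction, repeated squaring over the binary representation of the exponent, can be sketched as follows (a standard textbook sketch; Python's built-in `pow(g, x, p)` computes the same thing):

```python
def mod_exp(g, x, p):
    """Compute g^x mod p by repeated squaring, walking the bits of x."""
    result = 1
    base = g % p
    while x > 0:
        if x & 1:                     # current bit of x is set: multiply it in
            result = (result * base) % p
        base = (base * base) % p      # square for the next bit position
        x >>= 1
    return result
```

This takes O(log x) multiplications, while no comparably fast algorithm is known for recovering x from g^x in groups where discrete log is believed hard.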

44
Integer Factoring
  • Consider f(x,y) = x · y
  • Easy to compute
  • Is it one-way?
  • No: if f(x,y) is even, can set the inverse as
    (f(x,y)/2, 2)
  • If factoring a number into prime factors is hard
  • specifically, given N = P · Q, the product of two
    random large (n-bit) primes, it is hard to factor
  • then f is somewhat hard: there is a
    non-negligible fraction of such numbers (≈ 1/n^2
    from the density of primes)
  • Hence a weak one-way function
  • Alternatively
  • let g(r) be a function mapping random bits into
    random primes.
  • The function f(r_1,r_2) = g(r_1) · g(r_2) is
    one-way

45
Weak One-way function
  • A function f: {0,1}^n → {0,1}^n is called a weak
    one-way function, if
  • f is a polynomial-time computable function
  • There exists a polynomial p(·) such that for every
    probabilistic polynomial-time algorithm A, and
    all sufficiently large n,
  • Prob[A(f(x)) ∈ f^{-1}(f(x))] < 1 - 1/p(n)
  • where x is chosen uniformly in {0,1}^n and the
    probability is also over the internal coin flips
    of A

46
Exercise weak exist if strong exists
  • Show that if strong one-way functions exist, then
    there exists a function which is a weak one-way
    function but not a strong one

47
What about the other direction?
  • Given
  • a function f that is guaranteed to be weakly
    one-way
  • let p(n) be such that
    Prob[A(f(x)) ∈ f^{-1}(f(x))] < 1 - 1/p(n)
  • can we construct a function g that is (strongly)
    one-way?
  • An instance of a hardness amplification problem
  • Simple idea: repetition. For some polynomial q(n)
    define
  • g(x_1, x_2, …, x_{q(n)}) = (f(x_1), f(x_2), …,
    f(x_{q(n)}))
  • To invert g, need to succeed in inverting f in all
    q(n) places
  • If q(n) = p^2(n), success seems unlikely:
    (1-1/p(n))^{p^2(n)} ≈ e^{-p(n)}
  • But how do we show it? The sequential repetition
    intuition is not a proof.
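The construction and the heuristic estimate above can be sketched numerically (a sketch only; the estimate assumes independent inversion attempts, which is exactly what the proof cannot assume):

```python
from math import exp

def g(f, xs):
    """g(x_1,...,x_q) = (f(x_1),...,f(x_q)).

    Inverting g means inverting f at every coordinate simultaneously."""
    return tuple(f(x) for x in xs)

# Heuristic estimate from the slide: if each f-inversion succeeds with
# probability at most 1 - 1/p, then q = p^2 independent successes happen
# with probability about e^{-p}.
p = 20
q = p * p
prob_all = (1 - 1 / p) ** q
```

For p = 20 this is already below e^{-20}, i.e. vanishingly small, which is why repetition is the natural amplification candidate.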

48
Want: inverting g with low probability implies
inverting f with high probability
  • Given a machine A that inverts g, want a machine
    A′
  • operating in similar time bounds
  • that inverts f with high probability
  • Idea: given y = f(x), plug it into some place in g
    and generate the rest of the locations at random
  • z = (y, f(x_2), …, f(x_{q(n)}))
  • Ask machine A to invert g at point z
  • Probability of success should be at least
    (exactly) A's probability of inverting g at a
    random point
  • Once is not enough
  • How to amplify?
  • Repeat while keeping y fixed
  • Put y at a random position (or sort the inputs to
    g)

49
Proof of Amplification for Repetition of Two
  • Concentrate on a repetition of two:
    g(x_1, x_2) = (f(x_1), f(x_2))
  • Goal: show that the probability of inverting g is
    roughly the square of the probability of
    inverting f
  • just as it would be with sequential repetition
  • Claim:
  • Let α(n) be a function that for some
    p(n) satisfies
  • 1/p(n) ≤ α(n) ≤ 1 - 1/p(n)
  • Let ε(n) be any inverse polynomial
    function
  • Suppose that for every polynomial
    time A and sufficiently large n
  • Prob[A(f(x)) ∈ f^{-1}(f(x))] ≤ α(n)
  • Then for every polynomial time B and
    sufficiently large n
  • Prob[B(g(x_1, x_2)) ∈ g^{-1}(g(x_1, x_2))] ≤
    α^2(n) + ε(n)

50
Proof of Amplification for Two Repetition
  • Suppose not; then given an algorithm B that
    inverts g with probability better than α^2 + ε,
    construct the following
  • B′(y): inversion algorithm for f
  • Repeat t times:
  • Choose x′ at random and compute y′ = f(x′)
  • Run B(y,y′).
  • Check the results
  • If correct: halt with success
  • Output failure

Inner loop
Helpful for constructive algorithm
51
Probability of Success
  • Define
  • S = { y = f(x) | Prob[inner loop successful | y] > β }
  • Since the choices of the x′ are independent
  • Prob[B′ succeeds | y ∈ S] > 1 - (1-β)^t
  • Taking t = n/β means that when y ∈ S, almost
    surely B′ will invert it
  • Hence want to show that Prob[y ∈ S] > α(n)

52
The success of B
  • Fix the random bits of B. Define
  • P = { (y_1, y_2) | B succeeds on (y_1, y_2) }
  • P ⊆ (P ∩ { (y_1, y_2) | y_1, y_2 ∈ S })
  • ∪ (P ∩ { (y_1, y_2) | y_1 ∉ S })
  • ∪ (P ∩ { (y_1, y_2) | y_2 ∉ S })

(Diagram: the "well behaved part" is the square where
both y_1, y_2 ∈ S; want to bound Prob[P] by a square)
53
S is the only success..
  • But
  • Prob[B(y_1, y_2) ∈ g^{-1}(y_1, y_2) | y_1 ∉ S] ≤ β
  • and similarly
  • Prob[B(y_1, y_2) ∈ g^{-1}(y_1, y_2) | y_2 ∉ S] ≤ β
  • so
  • Prob[(y_1, y_2) ∈ P and y_1, y_2 ∈ S]
  • ≥ Prob[(y_1, y_2) ∈ P] - 2β
  • ≥ α^2 + ε - 2β
  • Setting β = ε/3 we have
  • Prob[(y_1, y_2) ∈ P and y_1, y_2 ∈ S] ≥ α^2 + ε/3

54
Contradiction
  • But
  • Prob[(y_1, y_2) ∈ P and y_1, y_2 ∈ S]
  • ≤ Prob[y_1 ∈ S] · Prob[y_2 ∈ S]
  • = Prob^2[y ∈ S]
  • So
  • Prob[y ∈ S] ≥ √(α^2 + ε/3) > α

55
The Encryption problem
  • Alice wants to send a message m ∈ {0,1}^n
    to Bob
  • Set-up phase is secret
  • They want to prevent Eve from learning anything
    about the message

m
Alice
Bob
Eve
56
The encryption problem
  • Relevant both in the shared key and in the public
    key setting
  • Want to use many times
  • Also add authentication
  • Other disruptions by Eve

57
What does "learn" mean?
  • Whatever knowledge Eve has of m should remain the
    same, e.g.:
  • Probability of guessing m
  • Min entropy of m
  • Probability of guessing whether m is m_0 or m_1
  • Probability of computing some function f of m
  • Ideally: the message sent is independent of the
    message m
  • Implies all the above
  • Shannon: achievable only if the entropy of the
    shared secret is at least as large as the entropy
    of the message m
  • If no special knowledge about m
  • then m
  • Achievable: the one-time pad.
  • Let r ∈_R {0,1}^n
  • Think of r and m as elements in a group
  • To encrypt m, send r + m
  • To decrypt z, compute m = z - r
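The one-time pad over the group ({0,1}^n, XOR) is a two-liner (a sketch; function names are ours):

```python
import secrets

def otp_encrypt(m: bytes, r: bytes) -> bytes:
    """Encrypt: send r + m, which in the XOR group is r XOR m."""
    assert len(m) == len(r)
    return bytes(mi ^ ri for mi, ri in zip(m, r))

# Decrypt: m = z - r; XOR is its own inverse, so decryption is the same map.
otp_decrypt = otp_encrypt
```

For a uniformly random r the ciphertext is itself uniform regardless of m, which is exactly the "message sent is independent of m" ideal above.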

58
Pseudo-random generators
  • Would like to stretch a short secret (seed) into
    a long one
  • The resulting long string should be usable in
    any case where a long string is needed
  • In particular as a one-time pad
  • Important notion: indistinguishability
  • Two probability distributions that cannot be
    distinguished
  • Statistical indistinguishability: distances
    between probability distributions
  • New notion: computational indistinguishability

59
References
  • Books
  • O. Goldreich, Foundations of Cryptography - a
    book in three volumes.
  • Vol 1, Basic Tools, Cambridge, 2001
  • Pseudo-randomness, zero-knowledge
  • Vol 2, about to come out
  • (Encryption, Secure Function Evaluation)
  • Other volumes in
    www.wisdom.weizmann.ac.il/oded/books.html
  • M. Luby, Pseudorandomness and Cryptographic
    Applications, Princeton University Press

60
References
  • Web material/courses
  • S. Goldwasser and M. Bellare, Lecture Notes on
    Cryptography,
  • http://www-cse.ucsd.edu/mihir/papers/gb.html
  • Wagner/Trevisan, Berkeley
  • www.cs.berkeley.edu/daw/cs276
  • Ivan Damgard and Ronald Cramer, Cryptologic
    Protocol Theory
  • http://www.daimi.au.dk/ivan/CPT.html
  • Salil Vadhan, Pseudorandomness
  • http://www.courses.fas.harvard.edu/cs225/Lectures-2002/
  • Naor, Foundations of Cryptography and Estonian
    Course
  • www.wisdom.weizmann.ac.il/naor

61
Recap of Lecture 1
  • Key idea of cryptography: use computational
    intractability to your advantage
  • One-way functions are necessary and sufficient to
    solve the two guards identification problem
  • Notion of a reduction between cryptographic
    primitives
  • Amplification of weak one-way functions
  • Things are a bit more complex in the
    computational world (than in the information
    theoretic one)

62
Is there an ultimate one-way function?
  • If f_1: {0,1}* → {0,1}* and f_2: {0,1}* → {0,1}*
    are guaranteed to
  • be polynomial time computable
  • and at least one of them is one-way,
  • then we can construct a function g: {0,1}* →
    {0,1}* which is one-way
  • g(x_1, x_2) = (f_1(x_1), f_2(x_2))
  • If a 5n^2-time one-way function is guaranteed to
    exist, can construct an O(n^2 log n)-time one-way
    function g
  • Idea: enumerate Turing machines and make sure they
    run 5n^2 steps
  • g(x_1, x_2, …, x_{log n}) = (M_1(x_1), M_2(x_2),
    …, M_{log n}(x_{log n}))
  • If a one-way function is guaranteed to exist,
    then there exists a 5n^2-time one-way function
  • Idea: concentrate on the prefix

63
Conclusions
  • Be careful what you wish for
  • Problem with the resulting one-way function:
  • cannot learn about behavior on large inputs from
    small inputs
  • the whole rationale of considering asymptotic
    results is eroded
  • construction does not work for non-uniform
    one-way functions
  • Encryption: easy when you share very long strings
  • Started with the notion of pseudo-randomness

64
The Encryption problem
  • Alice wants to send a message m ∈ {0,1}^n
    to Bob
  • Set-up phase is secret
  • They want to prevent Eve from learning anything
    about the message

m
Alice
Bob
Eve
65
The encryption problem
  • Relevant both in the shared key and in the public
    key setting
  • Want to use many times
  • Also add authentication
  • Other disruptions by Eve

66
What does "learn" mean?
  • Whatever knowledge Eve has of m should remain the
    same, e.g.:
  • Probability of guessing m
  • Min entropy of m
  • Probability of guessing whether m is m_0 or m_1
  • Probability of computing some function f of m
  • Ideally: the message sent is independent of the
    message m
  • Implies all the above
  • Shannon: achievable only if the entropy of the
    shared secret is at least as large as the entropy
    of the message m
  • If no special knowledge about m
  • then m
  • Achievable: the one-time pad.
  • Let r ∈_R {0,1}^n
  • Think of r and m as elements in a group
  • To encrypt m, send r + m
  • To decrypt z, compute m = z - r

67
Pseudo-random generators
  • Would like to stretch a short secret (seed) into
    a long one
  • The resulting long string should be usable in
    any case where a long string is needed
  • In particular as a one-time pad
  • Important notion: indistinguishability
  • Two probability distributions that cannot be
    distinguished
  • Statistical indistinguishability: distances
    between probability distributions
  • New notion: computational indistinguishability

68
Computational Indistinguishability
  • Definition: two sequences of distributions {D_n}
    and {D′_n} on {0,1}^n are computationally
    indistinguishable if
  • for every polynomial p(n) and sufficiently large
    n, for every probabilistic polynomial time
    adversary A that receives input y ∈ {0,1}^n and
    tries to decide whether y was generated by D_n or
    D′_n
  • |Prob[A = 0 | D_n] - Prob[A = 0 | D′_n]| <
    1/p(n)
  • Without the restriction to probabilistic
    polynomial tests: equivalent to the variation
    distance being negligible
  • ∑_{β∈{0,1}^n} |Prob[D_n = β] - Prob[D′_n = β]| <
    1/p(n)

69
Pseudo-random generators
  • Definition: a function g: {0,1}* → {0,1}* is said
    to be a (cryptographic) pseudo-random generator
    if
  • It is polynomial time computable
  • It stretches the input: |g(x)| > |x|
  • denote by l(n) the length of the output on
    inputs of length n
  • If the input is random, the output is
    indistinguishable from random
  • For any probabilistic polynomial time adversary A
    that receives input y of length l(n) and tries to
    decide whether y = g(x) or is a random string from
    {0,1}^{l(n)}, for any polynomial p(n) and
    sufficiently large n
  • |Prob[A = "rand" | y = g(x)] - Prob[A = "rand" |
    y ∈_R {0,1}^{l(n)}]| < 1/p(n)
  • Important issues
  • Why is the adversary bounded by polynomial time?
  • Why is the indistinguishability not perfect?

70
Pseudo-random generators
  • Definition: a function g: {0,1}* → {0,1}* is said
    to be a (cryptographic) pseudo-random generator
    if
  • It is polynomial time computable
  • It stretches the input: |g(x)| > |x|
  • denote by l(n) the length of the output on
    inputs of length n
  • If the input (seed) is random, then the output is
    indistinguishable from random
  • For any probabilistic polynomial time adversary A
    that receives input y of length l(n) and tries to
    decide whether y = g(x) or is a random string from
    {0,1}^{l(n)}, for any polynomial p(n) and
    sufficiently large n
  • |Prob[A = "rand" | y = g(x)] - Prob[A = "rand" |
    y ∈_R {0,1}^{l(n)}]| < 1/p(n)
  • Want to use the output of a pseudo-random
    generator whenever long random strings are used
  • Especially encryption
  • but we have not defined the desired properties
    yet.

Anyone who considers arithmetical methods of
producing random numbers is, of course, in a
state of sin.

J. von Neumann
71
Important issues
  • Why is the adversary bounded by polynomial time?
  • Why is the indistinguishability not perfect?

72
Construction of pseudo-random generators
  • Idea: given a one-way function, there is a hard
    decision problem hidden there
  • If balanced enough, it looks random
  • Such a problem is a hardcore predicate
  • Possibilities
  • Last bit
  • First bit
  • Inner product

73
Hardcore Predicate
  • Definition: let f: {0,1}* → {0,1}* be a function.
    We say that h: {0,1}* → {0,1} is a hardcore
    predicate for f if
  • It is polynomial time computable
  • For any probabilistic polynomial time adversary A
    that receives input y = f(x) and tries to compute
    h(x), for any polynomial p(n) and sufficiently
    large n
  • |Prob[A(y) = h(x)] - 1/2| < 1/p(n)
  • where the probability is over the choice of y and
    the random coins of A
  • Sources of hardcoreness
  • not enough information about x
  • not of interest for generating pseudo-randomness
  • enough information about x, but hard to compute
    it

74
Exercises
  • Assume one-way functions exist
  • Show that the last bit/first bit are not
    necessarily hardcore predicates
  • Generalization: show that for any fixed function
    h: {0,1}* → {0,1} there is a one-way function
    f: {0,1}* → {0,1}* such that h is not a hardcore
    predicate of f
  • Show a one-way function f such that, given
    y = f(x), each input bit of x can be guessed with
    probability at least 3/4

75
Single bit expansion
  • Let f: {0,1}^n → {0,1}^n be a one-way permutation
  • Let h: {0,1}^n → {0,1} be a hardcore predicate
    for f
  • Consider g: {0,1}^n → {0,1}^{n+1} where
  • g(x) = (f(x), h(x))
  • Claim: g is a pseudo-random generator
  • Proof: can use a distinguisher for g to guess
    h(x)

(f(x), h(x)) vs. (f(x), 1-h(x))
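A toy sketch of g(x) = (f(x), h(x)). The instantiation below is purely illustrative: the modulus is tiny and h is a placeholder bit, not a proven hardcore predicate.

```python
def expand_one_bit(f, h, x):
    """g(x) = (f(x), h(x)): a one-way permutation plus its hardcore
    predicate stretch n pseudo-random bits into n+1."""
    return f(x), h(x)

# Toy instantiation: x -> 2^x mod 11 permutes {1,...,10}, since 2 generates
# Z_11^*. For large p this is the discrete-log one-way candidate; h below is
# just a placeholder bit, NOT a real hardcore predicate.
p, gen = 11, 2
f = lambda x: pow(gen, x, p)
h = lambda x: x & 1
```

The point of the claim is that the extra bit h(x) looks random to anyone who only sees f(x), precisely because predicting it would break the hardcore property.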
76
Hardcore Predicate With Public Information
  • Definition: let f: {0,1}* → {0,1}* be a function.
    We say that h: {0,1}* x {0,1}* → {0,1} is a
    hardcore predicate for f if
  • h(x,r) is polynomial time computable
  • For any probabilistic polynomial time adversary A
    that receives input y = f(x) and public randomness
    r and tries to compute h(x,r), for any polynomial
    p(n) and sufficiently large n
  • |Prob[A(y,r) = h(x,r)] - 1/2| < 1/p(n)
  • where the probability is over the choice of y, of
    r and the random coins of A
  • Alternative view: can think of the public
    randomness as modifying the one-way function f:
    f′(x,r) = (f(x), r).

77
Example weak hardcore predicate
  • Let h(x,i) = x_i
  • i.e., h selects the i-th bit of x
  • For any one-way function f, no polynomial time
    algorithm A(y,i) can have probability of success
    better than 1 - 1/2n of computing h(x,i)
  • Exercise: let c: {0,1}* → {0,1}* be a good
    error correcting code
  • |c(x)| is O(|x|)
  • the distance between any two codewords c(x) and
    c(x′) is a constant fraction of |c(x)|
  • it is possible to correct in polynomial time
    errors in a constant fraction of |c(x)|
  • Show that for h(x,i) = c(x)_i and any one-way
    function f, no polynomial time algorithm A(y,i)
    can have probability of success better than a
    constant of computing h(x,i)

78
Inner Product Hardcore bit
  • The inner product bit: choose r ∈_R {0,1}^n, let
  • h(x,r) = r·x = ∑ x_i r_i mod 2
  • Theorem (Goldreich-Levin): for any one-way
    function the inner product is a hardcore
    predicate
  • Proof structure
  • There are many x's for which A returns a correct
    answer on 1/2+ε of the r's
  • take an algorithm A that guesses h(x,r) correctly
    with probability 1/2+ε over the r's and output a
    list of candidates for x
  • No use of the y info
  • Choose from the list the/an x such that f(x) = y

The main step!
79
Why list?
  • Cannot have a unique answer!
  • Suppose A has two candidates x and x′
  • On query r it returns at random either r·x
    or r·x′
  • Prob[A(y,r) = r·x] = 1/2 + 1/2·Prob[r·x′ = r·x]
    = 3/4

80
Warm-up (1)
  • If A returns a correct answer on 1 - 1/2n of the
    r's
  • Choose r_1, r_2, …, r_n ∈_R {0,1}^n
  • Run A(y,r_1), A(y,r_2), …, A(y,r_n)
  • Denote the responses z_1, z_2, …, z_n
  • If r_1, r_2, …, r_n are linearly independent then
  • there is a unique x satisfying r_i·x = z_i for
    all 1 ≤ i ≤ n
  • Prob[z_i = A(y,r_i) = r_i·x] ≥ 1 - 1/2n
  • Therefore the probability that all the z_i's are
    correct is at least 1/2
  • Do we need complete independence of the r_i's?
  • one-wise independence is sufficient
  • Can choose r ∈_R {0,1}^n and set r_i = r ⊕ e_i
  • e_i = 0^{i-1}10^{n-i}
  • All the r_i's are linearly independent
  • Each one is uniform in {0,1}^n
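A sketch of warm-up (1) with an error-free inner-product oracle, using the shifted unit vectors r_i = r ⊕ e_i from the slide: the i-th bit of x is A(y,r) ⊕ A(y, r ⊕ e_i). Bit strings are packed into Python integers, and since the oracle here is perfect, no majority vote is needed.

```python
import secrets

def inner_product(a, b):
    """<a, b> mod 2 for bit strings packed into integers."""
    return bin(a & b).count("1") % 2

def recover_x(oracle, n):
    """Recover x from oracle(r) = <x, r> via the r_i = r XOR e_i trick:

    oracle(r) XOR oracle(r XOR e_i) = <x, e_i> = x_i."""
    r = secrets.randbits(n)
    x = 0
    for i in range(n):
        if oracle(r) ^ oracle(r ^ (1 << i)):
            x |= 1 << i
    return x
```

With a noisy oracle this exact query pattern is what the later slides amplify by majority voting.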

81
Warm-up (2)
  • If A returns a correct answer on 3/4+ε of the
    r's
  • Can amplify the probability of success!
  • Given any r ∈ {0,1}^n, procedure A′(y,r)
  • Repeat for j = 1, 2, …
  • Choose r′ ∈_R {0,1}^n
  • Run A(y,r⊕r′) and A(y,r′), denote the sum of
    responses by z_j
  • Output the majority of the z_j's
  • Analysis
  • Pr[z_j = r·x] ≥ Pr[A(y,r′) = r′·x and
    A(y,r⊕r′) = (r⊕r′)·x] ≥ 1/2+2ε
  • Does not work for 1/2+ε, since success on r′ and
    r⊕r′ is not independent
  • Each one of the events z_j = r·x is independent
    of the others
  • Therefore by taking sufficiently many j's we can
    amplify to a value as close to 1 as we wish
  • Need roughly 1/ε^2 samples
  • Idea for improvement: fix a few of the r′

82
The real thing
  • Choose r1, r2, …, rk ∈R {0,1}^n
  • Guess for j=1, 2, …, k the value zj = ⟨rj, x⟩
  • Go over all 2^k possibilities
  • For all nonempty subsets S ⊆ {1,…,k}
  • Let rS = ⊕j∈S rj
  • The implied guess for zS = ⊕j∈S zj
  • For each position xi
  • for each S ⊆ {1,…,k} run A(y, ei⊕rS)
  • output the majority value of zS ⊕ A(y, ei⊕rS)
  • Analysis
  • Each one of the vectors ei⊕rS is uniformly
    distributed
  • A(y, ei⊕rS) is correct with probability at least
    ½+ε
  • Claim For every pair of nonempty subsets S ≠ T
    ⊆ {1,…,k}
  • the two vectors rS and rT are pair-wise
    independent
  • Therefore the variance is as in completely
    independent trials
  • If I is the number of correct A(y, ei⊕rS), then
    VAR(I) ≤ 2^k(½+ε)(½-ε)
  • Use Chebyshev's Inequality Pr[|I-E(I)| ≥
    λ√VAR(I)] ≤ 1/λ²
  • Need 2^k ≈ n/ε² to get the probability of error
    down to 1/n
  • So the process is successful simultaneously for all
    positions xi, i ∈ {1,…,n}
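The full procedure can be sketched end to end on toy parameters (an illustration only: the oracle's errors are a fixed ~1/11 fraction of queries, and k, n are tiny rather than the n/ε² of the analysis):

```python
import itertools
import random
random.seed(7)

n, k = 8, 4
secret = [random.randrange(2) for _ in range(n)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v)) % 2

def A(r):
    """Predictor, deliberately wrong on a fixed small fraction of queries."""
    wrong = int(''.join(map(str, r)), 2) % 11 == 0
    return dot(secret, r) ^ (1 if wrong else 0)

rs = [[random.randrange(2) for _ in range(n)] for _ in range(k)]
subsets = [S for m in range(1, k + 1)
           for S in itertools.combinations(range(k), m)]

candidates = []
for guess in itertools.product([0, 1], repeat=k):   # guess zj = <rj, x>
    x = []
    for i in range(n):
        votes = 0
        for S in subsets:
            rS, zS = [0] * n, 0
            for j in S:
                rS = [a ^ b for a, b in zip(rS, rs[j])]
                zS ^= guess[j]
            rS[i] ^= 1                              # query ei XOR rS
            votes += A(rS) ^ zS                     # vote for xi
        x.append(1 if 2 * votes > len(subsets) else 0)
    candidates.append(x)

print(secret in candidates)
```

For the correct guess of z1, …, zk each vote equals xi whenever the oracle answers that query correctly, so the majority recovers x and the true x appears in the list of 2^k candidates.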

83
Analysis
  • Number of invocations of A
  • 2^k (guesses) · n (positions) · (2^k - 1) (subsets) =
    poly(n, 1/ε) ≈ n³/ε⁴
  • Size of resulting list of candidates for x
  • for each guess of z1, z2, …, zk a unique x
  • 2^k = poly(n, 1/ε) ≈ n/ε²
  • Conclusion single bit expansion of a one-way
    permutation is a pseudo-random generator

84
Reducing the size of the list of candidates
  • Idea bootstrap
  • Given any r ∈ {0,1}^n Procedure A′(y,r)
  • Choose r1, r2, …, rk ∈R {0,1}^n
  • Guess for j=1, 2, …, k the value zj = ⟨rj, x⟩
  • Go over all 2^k possibilities
  • For all nonempty subsets S ⊆ {1,…,k}
  • Let rS = ⊕j∈S rj
  • The implied guess for zS = ⊕j∈S zj
  • for each S ⊆ {1,…,k} run A(y, r⊕rS)
  • output the majority value of zS ⊕ A(y, r⊕rS)
  • For 2^k = 1/ε² the probability of error is, say,
    1/8
  • Fix the same r1, r2, …, rk for subsequent
    executions
  • They are good for 7/8 of the r's
  • Run warm-up (2)
  • Size of resulting list of candidates for x is
    1/ε²

85
Application Diffie-Hellman
  • The Diffie-Hellman assumption
  • Let G be a group and g an element in G.
  • Given g, a=g^x and b=g^y it is hard to find c=g^xy
  • for random x and y the probability of a poly-time
    machine outputting g^xy is negligible
  • More accurately a sequence of groups
  • Don't know how to verify given c whether it is
    equal to g^xy
  • Exercise show that under the DH Assumption
  • Given a=g^x, b=g^y and r ∈ {0,1}^n no polynomial
    time machine can guess ⟨r, g^xy⟩ with advantage
    1/poly
  • for random x, y and r

86
Application if subset sum is one-way, then it is a
pseudo-random generator
  • Subset sum problem given
  • n numbers 0 ≤ a1, a2, …, an ≤ 2^m
  • Target sum y
  • Find subset S ⊆ {1,…,n} such that Σi∈S ai = y
  • Subset sum one-way function f {0,1}^(mn+n) → {0,1}^(mn+m)
  • f(a1, a2, …, an, x1, x2, …, xn) =
  • (a1, a2, …, an, Σi=1..n xi ai mod 2^m)
  • If m&lt;n then we get out fewer bits than we put in.
  • If m&gt;n then we get out more bits than we put in.
  • Theorem if for m&gt;n subset sum is a one-way
    function, then it is also a pseudo-random
    generator
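The subset sum function itself is straightforward to write down (a sketch; the a's are part of both input and output, only the x-part is hidden):

```python
import random

def subset_sum_f(a, x, m):
    """f(a1..an, x1..xn) = (a1..an, sum of the selected ai mod 2^m)."""
    y = sum(ai for ai, xi in zip(a, x) if xi) % (2 ** m)
    return a, y

n, m = 8, 16                  # m > n: the candidate pseudo-random generator regime
a = [random.randrange(2 ** m) for _ in range(n)]
x = [random.randrange(2) for _ in range(n)]
print(subset_sum_f(a, x, m))
```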

87
Subset Sum Generator
  • Idea of proof use the distinguisher A to compute
    ⟨r, x⟩
  • For simplicity do the computation mod P for a
    large prime P
  • Given r ∈ {0,1}^n and (a1, a2, …, an, y)
  • Generate a new problem (a′1, a′2, …, a′n, y′)
  • Choose c ∈R ZP
  • Let a′i = ai if ri=0 and a′i = ai+c mod P if ri=1
  • Guess k ∈R {0,…,n} - the value of Σ xi ri
  • the number of locations where x and r are both 1
  • Let y′ = y + c·k mod P
  • Run the distinguisher A on (a′1, a′2, …, a′n, y′)
  • output what A says XORed with parity(k)
  • Claim if k is correct, then (a′1, a′2, …, a′n, y′)
    is pseudo-random
  • Claim for any incorrect k, (a′1, a′2, …, a′n, y′)
    is random
  • y′ = z + (k-h)c mod P where z = Σi=1..n xi a′i mod
    P and h = Σ xi ri
  • Therefore the probability to guess ⟨r, x⟩ correctly
    is 1/n·(½+ε) + (n-1)/n·(½) = ½+ε/n

Prob[A=0 | pseudo-random] = ½+ε
Prob[A=0 | random] = ½
88
Interpretations of the Goldreich-Levin Theorem
  • A tool for constructing pseudo-random generators
  • The main part of the proof
  • A mechanism for translating general confusion
    into randomness
  • Diffie-Hellman example
  • List decoding of Hadamard Codes
  • works in the other direction as well (for any
    code with good list decoding)
  • List decoding, as opposed to unique decoding,
    allows getting much closer to the distance
  • Explains unique decoding when the prediction was
    correct with probability ¾+ε
  • Finding all linear functions agreeing with a
    function given in a black-box
  • Learning all Fourier coefficients larger than ε
  • If the Fourier coefficients are concentrated on a
    small set, one can find them
  • True for AC0 circuits
  • Decision Trees

89
Composing PRGs
  • Composition
  • Let
  • g1 be a (l1, l2 )-pseudo-random generator
  • g2 be a (l2, l3)-pseudo-random generator
  • Consider g(x) g2(g1(x))
  • Claim g is a (l1, l3 )-pseudo-random generator
  • Proof consider three distributions on {0,1}^l3
  • D1 y uniform in {0,1}^l3
  • D2 y=g(x) for x uniform in {0,1}^l1
  • D3 y=g2(z) for z uniform in {0,1}^l2
  • By assumption there is a distinguisher A between
    D1 and D2
  • A must either
  • distinguish between D1 and D3 - can use A
    to distinguish g2
  • or
  • distinguish between D2 and D3 - can use A
    to distinguish g1

triangle inequality
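The composition g = g2 ∘ g1 can be sketched concretely (an illustration: SHA-256 in counter mode stands in for the generators, which is an assumption for the demo, not a proven PRG):

```python
import hashlib

def stretch(seed: bytes, out_len: int) -> bytes:
    """Deterministically stretch `seed` to `out_len` bytes in counter mode."""
    out, ctr = b"", 0
    while len(out) < out_len:
        out += hashlib.sha256(seed + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:out_len]

g1 = lambda s: stretch(s, 32)   # l1 = 16 bytes -> l2 = 32 bytes
g2 = lambda s: stretch(s, 64)   # l2 = 32 bytes -> l3 = 64 bytes
g = lambda s: g2(g1(s))         # composed (l1, l3)-generator

y = g(b"0123456789abcdef")
print(len(y))                   # → 64
```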
90
Composing PRGs
  • When composing
  • a generator secure against advantage ε1
  • and a
  • generator secure against advantage ε2
  • we get security against advantage ε1+ε2
  • When composing the single bit expansion generator
    n times
  • an advantage of ε against the result implies an
    advantage of ε/n against the single-bit generator
  • Hybrid argument to prove that two distributions
    D and D′ are indistinguishable
  • suggest a collection of distributions D = D0, D1,
    …, Dk = D′ such that
  • If D and D′ can be distinguished, there is a
    pair Di and Di+1 that can be distinguished.
  • Difference ε between D and D′ means ε/k between
    some Di and Di+1
  • Use such a distinguisher to derive a contradiction
91
Exercise
  • Let {Dn} and {D′n} be two distribution ensembles
    that are
  • Computationally indistinguishable
  • Polynomial time samplable
  • Suppose that y1, …, ym are all sampled according
    to Dn or all are sampled according to D′n
  • Prove no probabilistic polynomial time machine
    can tell, given y1, …, ym, whether they were
    sampled from Dn or D′n

92
Existence of PRGs
  • What we have proved
  • Theorem if pseudo-random generators stretching
    by a single bit exist, then pseudo-random
    generators stretching by any polynomial factor
    exist
  • Theorem if one-way permutations exist, then
    pseudo-random generators exist
  • A harder theorem to prove
  • Theorem [HILL] (Håstad, Impagliazzo, Levin and
    Luby) if one-way functions exist, then
    pseudo-random generators exist
  • Exercise show that if pseudo-random generators
    exist, then one-way functions exist

93
Next-bit Test
  • Definition a function g {0,1}* → {0,1}* is said
    to pass the next bit test if
  • It is polynomial time computable
  • It stretches the input |g(x)| &gt; |x|
  • denote by l(n) the length of the output on
    inputs of length n
  • If the input (seed) is random, then the output
    passes the next-bit test
  • For any prefix 0 ≤ i &lt; l(n), for any probabilistic
    polynomial time adversary A that receives the
    first i bits of y = g(x) and tries to guess the
    next bit, for any polynomial p(n) and sufficiently
    large n
  • |Prob[A(y1, y2, …, yi) = yi+1] - 1/2| &lt; 1/p(n)
  • Theorem a function g {0,1}* → {0,1}* passes the
    next bit test if and only if it is a pseudo-random
    generator

94
Next-block Unpredictable
  • Suppose that the function G maps a given seed
    into a sequence of blocks
  • let l(n) be the number of blocks produced from a
    seed of length n
  • If the input (seed) is random, then the output
    passes the next-block unpredictability test
  • For any prefix 0 ≤ i &lt; l(n), for any probabilistic
    polynomial time adversary A that receives the
    first i blocks of y = g(x) and tries to guess the
    next block yi+1, for any polynomial p(n) and
    sufficiently large n
  • Prob[A(y1, y2, …, yi) = yi+1] &lt; 1/p(n)
  • Exercise show how to convert a next-block
    unpredictable generator into a pseudo-random
    generator.

95
Pseudo-Random Generatorsconcrete version
  • Gn {0,1}^m → {0,1}^n
  • A cryptographically strong pseudo-random sequence
    generator - passes all polynomial time
    statistical tests
  • (t,ε)-pseudo-random - no test A running in time t
    can distinguish with advantage ε

96
Three Basic issues in cryptography
  • Identification
  • Authentication
  • Encryption
  • Solve in a shared key environment

97
Identification - Remote login using
pseudo-random sequence
  • A and B share key S ∈R {0,1}^k
  • In order for A to identify itself to B
  • Generate sequence Gn(S)
  • For each identification session - send next block
    of Gn(S)

98
Problems...
  • More than two parties
  • Malicious adversaries - add noise
  • Coordinating the location block number
  • Better approach Challenge-Response

99
Challenge-Response Protocol
  • B selects a random location and sends to A
  • A sends value at random location

100
Desired Properties
  • Very long string - prevent repetitions
  • Random access to the sequence
  • Unpredictability - cannot guess the value at a
    random location
  • even after seeing values at many parts of the
    string of the adversary's choice.
  • Pseudo-randomness implies unpredictability
  • Not the other way around for blocks

101
Authenticating Messages
  • A wants to send message M ∈ {0,1}^n to B
  • B should be confident that A is indeed the sender
    of M
  • One-time application
  • S = (a,b)
  • where a,b ∈R {0,1}^n
  • To authenticate M supply a·M ⊕ b
  • Computation is done in GF[2^n]
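A sketch of the one-time tag a·M ⊕ b, scaled down to GF(2^8) for brevity (the slide's scheme works in GF(2^n) for n-bit messages; the reduction polynomial x^8+x^4+x^3+x+1 is a known irreducible, the one used by AES):

```python
def gf_mul(p, q, poly=0x11B):
    """Carry-less multiplication in GF(2^8), reduced mod x^8+x^4+x^3+x+1."""
    r = 0
    while q:
        if q & 1:
            r ^= p
        q >>= 1
        p <<= 1
        if p & 0x100:       # overflow past degree 7: reduce
            p ^= poly
    return r

def tag(a, b, m):
    """One-time MAC: tag = a*M + b in GF(2^8)."""
    return gf_mul(a, m) ^ b

a, b = 0x57, 0x9C           # one-time shared key S = (a, b)
m = 0x42
t = tag(a, b, m)
print(hex(t))               # receiver recomputes tag(a, b, m) and compares
```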

102
Problems and Solutions
  • Problems - same as for identification
  • If a very long random string is available -
  • can use it for one-time authentication
  • Works even if it is only random looking
103
Encryption of Messages
  • A wants to send message M ∈ {0,1}^n to B
  • only B should be able to learn M
  • One-time application
  • S = a
  • where a ∈R {0,1}^n
  • To encrypt M
  • send a ⊕ M
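The one-time pad in a few lines (a sketch over bytes rather than the slide's bit strings):

```python
import secrets

def xor_bytes(a: bytes, m: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, m))

M = b"attack at dawn"
a = secrets.token_bytes(len(M))   # one-time key, as long as the message
C = xor_bytes(a, M)               # encrypt: send a XOR M
print(xor_bytes(a, C) == M)       # decrypt: a XOR (a XOR M) = M  → True
```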

104
Encryption of Messages
  • If a very long random looking string is available -
  • can use it as in one-time encryption

105
Pseudo-random Functions
  • Concrete Treatment
  • F {0,1}^k × {0,1}^n → {0,1}^m
  • key × Domain → Range
  • Denote Y = FS(X)
  • A family of functions Fk = {FS | S ∈ {0,1}^k} is
    (t,ε,q)-pseudo-random if it is
  • Efficiently computable - random access
  • and...

106
(t,?,q)-pseudo-random
  • The tester A can choose adaptively
  • X1 and get Y1 = FS(X1)
  • X2 and get Y2 = FS(X2)
  • …
  • Xq and get Yq = FS(Xq)
  • Then A has to decide whether
  • FS ∈R Fk or
  • FS ∈R Rn→m = {F | F {0,1}^n → {0,1}^m}

107
(t,?,q)-pseudo-random
  • For a function F chosen at random from
  • (1) Fk = {FS | S ∈ {0,1}^k}
  • (2) Rn→m = {F | F {0,1}^n → {0,1}^m}
  • For all t-time machines A that choose q
    locations and try to distinguish (1) from (2)
  • |Prob[A = 1 | F ∈R Fk]
  • - Prob[A = 1 | F ∈R Rn→m]| ≤ ε

108
Equivalent/Non-Equivalent Definitions
  • Instead of a next bit test for X ∈ {X1, X2, …, Xq}
    chosen by A, decide whether a given Y is
  • Y = FS(X) or
  • Y ∈R {0,1}^m
  • Adaptive vs. Non-adaptive
  • Unpredictability vs. pseudo-randomness
  • A pseudo-random sequence generator g {0,1}^m →
    {0,1}^n
  • ≡ a pseudo-random function on the small domain
    {0,1}^(log n) → {0,1} with key in {0,1}^m

109
Application to the basic issues in cryptography
  • Solution using a shared key S
  • Identification
  • B to A X ∈R {0,1}^n
  • A to B Y = FS(X)
  • B verifies
  • Authentication
  • A to B Y = FS(M)
  • caveat subject to a replay attack
  • Encryption
  • A chooses X ∈R {0,1}^n
  • A to B &lt;X, Y = FS(X) ⊕ M&gt;
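All three uses of the shared key can be sketched together (an illustration: HMAC-SHA256 stands in for FS, which is an assumption for the demo, not the lecture's construction):

```python
import hashlib
import hmac
import secrets

S = secrets.token_bytes(32)                    # shared key

def F(key: bytes, X: bytes) -> bytes:
    """Stand-in PRF F_S(X)."""
    return hmac.new(key, X, hashlib.sha256).digest()

# Identification: B challenges with random X, A answers F_S(X), B verifies
X = secrets.token_bytes(16)
Y = F(S, X)
print(hmac.compare_digest(Y, F(S, X)))         # → True

# Authentication: A tags the message M itself
M = b"transfer 100"
tag = F(S, M)

# Encryption: A picks a fresh X2 and sends <X2, F_S(X2) XOR M>
X2 = secrets.token_bytes(16)
pad = F(S, X2)[:len(M)]
C = (X2, bytes(p ^ m for p, m in zip(pad, M)))
recovered = bytes(p ^ c for p, c in zip(F(S, C[0])[:len(M)], C[1]))
print(recovered == M)                          # → True
```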

110
Goal
  • Construct an ensemble {Fk | k ∈ L} such that
  • for any {tk, 1/εk, qk | k ∈ L} polynomial in k,
    for all but finitely many k's
  • Fk is a (tk, εk, qk)-pseudo-random family
111
Construction
  • Construction via Expansion
  • Expand n or m
  • Direct constructions

112
Effects of Concatenation
  • Given l functions F1, F2, …, Fl decide whether
    they are
  • l random and independent functions
  • OR
  • FS1, FS2, …, FSl for S1, S2, …, Sl ∈R {0,1}^k
  • Claim If Fk = {FS | S ∈ {0,1}^k} is
    (t,ε,q)-pseudo-random
  • cannot distinguish the two cases
  • using q queries
  • in time t′ = t - l·q
  • with advantage better than l·ε

113
Proof Hybrid Argument
  • i=0 FS1, FS2, …, FSl → p0
  • i R1, R2, …, Ri, FSi+1, …, FSl → pi
  • i=l R1, R2, …, Rl → pl
  • |pl - p0| ≥ ε ⇒ ∃i |pi+1 - pi| ≥ ε/l

114
...Hybrid Argument
  • Can use this i to distinguish whether
  • FS ∈R Fk or FS ∈R Rn→m
  • Generate FSi+1, …, FSl
  • Answer queries to the first i-1 functions at random
    (consistently)
  • Answer queries to the i-th function using the
    (black box) input FS
  • Answer queries to functions i+1 through l with
    FSi+1, …, FSl
  • Running time of test - t′ + l·q

115
Doubling the domain
  • Suppose F(n) {0,1}^k × {0,1}^n → {0,1}^m which
    is (t,ε,q)-p.r.
  • Want F(n+1) {0,1}^k × {0,1}^(n+1) → {0,1}^m
    which is (t,ε,q)-p.r.
  • Use G {0,1}^k → {0,1}^(2k) which is (t,ε)-p.r.
  • G(S) = G0(S) G1(S)
  • Let FS(n+1)(b x) = FGb(S)(n)(x)

116
Claim
  • If G is (t+q, ε1)-p.r. and F(n) is (t+2q, ε2, q)-p.r.,
    then F(n+1) is (t, ε1+2ε2, q)-p.r.
  • Proof three distributions
  • (1) F(n+1)
  • (2) FS0(n), FS1(n) for independent S0, S1
  • (3) Random

Total advantage Δ ≤ ε1 + 2ε2
117
...Proof
  • Given that (1) and (3) can be distinguished with
    advantage ε1 + 2ε2, then either
  • (1) and (2) can be distinguished with advantage ε1
  • ⇒ G can be distinguished with advantage ε1
  • or
  • (2) and (3) can be distinguished with advantage 2ε2
  • ⇒ F(n) can be distinguished with advantage ε2
  • Running time of test - t + q

118
Getting from G to F(n)
  • Idea use a recursive construction
  • FS(n)(bn bn-1 … b1)
  • = FGb1(S)(n-1)(bn bn-1 … b2)
  • = Gbn(Gbn-1(… Gb1(S)) …)
  • Each evaluation of FS(n)(x) costs n invocations of G
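The recursion is a walk down a binary tree of seeds, one G-invocation per input bit. A minimal sketch (SHA-256 stands in for the length-doubling G, an assumption for the demo only):

```python
import hashlib

K = 16                                      # seed length in bytes

def G(S: bytes) -> bytes:
    """Stand-in length-doubling generator: G(S) = G0(S) G1(S)."""
    return (hashlib.sha256(b"0" + S).digest()[:K] +
            hashlib.sha256(b"1" + S).digest()[:K])

def F(S: bytes, bits) -> bytes:
    """F_S(bn ... b1): consume b1 first, as in the slide's recursion."""
    for b in bits:                          # bits listed in the order b1, ..., bn
        out = G(S)
        S = out[:K] if b == 0 else out[K:]  # take G0(S) or G1(S)
    return S

seed = b"s" * K
print(F(seed, [0, 1, 1]) == F(seed, [0, 1, 1]))   # deterministic → True
print(F(seed, [0, 1, 1]) != F(seed, [1, 1, 0]))
```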

119
Tree Description
S
G1(S)
G0(S)
G0(G0(S))
Each leaf corresponds to an input X. The label on the
leaf is the value of the pseudo-random function at X
G1(G0(G0(S)))
120
Security claim
  • If G is (t + q·n, ε)-p.r.,
  • then F(n) is (t, ε′ = n·q·ε, q)-p.r.
  • Proof hybrid argument by levels
  • Di
  • truly random labels for the nodes at level i.
  • Pseudo-random from level i down
  • Each Di - a collection of q functions
  • ∃i |pi+1 - pi| ≥ ε′/n = q·ε

121
Hybrid
[Figure the hybrid Di the top i levels of the tree hold
truly random seeds (S0, S1, …); the remaining n-i levels
are computed with G, e.g. G0(S0), G1(G0(S0))]
122
Proof of Security
  • Can use this i to distinguish the concatenation of
    q sequence generators G from random.
  • The concatenation is (t, q·ε)-p.r.
  • Therefore the construction is (t, ε′, q)-p.r.

123
Disadvantages
  • Expensive - n invocations of G
  • Sequential
  • Deterioration of ε
  • But does the job!
  • From any pseudo-random sequence generator we can
    construct a pseudo-random function.
  • Theorem one-way functions exist if and only if
    pseudo-random functions exist.

124
Applications of Pseudo-random Functions
  • Learning Theory - lower bounds
  • Cannot PAC-learn any class containing
    pseudo-random functions
  • Complexity Theory - impossibility of natural
    proofs for separating classes.
  • Any setting where a huge shared random string is
    useful
  • Caveat what happens when the seed is made
    public?

125
References
  • Blum and Micali, SIAM J. Computing, 1984
  • Yao
  • Blum, Blum and Shub, SIAM J. Computing, 1988
  • Goldreich, Goldwasser and Micali, J. of the ACM,
    1986

126
...References
  • Books
  • O. Goldreich, Foundations of Cryptography - a
    book in three volumes.
  • Vol 1, Basic Tools, Cambridge, 2001
  • Pseudo-randomness, zero-knowledge
  • Vol 2, about to come out
  • (Encryption, Secure Function Evaluation)
  • Other volumes at
    www.wisdom.weizmann.ac.il/oded/books.html
  • M. Luby, Pseudorandomness and Cryptographic
    Applications, Princeton University Press

127
References
  • Web material/courses
  • S. Goldwasser and M. Bellare, Lecture Notes on
    Cryptography
  • http://www-cse.ucsd.edu/mihir/papers/gb.html
  • Wagner/Trevisan, Berkeley
  • www.cs.berkeley.edu/daw/cs276
  • Ivan Damgard and Ronald Cramer, Cryptologic
    Protocol Theory
  • http://www.daimi.au.dk/ivan/CPT.html
  • Salil Vadhan, Pseudorandomness
  • http://www.courses.fas.harvard.edu/cs225/Lectures-2002/
  • Naor, Foundations of Cryptography and Estonian
    Course
  • www.wisdom.weizmann.ac.il/naor