Title: Perfect and Statistical Secrecy, Probabilistic Algorithms, Definitions of Easy and Hard, One-Way Functions

1. Perfect and Statistical Secrecy, probabilistic algorithms, definitions of Easy and Hard, One-Way Functions -- formal definition
2. Private-key Encryption, revisited
- Key-generation algorithm tosses coins → key k
- ENC_k(m, r) → c
- DEC_k(c) → m
- Correctness: for all k, r, m, DEC_k(ENC_k(m, r)) = m
3. Perfect secrecy, revisited
An encryption scheme is perfectly secret if for every m1 and m2 and every c,
  Pr[ENC_k(m1) = c] = Pr[ENC_k(m2) = c],
where the probability is over the key k from key generation and the coin tosses of ENC.
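As a concrete illustration (our own toy example, not from the slides), the one-time pad over 3-bit strings achieves perfect secrecy: for a uniformly random key, every message induces the same ciphertext distribution. A minimal sketch:

```python
# One-time pad over 3-bit strings: ENC_k(m) = k XOR m.
# With a uniform key, the ciphertext distribution is uniform for EVERY
# message, so Pr[ENC_k(m1) = c] = Pr[ENC_k(m2) = c] for all m1, m2, c.

N = 3  # message/key length in bits


def enc(k, m):
    return k ^ m  # XOR of key and message


def dist(m):
    """Distribution of ENC_k(m) over a uniformly random key k."""
    counts = {}
    for k in range(2 ** N):
        c = enc(k, m)
        counts[c] = counts.get(c, 0) + 1
    return counts


# Perfect secrecy: the two ciphertext distributions are identical.
assert dist(0b101) == dist(0b010)
# In fact each distribution is uniform: every ciphertext appears once.
assert sorted(dist(0b000).values()) == [1] * (2 ** N)
```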
4. Shannon's secrecy
Let D be a distribution on the message space M. An encryption scheme satisfies Shannon's secrecy with respect to D if for every m in the support of D and every c,
  Pr[M = m | ENC_k(M) = c] = Pr[M = m],
where the probability is taken over m drawn from D, key generation, and the coin tosses of ENC.
5. Claim: perfect secrecy holds if and only if Shannon's secrecy holds
We prove that perfect secrecy implies Shannon's secrecy; the converse is similar (prove it!).

6. Proof
Recall Bayes' law: for events E and F with Pr[F] > 0,
  Pr[E | F] = Pr[F | E] · Pr[E] / Pr[F].

7. Proof (cont.)
So, we need to show that
  Pr[M = m | C = c] = Pr[M = m].
In other words, applying Bayes' law, we need to show that
  Pr[C = c | M = m] = Pr[C = c].

8. Proof (cont.)
  Pr[C = c] = Σ_{m'} Pr[C = c | M = m'] · Pr[M = m']
            = Σ_{m'} Pr[C = c | M = m] · Pr[M = m']   (by perfect secrecy)
            = Pr[C = c | M = m].
QED.
9. CLAIM: If an encryption scheme is perfectly secret, then the number of keys is at least the size of the message space.
Proof: consider a ciphertext c for some message m. By perfect secrecy, c could be an encryption of every message. That is, for every message m', there exists a key k that decrypts c to m'. Thus, c is decryptable to any message, using different keys. But the number of distinct decryptions of c is at most the number of keys. QED.
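The counting argument can be checked on a toy scheme (our own made-up example): a shift cipher with 2 keys but 4 messages, where a fixed ciphertext can only decrypt to at most 2 of the 4 messages, so the scheme cannot be perfectly secret.

```python
# Toy illustration of the claim: fewer keys than messages means some
# ciphertext is NOT decryptable to every message.

keys = [0, 1]             # 2 keys
messages = [0, 1, 2, 3]   # 4 messages > 2 keys


def enc(k, m):
    return (m + k) % 4    # a (weak) shift cipher with a tiny key space


def dec(k, c):
    return (c - k) % 4


c = enc(0, messages[0])   # some ciphertext
decryptions = {dec(k, c) for k in keys}

# At most |keys| distinct decryptions of c...
assert len(decryptions) <= len(keys)
# ...so c cannot decrypt to every message: perfect secrecy is impossible.
assert decryptions != set(messages)
```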
10. This is bad news: we need to relax the notion of secrecy to be able to use smaller keys
- How do we relax the notion?
11. A first attempt
- We will relax the notion of perfect secrecy.
- What about statistical security?
Let X and Y be random variables over a finite set S. X and Y are called ε-close if for every event T ⊆ S,
  |Pr[X ∈ T] − Pr[Y ∈ T]| ≤ ε.
T is called a statistical test.
12. Statistical Secrecy
- DEFINITION: An encryption scheme is statistically ε-secure if for every two messages m1 and m2, the random variables (over the choice of key) ENC_k(m1) and ENC_k(m2) are ε-close.
- Still cannot go much below Shannon's bound.
- HW: show this!
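The ε-closeness above is exactly statistical distance, which can be computed as half the L1 distance between the distributions. A small sketch (our own helper, not from the slides):

```python
from fractions import Fraction

# Statistical distance between two distributions, given as dicts mapping
# outcomes to probabilities:
#   SD(X, Y) = max over events T of |Pr[X in T] - Pr[Y in T]|
#            = (1/2) * sum over s of |Pr[X = s] - Pr[Y = s]|.
# (The maximizing event T is the set of outcomes where X is more likely.)


def statistical_distance(px, py):
    support = set(px) | set(py)
    return Fraction(1, 2) * sum(
        abs(px.get(s, Fraction(0)) - py.get(s, Fraction(0))) for s in support
    )


# Two distributions over {0, 1}:
X = {0: Fraction(1, 2), 1: Fraction(1, 2)}
Y = {0: Fraction(3, 4), 1: Fraction(1, 4)}

assert statistical_distance(X, Y) == Fraction(1, 4)  # X and Y are 1/4-close
assert statistical_distance(X, X) == 0               # identical => 0-close
```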
13. Computational assumptions
- Probabilistic TM: the usual definition plus an additional RANDOM tape which is read-once
- P: languages that can be decided in time n^c for some c
- NP: languages that can be verified, i.e., given a poly-size witness w we can check membership in time n^c. Thus, there exists a relation R(x, w) that is in P.
14. For us, every algorithm is public
- (no security through obscurity)
- What do we have as secrets? (randomness!)
- Hence the need to define PPT (probabilistic polynomial time)
15. How do we deal with coin tosses?
- RP (randomized poly): languages recognizable in polynomial time with 1-sided error
16. Dealing with RP
WHY? First part: we can ignore the coins. Second part: we can guess the correct coins.
17. Dealing with RP: amplification
Suppose we are given a machine M s.t.
  if x ∈ L then Pr_r[M(x, r) accepts] ≥ 1/2, and
  if x ∉ L then Pr_r[M(x, r) accepts] = 0.
18. How do we amplify?
Design a new machine M' that runs machine M k times with fresh coin flips each time. M' accepts iff any run of M accepts; otherwise M' rejects.
What is the probability of acceptance of M' for x ∈ L?
19. Computing the probability of success of M'
M' errs on x ∈ L only if all k independent runs of M reject:
  Pr[M' errs] ≤ (1 − 1/2)^k = 2^{−k}.
Recall that (1 − p)^k ≤ e^{−pk}.
So, make k sufficiently big!
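The amplification can be simulated (our own toy, with a made-up base machine): a one-sided-error machine that accepts a YES instance with probability 1/2 and never accepts a NO instance; accepting if ANY of k runs accepts drives the YES-side error down to 2^{-k}.

```python
import random

# Simulation of RP amplification.


def m(x_in_l):
    """One run of the base machine M: accepts YES inputs w.p. 1/2,
    never accepts NO inputs (1-sided error)."""
    return x_in_l and random.random() < 0.5


def m_prime(x_in_l, k):
    """Amplified machine M': accept iff some run of M accepts."""
    return any(m(x_in_l) for _ in range(k))


random.seed(0)
trials = 10_000
# With k = 20, per-trial error on YES instances is 2^-20 (about 1e-6),
# so we expect essentially no errors across 10,000 trials.
errors = sum(not m_prime(True, k=20) for _ in range(trials))
assert errors <= 5

# On NO instances M' never errs: one-sided error is preserved.
assert all(not m_prime(False, k=20) for _ in range(100))
```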
20. 2-sided error
- BPP (bounded-error probabilistic poly-time): languages recognizable in polynomial time with 2-sided error
21. BPP Amplification
We wish to design a new machine M' whose error probability is exponentially small.
22. What do we do?
- Machine M' runs machine M many times with independent coin flips each time, and takes a MAJORITY vote.
- If M' runs M k times, what is the probability that M' makes a mistake?
23. Chernoff Bound
Let X1, ..., Xm be independent 0/1 random variables with the same distribution, Pr[Xi = 1] = p. Then
  Pr[ |(1/m) Σ Xi − p| > ε ] ≤ 2e^{−2ε²m}.
24. Back to BPP
Let Xi = 1 if M makes a mistake on the i-th run. Then Pr[Xi = 1] ≤ 1/3 (by the definition of BPP).
Let's be pessimistic and make it equal to 1/3.
25. Plugging in
If we take a majority vote, we fail only if more than half of the k runs err, i.e., (1/k) Σ Xi ≥ 1/2. Since p = 1/3, this means the empirical average deviates from p by at least ε = 1/6, so by the Chernoff bound
  Pr[M' errs] ≤ 2e^{−2(1/6)²k} = 2e^{−k/18}.
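The majority-vote amplification can also be simulated (our own toy base machine that errs with probability exactly 1/3):

```python
import math
import random

# Simulation of BPP amplification by majority vote.


def m(correct_answer):
    """One run of the base machine M: returns the right answer w.p. 2/3."""
    if random.random() < 1 / 3:
        return not correct_answer  # 2-sided error: a mistake
    return correct_answer


def m_prime(correct_answer, k):
    """Amplified machine M': majority vote over k independent runs."""
    votes = sum(m(correct_answer) for _ in range(k))
    return votes > k / 2


random.seed(1)
k = 200
trials = 1_000
errors = sum(m_prime(True, k) != True for _ in range(trials))

bound = 2 * math.exp(-k / 18)  # Chernoff bound, about 3e-5 for k = 200
# Empirically the amplified machine essentially never errs.
assert errors <= 5
```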
26. What do we know?
We know P ⊆ RP ⊆ BPP and RP ⊆ NP. What about BPP vs NP? This is open.
27. Non-uniformity: the class P/poly
- Sometimes we need to talk about different input lengths (circuits, or a TM with an advice string for every input length)
- SOME QUESTIONS:
- Does it belong to NP?
- How easy is it?
- Does it need coin flips?
28. P/poly
- Does P/poly belong to NP?
- Attempt: guess an advice string which makes x accept; if there is such an advice string, accept.
- Why does this not work? (Hint: the same advice string must be correct for ALL inputs of that length, not just for x.)
29. P/poly (cont.)
- How powerful is this class?
- It includes undecidable languages.
- Consider some standard enumeration of TMs. Define the advice for each input length i to encode whether the i-th machine halts; a single advice bit per input length then decides an undecidable language.
30. Do we need randomness for P/poly?
Adleman's theorem: BPP ⊆ P/poly.
31. Proof of Adleman's theorem
We assume a BPP machine M s.t. (after amplification) for every input x of length n,
  Pr_r[M(x, r) errs] < 2^{−n}.
32. Proof that we don't need coins
We want to show that there always exists a single random string r(n) that works for all inputs of length n. How do we show it???
33. Proof (cont.)
Say M uses m random bits. Build a 2^n × 2^m matrix whose rows are the inputs x of length n and whose columns are the random strings r:
  entry(x, r) = 1 ⟺ M(x, r) makes a mistake
  entry(x, r) = 0 ⟺ M(x, r) is correct
If there is a column which is good (all 0s) for all x, we are done!
34. Proof: the punch line
  Number of 1s ≤ (number of rows) × (number of 1s per row) < 2^n × 2^{−n} × 2^m = 2^m.
Thus, the average number of 1s per column is < 1.
Thus, some column must have all 0s: that random string r(n) is correct on every input of length n, and we hardwire it as the advice.
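The averaging argument can be demonstrated numerically (our own small example with made-up dimensions): if every row of the mistake matrix has fewer than (#columns / #rows) ones, the total number of ones is below #columns, so some column must be all zeros.

```python
import random

# Counting argument behind Adleman's theorem, on a small random matrix.
# Rows play the role of inputs, columns the role of random strings,
# and a 1 entry means "M makes a mistake on (input, random string)".

random.seed(2)
n_rows, n_cols = 8, 64                    # e.g. 2^3 inputs, 2^6 random strings
max_ones_per_row = n_cols // n_rows - 1   # strictly below #cols / #rows

matrix = []
for _ in range(n_rows):
    ones = set(random.sample(range(n_cols),
                             random.randint(0, max_ones_per_row)))
    matrix.append([1 if j in ones else 0 for j in range(n_cols)])

total_ones = sum(map(sum, matrix))
assert total_ones < n_cols  # so the average number of 1s per column is < 1

good_columns = [j for j in range(n_cols)
                if all(row[j] == 0 for row in matrix)]
assert good_columns  # some column is all 0s: a "good" random string exists
```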
35. Terminology
- For us, "easy" means polynomial time with coin flips
- Fast: poly-time
- Probably fast: expected poly-time
36. Terminology (cont.)
- Las Vegas: always correct, probably fast
- Monte Carlo: always fast, probably correct, 1-sided error (i.e., RP or co-RP)
- Atlantic City: always fast, probably correct, 2-sided error (i.e., BPP)
37. Back to Hardness
- For us, BPP is easy.
- But in BPP we can amplify the decision of any language that is bounded away from one half by 1/poly.
- So we must define hard languages as those that cannot be decided even with a 1/poly advantage.
- So, how do we define "less than 1/poly"?
38. Examples of "less than 1/poly"
For example, functions like 2^{−n} that shrink faster than 1/p(n) for every polynomial p.

39. Try 1: negligible function
g(n) is negligible if for every polynomial p and for ALL n, g(n) < 1/p(n).

40. Try 1
Not good. Consider any g with g(n) > 0 somewhere: for that fixed n, a polynomial with a large enough coefficient violates the condition, so only g ≡ 0 would qualify.

41. Try 2
- Require the inequality only for sufficiently large n. After n gets sufficiently big, it works!

42. Formally, now
DEFINITION: g is negligible if for every polynomial p there exists N such that for all n > N, g(n) < 1/p(n). Equivalently: for every constant c there exists N such that for all n > N, g(n) < n^{−c}.
43. Some facts
- If g(n) is negligible then g(n) · poly(n) is still negligible! (HW: prove it!)
- We sometimes call any function > 1/poly noticeable (since we can amplify).
- There are functions that are NEITHER negligible NOR noticeable. Then we need to deal with families of input lengths, and typically resort to non-uniform proofs.
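The "sufficiently large n" quantifier in the definition can be illustrated numerically (our own sketch, with an arbitrary cutoff n_max): g(n) = 2^{−n} eventually drops below n^{−c} for every fixed c, while h(n) = 1/n² never drops below 1/n³.

```python
# Numerical illustration of the negligible-function definition.


def eventually_below(g, c, n_max=5_000):
    """Smallest N such that g(n) < n^-c for all n in [N, n_max), or None
    if the inequality still fails at the top of the range."""
    last_fail = 0
    for n in range(1, n_max):
        if not (g(n) < n ** -c):
            last_fail = n
    return last_fail + 1 if last_fail + 1 < n_max else None


g = lambda n: 2.0 ** -n   # negligible: beats every polynomial eventually
h = lambda n: n ** -2.0   # NOT negligible: only polynomially small

for c in (1, 2, 3, 5):
    # For each fixed c there is a cutoff N after which 2^-n < n^-c.
    assert eventually_below(g, c) is not None

# 1/n^2 >= 1/n^3 for all n >= 1, so h fails the definition at c = 3.
assert eventually_below(h, 3) is None
```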