Perfect and Statistical Secrecy, Probabilistic Algorithms, Definitions of Easy and Hard, 1-Way FN: Formal Definition (PowerPoint presentation)

Slides: 44
Provided by: rafailos
Learn more at: http://web.cs.ucla.edu

Transcript and Presenter's Notes
1
Perfect and Statistical Secrecy, Probabilistic Algorithms, Definitions of Easy and Hard, 1-Way FN -- Formal Definition

2
Private-key Encryption, revisited
  • Key generation alg tosses coins → key k
  • ENC_k(m, r) → c
  • DEC_k(c) → m
  • Correctness: for all k, r, m, DEC_k(ENC_k(m, r)) = m
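A minimal sketch of such a scheme is the one-time pad over n-bit strings; the names keygen/ENC/DEC below mirror the slides (note that for the one-time pad, ENC needs no extra randomness r beyond the key):

```python
import secrets

def keygen(n):
    # Key generation tosses coins: output a uniformly random n-bit key k.
    return secrets.randbits(n)

def ENC(k, m):
    # One-time pad: c = m XOR k (no extra randomness r needed here).
    return m ^ k

def DEC(k, c):
    # XOR with the same key inverts encryption.
    return c ^ k

# Correctness: for all k and m, DEC_k(ENC_k(m)) = m.
n = 8
k = keygen(n)
m = 0b10110001
assert DEC(k, ENC(k, m)) == m
```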

3
Perfect secrecy, revisited
Encryption scheme is perfectly secure if for every m1 and m2 and every c,

    Pr[ENC_k(m1, r) = c] = Pr[ENC_k(m2, r) = c]

where the probability is over the key k from key-generation and the coin-tosses r of ENC.
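For a one-time pad over 3-bit strings, this equality can be checked exhaustively over the key space; a sketch (Fraction keeps the probabilities exact):

```python
from collections import Counter
from fractions import Fraction

n = 3
keys = range(2 ** n)  # key generation: a uniformly random n-bit key

def enc(k, m):
    return m ^ k  # one-time pad over n-bit strings

def dist(m):
    # Exact distribution of the ciphertext ENC_k(m) over a uniform key k.
    counts = Counter(enc(k, m) for k in keys)
    return {c: Fraction(cnt, len(keys)) for c, cnt in counts.items()}

m1, m2 = 0b101, 0b010
# Perfect secrecy: for every c, Pr[ENC_k(m1) = c] = Pr[ENC_k(m2) = c].
assert dist(m1) == dist(m2)
```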
4
Shannon's secrecy
Let D be a distribution on the message space M. An encryption scheme satisfies Shannon's secrecy with respect to D if for every m in the support of D and every c,

    Pr[M = m | ENC_k(M) = c] = Pr_D[M = m]

where the probability is taken over the choice of M from D, key-gen, and the coin-tosses of ENC.
5
Claim: perfect secrecy holds if and only if Shannon's secrecy holds.
We prove that perfect secrecy implies Shannon's secrecy; the converse is similar. (Prove it!)
6
Proof
Proof: recall Bayes' law. Let E and F be events; then

    Pr[E | F] = Pr[F | E] · Pr[E] / Pr[F]
7
Proof (cont.)
So, we need to show that

    Pr[M = m | ENC_k(M) = c] = Pr[M = m]

In other words (applying Bayes' law), we need to show that

    Pr[ENC_k(M) = c | M = m] = Pr[ENC_k(M) = c]
8
Proof (cont.)

    Pr[ENC_k(M) = c] = Σ_m' Pr[M = m'] · Pr[ENC_k(m') = c]
                     = Σ_m' Pr[M = m'] · Pr[ENC_k(m) = c]   (by perfect secrecy)
                     = Pr[ENC_k(m) = c]
                     = Pr[ENC_k(M) = c | M = m]

QED.
9
CLAIM: If an encryption scheme is perfectly secure, then the number of keys is at least the size of the message space.
Proof: consider a ciphertext c for some message m. By perfect secrecy, c could be an encryption of every message; that is, for every message m', there exists a key that encrypts m' to c (and decrypts c to m'). Thus c is decryptable to any message, using different keys. But the number of decryptions of c is at most the number of keys.
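The counting in this proof can be made concrete with a toy scheme (an illustrative shift cipher, not from the slides) that has fewer keys than messages, so some ciphertext cannot be "explained" as an encryption of every message:

```python
# Toy scheme with FEWER keys than messages: 4 messages, only 2 keys.
messages = range(4)
keys = [0, 1]

def enc(k, m):
    # Shift cipher mod 4 (illustrative only).
    return (m + k) % 4

c = enc(0, 0)  # a ciphertext of message 0
# Which messages could c decrypt to, over all keys?
explained = {m for m in messages for k in keys if enc(k, m) == c}
# At most one decryption per key, so c cannot explain all 4 messages;
# an adversary seeing c rules some messages out: not perfectly secret.
assert len(explained) <= len(keys) < len(messages)
```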
10
This is bad news: we need to relax the notion of secrecy to be able to use smaller keys.
  • How do we relax the notion?

11
A first attempt
  • We will relax the notion of perfect secrecy.
  • What about statistical security?

Let X and Y be random variables over a finite set S. X and Y are called ε-close if for every event T ⊆ S,

    |Pr[X ∈ T] - Pr[Y ∈ T]| ≤ ε

T is called a statistical test.
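The maximum over all events T is the statistical (total variation) distance, which equals half the L1 distance between the two probability vectors; a minimal sketch:

```python
def statistical_distance(p, q):
    # p, q: dicts mapping outcomes in a finite set S to probabilities.
    # The max over events T of |Pr[X in T] - Pr[Y in T]| is attained at
    # T = {s : p(s) > q(s)} and equals half the L1 distance.
    support = set(p) | set(q)
    return sum(abs(p.get(s, 0) - q.get(s, 0)) for s in support) / 2

# X uniform on {0,1}, Y slightly biased: X and Y are 0.1-close.
X = {0: 0.5, 1: 0.5}
Y = {0: 0.6, 1: 0.4}
assert abs(statistical_distance(X, Y) - 0.1) < 1e-12
```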
12
Statistical Secrecy
  • DEFINITION: An encryption scheme is statistically ε-secure if for every two messages m1 and m2, the random variables (over the choice of key) ENC_k(m1) and ENC_k(m2) are ε-close.
  • Still cannot go much below Shannon's bound.
  • HW: show this!
13
Computational assumptions
  • Probabilistic TM: definition (an additional RANDOM tape which is read-once)
  • P: languages that can be decided in time n^c for some constant c
  • NP: languages that can be verified: given a poly-size witness w, we can check it in time n^c. That is, there exists a relation R(x, w) decidable in P such that x ∈ L iff there exists a witness w with R(x, w) = 1.
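The relation view of NP can be illustrated with SUBSET-SUM (an illustrative choice of language, not from the slides): the witness is the subset, and checking it is one poly-time pass:

```python
def R(x, w):
    # NP relation for SUBSET-SUM: x = (numbers, target),
    # w = indices of a claimed subset summing to the target.
    # Verifying the witness is a single poly-time pass over w.
    numbers, target = x
    return sum(numbers[i] for i in w) == target

x = ([3, 34, 4, 12, 5, 2], 9)
assert R(x, [0, 2, 5])   # 3 + 4 + 2 = 9: valid witness
assert not R(x, [1])     # 34 != 9: not a witness
```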

14
For us, every algorithm is public
  • (no security from obscurity)
  • What do we have as secrets? (randomness!)
  • The need to define PPT (probabilistic polynomial
    time)

15
How do we deal with coin-tosses?
  • RP (randomized poly-time): languages recognizable in polynomial time with 1-sided error: if x ∈ L then Pr[accept] ≥ 1/2; if x ∉ L then Pr[accept] = 0.

16
Dealing with RP
Claim: P ⊆ RP ⊆ NP. WHY? First part: a P machine can simply ignore its coins. Second part: an NP machine can guess the correct coins as its witness.
17
Dealing with RP: amplification
Suppose we are given a machine M s.t. if x ∈ L then Pr[M(x) accepts] ≥ 1/2, and if x ∉ L then Pr[M(x) accepts] = 0.
18
How do we amplify?
Design a new machine M' that runs machine M k times with fresh coin-flips each time. M' accepts iff any run of M accepts; otherwise M' rejects.
What is the probability of acceptance of M' for x in L?
19
Computing the prob of success of M'
M' errs on x ∈ L only if all k runs of M reject:

    Pr[M' errs] ≤ (1/2)^k

Recall that if x ∉ L, M never accepts, so M' never errs there. So, make k sufficiently big!
20
2-sided error
  • BPP (Bounded-error probabilistic poly-time): languages recognizable in polynomial time with 2-sided error: if x ∈ L then Pr[accept] ≥ 2/3; if x ∉ L then Pr[accept] ≤ 1/3.

21
BPP Amplification
We wish to design a new machine M' s.t. Pr[M' errs] is exponentially small in the number of repetitions k.
22
What do we do?
  • Machine M' runs machine M many times with independent coin-flips each time, and takes a MAJORITY vote.
  • If M' runs M k times, what is the probability that M' makes a mistake?

23
Chernoff Bound
Let X1, ..., Xm be independent 0/1 random variables with the same distribution, Pr[Xi = 1] = p. Then for every δ > 0,

    Pr[ |(1/m) Σ Xi - p| ≥ δ ] ≤ 2e^(-2δ²m)
24
Back to BPP
Let Xi = 1 if M makes a mistake on the i-th run. Then

    Pr[Xi = 1] ≤ 1/3   (by def of BPP)

Let's be pessimistic and take it equal to 1/3.
25
Plugging in
If we take a majority vote over k runs, we fail only if at least half of the runs err, i.e., (1/k) Σ Xi ≥ 1/2, which means the average deviates from p = 1/3 by δ ≥ 1/6. By the Chernoff bound,

    Pr[failure] ≤ 2e^(-2(1/6)²k) = 2e^(-k/18)

so the error is exponentially small in k.
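The majority vote can be simulated against the Chernoff estimate, assuming (as on the slides' pessimistic reading) that each run errs with probability exactly 1/3:

```python
import math
import random

def M(x):
    # Stand-in BPP machine: returns the CORRECT answer with probability 2/3,
    # i.e., errs with probability 1/3 (the pessimistic assumption).
    return random.random() < 2 / 3

def majority(x, k):
    # The amplified machine M': run M k times and take a majority vote.
    correct = sum(M(x) for _ in range(k))
    return correct > k / 2

random.seed(1)
k, trials = 99, 2_000
failures = sum(not majority("x", k) for _ in range(trials))
# Chernoff with deviation 1/6 from error rate 1/3: 2e^(-k/18).
chernoff = 2 * math.exp(-k / 18)
assert failures / trials <= chernoff
```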
26
What do we know?
P ⊆ RP ⊆ BPP, and RP ⊆ NP. What about BPP vs NP? This is open.
27
Non-uniformity: the class P/poly
  • Sometimes we need to talk about different input lengths (poly-size circuits, or equivalently a TM with a poly-size advice string for every input length).
  • SOME QUESTIONS:
  • Does it belong to NP?
  • How easy is it?
  • Does it need coin-flips?

28
P/poly
  • Does P/poly belong to NP?
  • Attempt: guess an advice string which makes x accept; if there is such an advice, accept.
  • Why does this not work?

29
P/poly (cont.)
  • How powerful is this class?
  • It includes undecidable languages.
  • Consider some standard enumeration of TMs. Define the poly-size advice for each input length i to encode (un)decidability information about the i-th machine; the resulting language is in P/poly but undecidable.

30
Do we need randomness for P/poly?
Adleman's theorem: BPP ⊆ P/poly.
31
Proof of Adleman's thm.
We assume (after amplification) a BPP machine M s.t. for every x of length n, Pr[M(x) errs] < 2^(-n).
32
Proof that we don't need coins
We want to show that there always exists a single random string r(n) that works for all inputs of length n. How do we show it?
33
Proof (cont.)
M uses r(n) random bits. Build a table with one row per input x of length n and one column per random string r:

    entry = 1 → M makes a mistake on (x, r)
    entry = 0 → M is correct on (x, r)

If there is a column which is good (all 0s) for all x, we are done!
34
Proof: the punch line
Number of 1s ≤ (number of rows) × (number of 1s per row) < 2^n × 2^(-n) × (number of columns) = number of columns.
Thus, the average number of 1s per column is < 1. Thus, some column must have all 0s.
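The counting argument can be checked on a small random table; the per-row bound below (fewer than cols/rows ones per row) plays the role of the amplified error < 2^(-n):

```python
import random

random.seed(2)
n = 4
rows, cols = 2 ** n, 200   # rows: inputs of length n; cols: candidate random strings

# Per-row error bound mirroring the slides: each input errs on fewer
# than cols/rows of the random strings.
max_ones = cols // rows - 1

# Build a random 0/1 error table respecting the per-row bound.
table = []
for _ in range(rows):
    bad = set(random.sample(range(cols), max_ones))
    table.append([1 if j in bad else 0 for j in range(cols)])

# Total 1s <= rows * max_ones < cols, so the average per column is < 1
# and some column must be all zeros: one random string good for every input.
good = [j for j in range(cols) if all(row[j] == 0 for row in table)]
assert good  # at least one all-zero column exists
```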
35
Terminology
  • For us, "easy" means polynomial time with coin-flips
  • Fast = poly-time
  • Probably fast = expected poly-time

36
Terminology (cont.)
  • Las Vegas = always correct, probably fast
  • Monte Carlo = always fast, probably correct, 1-sided error (i.e. RP or co-RP)
  • Atlantic City = always fast, probably correct, 2-sided error (i.e. BPP)

37
Back to Hardness
  • For us, BPP is easy.
  • But in BPP we can amplify the decision of any language that is bounded away from one half by 1/poly.
  • So we must define hard languages that cannot be decided even with a 1/poly advantage.
  • So, how do we define "less than 1/poly"?

38
Examples of "less than 1/poly": e.g. 2^(-n), n^(-log n), 2^(-√n).
39
Try 1: negligible function
Try 1: call g negligible if g(n) < 1/n^c for some fixed constant c > 0 (for all n).
Try 1
Not good. Consider g(n) = 1/n^c itself: that is still an inverse polynomial, i.e., a noticeable (amplifiable) advantage.
41
Try 2
  • For every constant c, require g(n) < 1/n^c once n gets sufficiently big. Then it works!

42
Formally, now
DEFINITION: g : N → R is negligible if for every constant c > 0 there exists N such that for all n > N, g(n) < 1/n^c.
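A numeric sanity check of this definition (a finite stand-in, assuming the standard form "g(n) < 1/n^c for all sufficiently large n"): 2^(-n) passes for every fixed c, while 1/n² does not.

```python
def eventually_below(g, c, lo=1000, hi=10_000):
    # Finite stand-in for "g(n) < 1/n^c for all sufficiently large n":
    # check the inequality on a large window of n.
    return all(g(n) < n ** -c for n in range(lo, hi))

negligible = lambda n: 2.0 ** -n     # 2^(-n): smaller than every 1/n^c
inverse_poly = lambda n: n ** -2.0   # 1/n^2: an inverse polynomial

assert eventually_below(negligible, c=5)
assert not eventually_below(inverse_poly, c=5)
```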
43
Some facts
  • If g(n) is negligible then g(n)·poly(n) is still negligible! (HW: prove it!)
  • We sometimes call any function that is > 1/poly noticeable (since we can amplify).
  • There are functions that are NEITHER negligible NOR noticeable. Then we need to deal with families of input lengths, and typically resort to non-uniform proofs.