6. Trust Negotiations and Trading Privacy for Trust

Transcript and Presenter's Notes


1
6. Trust Negotiations and Trading Privacy for Trust
  • Presented by
  • Prof. Bharat Bhargava
  • Department of Computer Sciences and
  • Center for Education and Research in Information
    Assurance and Security (CERIAS)
  • Purdue University
  • with contributions from
  • Prof. Leszek Lilien
  • Western Michigan University and
  • CERIAS, Purdue University
  • Supported in part by NSF grants IIS-0209059,
    IIS-0242840, ANI-0219110, and Cisco URP grant.

2
Trust Negotiations and Trading Privacy for Trust
  • Outline
  • 1) Introduction
  • 1.1) Privacy and Trust
  • 1.2) The Paradigm of Trust
  • Small World Phenomenon
  • 2) Trust Negotiations
  • 2.1) Symmetric Trust Negotiations
  • a) Privacy-revealing
  • b) Privacy-preserving
  • 2.2) Asymmetric Trust Negotiations
  • a) Weaker Building Trust in Stronger
  • b) Stronger Building Trust in Weaker
  • 2.3) Summary: Trading Information for Trust in
    Symm. and Asymm. Trust Negotiations
  • 3) Trading Privacy Loss for Trust Gain
  • 3.1) Privacy-trust Tradeoff
  • 3.2) Proposed Approach
  • 3.3) PRETTY Prototype for Experimental Studies

3
1) Introduction (1): 1.1) Privacy and Trust
  • Privacy Problem
  • Consider computer-based interactions
  • From a simple transaction to a complex
    collaboration
  • Interactions involve dissemination of private
    data
  • It is voluntary, pseudo-voluntary, or required
    by law
  • Threats of privacy violations result in lower
    trust
  • Lower trust leads to isolation and lack of
    collaboration
  • Trust must be established
  • Data: provide quality and integrity
  • End-to-end communication: sender authentication,
    message integrity
  • Network routing algorithms: deal with malicious
    peers, intruders, security attacks

4
1) Introduction (2)1.2) The Paradigm of Trust
  • Trust: a paradigm of security for open computing
    environments (such as the Web)
  • Replaces/enhances CIA (confidentiality /
    integrity / availability) as one of the means for
    achieving security
  • But not as one of the goals of security (as
    CIA are)
  • Trust is a powerful paradigm
  • Well tested in social models of interaction and
    systems
  • Trust is pervasive
  • Constantly, if often unconsciously, applied in
    interactions between
  • people / businesses / institutions / animals
    (e.g. a guide dog) /
  • artifacts (sic! e.g. Can I rely on my car for
    this long trip?)
  • Able to simplify security solutions
  • By reducing complexity of interactions among
    human and artificial system components

5
1) Introduction (3): Small World Phenomenon
  • Small-world phenomenon [Milgram, 1967]
  • Find chains of acquaintances linking any two
    randomly chosen people in the United States who
    do not know one another (remember the Erdös
    number?)
  • Result: the average number of intermediate steps
    in a successful chain was between 5 and 6 ⇒ the
    "six degrees of separation" principle
  • Relevance to security research [Capkun et al.,
    2002]
  • A graph exhibits the small-world phenomenon if
    (roughly speaking) any two vertices in the graph
    are likely to be connected through a short
    sequence of intermediate vertices
  • Trust is useful because it inherently
    incorporates the small-world phenomenon

6
2) Trust Negotiations
  • Trust negotiations
  • Establish mutual trust between interacting
    parties
  • Types of trust negotiations [L. Lilien and B.
    Bhargava, 2006]
  • 2.1) Symmetric trust negotiations
  • - partners of similar strength
  • Overwhelmingly popular in the literature
  • 2.2) Asymmetric trust negotiations
  • - a weaker and a stronger partner
  • Identified by us (as far as we know)

7
Symmetric and Asymmetric Trust Negotiations
  • 2.1) Symmetric trust negotiations
  • Two types
  • a) Symmetric privacy-revealing negotiations
  • Disclose certificates or policies to the
    partner
  • b) Symmetric privacy-preserving negotiations
  • Preserve privacy of certificates and policies
  • Examples
  • Individual to individual / most B2B / ...
  • 2.2) Asymmetric trust negotiations
  • Two types
  • a) Weaker Building Trust in Stronger
  • b) Stronger Building Trust in Weaker
  • Examples
  • Individual to institution / small business to
    large business / ...

8
2.1) Symmetric Trust Negotiations (1)
  • Two types of symmetric trust negotiations
  • a) Privacy-revealing
  • b) Privacy-preserving
  • a) Privacy-revealing symmetric trust negotiations
  • Both reveal certificates or policies to the other
    partner
  • Growth of trust
  • An initial degree of trust necessary
  • Must trust enough to reveal (some) certificates /
    policies right away
  • Stepwise trust growth in each other as more
    (possibly private) info about each other revealed
  • Proportional to the number of certificates
    revealed to each other (see the sketch below)
  • Eventually full mutual trust established (when
    negotiation succeeds)
  • Full for the task at hand
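A minimal sketch of this proportional growth, assuming (our assumption, not stated on the slide) that trust grows linearly with the fraction of required certificates revealed:

    # Assumed linear model: trust grows with each certificate revealed,
    # reaching full trust (for the task at hand) when negotiation succeeds.
    def trust_level(revealed: int, required: int) -> float:
        if required <= 0:
            raise ValueError("required must be positive")
        return min(revealed / required, 1.0)   # 1.0 = full trust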

9
2.1) Symmetric Trust Negotiations (2)
  • b) Privacy-preserving symmetric trust
    negotiations
  • Both preserve privacy of certificates and
    policies
  • Growth of trust
  • Initial distrust
  • No one wants to reveal any info to the partner
  • No intermediate trust growth (no intermediate
    degrees of trust established)
  • Instead, jump from distrust to trust
  • Eventually full mutual trust established (when
    negotiation succeeds)
  • Full for the task at hand

10
2.2) Asymmetric Trust Negotiations
  • Weaker and Stronger build trust in each other
  • a) Weaker building trust in Stronger: a priori
  • E.g., a customer looking for a mortgage loan
    first selects a reputable bank, only then starts
    negotiations
  • b) Stronger building trust in Weaker: in real
    time
  • E.g., the bank asks the customer for a lot of
    private info (incl. personal income and tax data,
    ...) to establish trust in her

11
2.2) Asymmetric Trust Negotiations (2):
a) Weaker Building Trust in Stronger
  • Means of building trust by Weaker in Stronger (a
    priori)
  • Ask around
  • Family, friends, co-workers, ...
  • Check partner's history and stated philosophy
  • Accomplishments, failures and associated
    recoveries, ...
  • Mission, goals, policies (incl. privacy
    policies), ...
  • Observe partner's behavior
  • Trustworthy or not, stable or not, ...
  • Problem: needs time for a fair judgment
  • Check reputation databases
  • Better Business Bureau, consumer advocacy
    groups, ...
  • Verify partner's credentials
  • Certificates and awards, memberships in
    trust-building organizations (e.g., BBB), ...
  • Protect yourself against partner's misbehavior
  • Trusted third party, security deposit,
    prepayment, buying insurance, ...

12
2.2) Asymmetric Trust Negotiations (3):
b) Stronger Building Trust in Weaker
  • Means of building trust by Stronger in Weaker
    (in real time)
  • Ask partner for an anonymous payment for goods or
    services
  • Cash / Digital cash / Other
  • Ask partner for a non-anonymous payment for goods
    or services
  • Credit card / Traveler's checks / Other
  • Ask partner for specific private information
  • Check partner's credit history
  • Computer authorization subsystem observes
    partner's behavior
  • Trustworthy or not, stable or not, ...
  • Problem: needs time for a fair judgment
  • Computerized trading system checks partner's
    records in reputation databases
  • eBay, PayPal, ...
  • Computer system verifies partner's digital
    credentials
  • Passwords, magnetic and chip cards, biometrics, ...
  • Business protects itself against partner's
    misbehavior
  • Trusted third party, security deposit,
    prepayment, buying insurance, ...
  • Note: above the blue line (in the original slide)
    anonymity is preserved; below it, identity is
    revealed

13
Trust Growth in Asymmetric Trust Negotiations
  • When/how can partners trust each other?
  • Initially, Weaker has full trust in Stronger
  • Weaker must trust Stronger fully to be ready
    to reveal all the private information required to
    gain Stronger's full trust
  • Weaker trades a (degree of) privacy loss for a
    (degree of) trust gain as perceived by Stronger
  • The next degree of privacy is lost when the next
    certificate is revealed to Stronger
  • Exception: no privacy loss in the
    anonymity-preserving example in "Stronger
    Building Trust in Weaker"
  • Eventually, full trust of Stronger in Weaker is
    established when the negotiation completes

14
2.3) Summary: Trading Information for Trust in
Symm. and Asymm. Negotiations
  • When/how can partners trust each other?
  • Symmetric disclosing
  • Initial degree of trust / stepwise trust growth /
    establishes full mutual trust
  • Trades private info for trust (degree of info
    privacy varies from 0 to 100%)
  • Symmetric preserving (from distrust to
    trust)
  • Initial distrust / no stepwise trust growth /
    establishes full mutual trust
  • No trading of private info for trust (degree of
    info privacy varies from 0 to 100%)
  • Asymmetric
  • Initial full trust of Weaker in Stronger and
    no trust of Stronger in Weaker / stepwise trust
    growth of Stronger / establishes full trust of
    Stronger in Weaker
  • Trades private info for trust (degree of info
    privacy varies from 0 to 100%)

15
3) Trading Privacy Loss for Trust Gain
  • We're focusing on asymmetric trust negotiations
  • Trading privacy for trust
  • Approach to trading privacy for trust
  • [Zhong and Bhargava, Purdue]
  • Formalize the privacy-trust tradeoff problem
  • Estimate privacy loss due to disclosing a
    credential set
  • Estimate trust gain due to disclosing a
    credential set
  • Develop algorithms that minimize privacy loss for
    required trust gain
  • Because nobody likes losing more privacy than
    necessary
  • More details available

16
Related Work
  • Automated trust negotiation (ATN) [Yu,
    Winslett, and Seamons, 2003]
  • Tradeoff between the length of the negotiation,
    the amount of information disclosed, and the
    computation effort
  • Trust-based decision making [Wegella et al.,
    2003]
  • Trust lifecycle management, with considerations
    of both trust and risk assessments
  • Trading privacy for trust [Seigneur and
    Jensen, 2004]
  • Privacy as the linkability of pieces of evidence
    to a pseudonym; measured by using "nymity"
    [Goldberg, thesis, 2000]

17
Proposed Approach
  • Formulate the privacy-trust tradeoff problem
  • Estimate privacy loss due to disclosing a set of
    credentials
  • Estimate trust gain due to disclosing a set of
    credentials
  • Develop algorithms that minimize privacy loss for
    required trust gain

18
A. Formulate Tradeoff Problem
  • Set of private attributes that the user wants to
    conceal
  • Set of credentials
  • Subset of revealed credentials R
  • Subset of unrevealed credentials U
  • Choose a subset of credentials NC from U such
    that:
  • NC satisfies the requirements for trust building
  • PrivacyLoss(NC ∪ R) - PrivacyLoss(R) is minimized
    (see the sketch below)
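A brute-force sketch of this selection, assuming caller-supplied oracles satisfies_trust and privacy_loss (hypothetical names; the slides do not fix an interface):

    from itertools import combinations

    def select_credentials(U, R, satisfies_trust, privacy_loss):
        # Choose NC ⊆ U that meets the trust requirement while minimizing
        # PrivacyLoss(NC ∪ R) - PrivacyLoss(R).
        base = privacy_loss(set(R))
        best, best_cost = None, float("inf")
        for k in range(len(U) + 1):                    # all subset sizes
            for nc in map(set, combinations(U, k)):
                if satisfies_trust(nc | set(R)):
                    cost = privacy_loss(nc | set(R)) - base
                    if cost < best_cost:
                        best, best_cost = nc, cost
        return best, best_cost

Exhaustive enumeration is exponential in the number of unrevealed credentials; the algorithms of step D are presumably smarter, but the slides do not specify them.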

19
Steps B-D of the Approach
  • Estimate privacy loss due to disclosing a set of
    credentials
  • Requires defining privacy metrics
  • Estimate trust gain due to disclosing a set of
    credentials
  • Requires defining trust metrics
  • Develop algorithms that minimize privacy loss for
    required trust gain
  • Includes prototyping and experimentation
  • -- Details in another lecture of the series --

20
3.3) PRETTY Prototype for Experimental Studies
[Architecture diagram: numbered flows (1)-(4) with sub-steps 2a-2d among the PRETTY components (user application, server application with TERA server and privacy negotiators); (<nr>) denotes an unconditional path, <nr> a conditional path. TERA = Trust-Enhanced Role Assignment.]
21
Information Flow in PRETTY
  • User application sends query to server
    application.
  • Server application sends user information to TERA
    server for trust evaluation and role assignment.
  • If a higher trust level is required for the query,
    the TERA server sends a request for more of the
    user's credentials to the privacy negotiator.
  • Based on the server's privacy policies and the
    credential requirements, the server's privacy
    negotiator interacts with the user's privacy
    negotiator to build a higher level of trust.
  • The trust gain and privacy loss evaluator selects
    credentials that will increase trust to the
    required level with the least privacy loss. The
    calculation considers credential requirements and
    credentials disclosed in previous interactions.
  • According to privacy policies and the calculated
    privacy loss, the user's privacy negotiator
    decides whether or not to supply credentials to
    the server.
  • Once the trust level meets the minimum
    requirements, appropriate roles are assigned to
    the user for execution of the query.
  • Based on query results, the user's trust level
    and privacy policies, the data disseminator
    determines (i) whether to distort data and, if
    so, to what degree, and (ii) what privacy
    enforcement metadata should be associated with it.
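A highly condensed sketch of this flow; all object and method names below are invented for illustration, since the prototype's actual interfaces are not shown in the slides:

    def handle_query(user, server, tera, query):
        # (1)-(2): server forwards user info to TERA for trust evaluation
        level = tera.evaluate_trust(user.revealed)
        # (3)-(6): negotiate additional credentials until trust suffices
        while level < tera.required_level(query):
            wanted = server.privacy_negotiator.request_credentials(query)
            offer = user.privacy_negotiator.least_loss_offer(wanted)
            if offer is None:                  # user declines; negotiation fails
                return None
            user.revealed |= offer
            level = tera.evaluate_trust(user.revealed)
        # (7): assign role; (8): disseminate, possibly distorting the data
        role = tera.assign_role(user, level)
        return server.data_disseminator.respond(query, role, user)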

22
References
  • L. Lilien and B. Bhargava, "A Scheme for
    Privacy-Preserving Data Dissemination," IEEE
    Transactions on Systems, Man and Cybernetics,
    Part A: Systems and Humans, Vol. 36(3), May 2006,
    pp. 503-506.
  • B. Bhargava, L. Lilien, A. Rosenthal, and
    M. Winslett, "Pervasive Trust," IEEE Intelligent
    Systems, Sept./Oct. 2004, pp. 74-77.
  • B. Bhargava and L. Lilien, "Private and Trusted
    Collaborations," Secure Knowledge Management (SKM
    2004): A Workshop, 2004.
  • B. Bhargava, C. Farkas, L. Lilien, and F. Makedon,
    "Trust, Privacy, and Security: Summary of a
    Workshop Breakout Session at the National Science
    Foundation Information and Data Management (IDM)
    Workshop held in Seattle, Washington, September
    14-16, 2003," CERIAS Tech Report 2003-34, CERIAS,
    Purdue University, Nov. 2003.
    http://www2.cs.washington.edu/nsf2003 or
    https://www.cerias.purdue.edu/tools_and_resources/bibtex_archive/archive/2003-34.pdf
  • "Internet Security Glossary," The Internet
    Society, Aug. 2004. www.faqs.org/rfcs/rfc2828.html
  • "Sensor Nation: Special Report," IEEE Spectrum,
    vol. 41, no. 7, 2004.
  • R. Khare and A. Rifkin, "Trust Management on the
    World Wide Web," First Monday, vol. 3, no. 6,
    1998. www.firstmonday.dk/issues/issue3_6/khare
  • M. Richardson, R. Agrawal, and P. Domingos,
    "Trust Management for the Semantic Web," Proc.
    2nd Int'l Semantic Web Conf., LNCS 2870,
    Springer-Verlag, 2003, pp. 351-368.
  • P. Schiegg et al., "Supply Chain Management
    Systems: A Survey of the State of the Art,"
    Collaborative Systems for Production Management:
    Proc. 8th Int'l Conf. Advances in Production
    Management Systems (APMS 2002), IFIP Conf. Proc.
    257, Kluwer, 2002.
  • N.C. Romano Jr. and J. Fjermestad, "Electronic
    Commerce Customer Relationship Management: A
    Research Agenda," Information Technology and
    Management, vol. 4, nos. 2-3, 2003, pp. 233-258.

23
End
24
Using Entropy to Trade Privacy for Trust
  • Yuhui Zhong
  • Bharat Bhargava
  • {zhong, bb}@cs.purdue.edu
  • Department of Computer Sciences
  • Purdue University

This work is supported by NSF grant IIS-0209059
25
Problem motivation
  • Privacy and trust form an adversarial
    relationship
  • Internet users worry about revealing personal
    data. This fear held back $15 billion in online
    revenue in 2001
  • Users have to provide digital credentials that
    contain private information in order to build
    trust in open environments like the Internet.
  • Research is needed to quantify the tradeoff
    between privacy and trust

26
Subproblems
  • How much privacy is lost by disclosing a
    credential?
  • How much does a user benefit from having a higher
    level of trust?
  • How much privacy is a user willing to sacrifice
    for a certain amount of trust gain?

27
Proposed approach
  • Formulate the privacy-trust tradeoff problem
  • Design metrics and algorithms to evaluate the
    privacy loss. We consider
  • Information receiver
  • Information usage
  • Information disclosed in the past
  • Estimate trust gain due to disclosing a set of
    credentials
  • Develop mechanisms empowering users to trade
    privacy for trust.
  • Design prototype and conduct experimental study

28
Related work
  • Privacy Metrics
  • Anonymity set without accounting for probability
    distribution [Reiter and Rubin, 99]
  • Differential entropy to measure how well an
    attacker estimates an attribute value [Agrawal
    and Aggarwal, 01]
  • Automated trust negotiation (ATN) [Yu, Winslett,
    and Seamons, 03]
  • Tradeoff between the length of the negotiation,
    the amount of information disclosed, and the
    computation effort
  • Trust-based decision making [Wegella et al., 03]
  • Trust lifecycle management, with considerations
    of both trust and risk assessments
  • Trading privacy for trust [Seigneur and Jensen,
    04]
  • Privacy as the linkability of pieces of evidence
    to a pseudonym; measured by using "nymity"
    [Goldberg, thesis, 00]

29
Formulation of tradeoff problem (1)
  • Set of private attributes that the user wants to
    conceal
  • Set of credentials
  • R(i): subset of credentials revealed to receiver i
  • U(i): subset of credentials unrevealed to
    receiver i
  • Credential set with minimal privacy loss
  • A subset of credentials NC from U(i) such that
  • NC satisfies the requirements for trust building
  • PrivacyLoss(NC ∪ R(i)) - PrivacyLoss(R(i)) is
    minimized

30
Formulation of tradeoff problem (2)
  • Decision problem
  • Decide whether or not to trade privacy for trust
  • Determine minimal privacy damage
  • Minimal privacy damage is a function of minimal
    privacy loss, information usage, and the
    trustworthiness of the information receiver.
  • Compute trust gain
  • Trade privacy for trust if trust gain > minimal
    privacy damage (see the sketch below)
  • Selection problem
  • Choose the credential set with minimal privacy
    loss
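A sketch combining the decision and selection problems (names illustrative; damage_of stands for the minimal-privacy-damage function described above):

    def decide_and_select(candidates, R, privacy_loss, damage_of, trust_gain):
        # candidates: credential subsets that already satisfy the trust
        # requirement; pick the one with minimal marginal privacy loss.
        def marginal(nc):
            return privacy_loss(nc | R) - privacy_loss(R)
        best = min(candidates, key=marginal)           # selection problem
        if trust_gain > damage_of(marginal(best)):     # decision problem
            return best
        return None                                    # do not trade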

31
Formulation of tradeoff problem (3)
  • Collusion among information receivers
  • Use a global version Rg instead of R(i)
  • Minimal privacy loss for multiple private
    attributes
  • nc1 may be better for attr1 but worse for attr2
    than nc2
  • Weight vector {w1, w2, ..., wm} corresponds to
    the sensitivity of the attributes
  • E.g., salary is more sensitive than a favorite
    TV show
  • Privacy loss can be evaluated using
  • The weighted sum of privacy loss for all
    attributes
  • The privacy loss for the attribute with the
    highest weight
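A sketch of the two aggregation options just listed, given hypothetical per-attribute losses and sensitivity weights:

    def aggregate_loss(losses, weights, mode="weighted_sum"):
        # losses[i]: privacy loss for attribute i; weights[i]: its sensitivity.
        if mode == "weighted_sum":
            return sum(w * l for w, l in zip(weights, losses))
        # Alternative: report the loss of the most sensitive attribute.
        i = max(range(len(weights)), key=lambda j: weights[j])
        return losses[i]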

32
Two types of privacy loss (1)
  • Query-independent privacy loss
  • User determines her private attributes
  • Query-independent loss characterizes how much the
    provided credentials help an adversary determine
    the probability density or probability mass
    function of a private attribute.

33
Two types of privacy loss (2)
  • Query-dependent privacy loss
  • User determines a set of potential queries Q that
    she is reluctant to answer
  • Provided credentials reveal information about
    attribute set A; Q is a function of A.
  • Query-dependent loss characterizes how much the
    provided credentials help an adversary determine
    the probability density or probability mass
    function of Q.

34
Observation 1
  • High query-independent loss does not necessarily
    imply high query-dependent loss
  • An abstract example

35
Observation 2
  • Privacy loss is affected by the order of
    disclosure
  • Example
  • Private attribute
  • age
  • Potential queries
  • (Q1) Is Alice an elementary school student?
  • (Q2) Is Alice older than 50 to join a silver
    insurance plan?
  • Credentials
  • (C1) Driver license
  • (C2) Purdue undergraduate student ID

36
Example (1)
37
Example (2)
  • C1 → C2
  • Disclosing C1
  • low query-independent loss (wide range for age)
  • 100% loss for Query 1 (elem. school student)
  • low loss for Query 2 (silver plan)
  • Disclosing C2
  • high query-independent loss (narrow range for
    age)
  • zero loss for Query 1 (that privacy was already
    lost by disclosing the license)
  • high loss for Query 2 (answer shifts from "not
    sure" to "no" with high probability)
  • C2 → C1
  • Disclosing C2
  • low query-independent loss (wide range for age)
  • 100% loss for Query 1 (elem. school student)
  • high loss for Query 2 (silver plan)
  • Disclosing C1
  • high query-independent loss (narrow range of age)
  • zero loss for Query 1 (that privacy was already
    lost by disclosing the ID)
  • zero loss for Query 2

38
Entropy-based privacy loss
  • Entropy measures the randomness, or uncertainty,
    in private data.
  • When an adversary gains more information,
    entropy decreases
  • The difference shows how much information has
    been leaked
  • Conditional probability is needed for entropy
    evaluation
  • Bayesian networks, kernel density estimation or
    subjective estimation can be adopted
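A minimal sketch of this entropy-difference measure for a discrete attribute (estimating the distributions themselves is left to the methods named above):

    import math

    def entropy(pmf):
        # Shannon entropy (in bits) of a discrete distribution
        return -sum(p * math.log2(p) for p in pmf if p > 0)

    def privacy_loss(pmf_before, pmf_after):
        # Information leaked = drop in uncertainty after disclosure
        return entropy(pmf_before) - entropy(pmf_after)

    # Example: a credential narrows age from 4 equally likely ranges to 2:
    print(privacy_loss([0.25] * 4, [0.5, 0.5]))   # 1.0 bit leaked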

39
Estimation of query-independent privacy loss
  • Single attribute
  • Domain of attribute a: {v1, v2, ..., vk}
  • Pi and Pi' are the probability mass functions of
    a before and after disclosing NC, given the
    revealed credential set R
  • Multiple attributes
  • Attribute set {a1, a2, ..., an} with sensitivity
    vector {w1, w2, ..., wn}
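The formulas themselves did not survive the transcript; a plausible reconstruction, consistent with the entropy-difference idea of the previous slide (our reconstruction, not verbatim from the slides):

    Single attribute a with domain {v1, ..., vk}:
      PrivacyLoss(a) = -\sum_{i=1}^{k} P_i \log_2 P_i + \sum_{i=1}^{k} P'_i \log_2 P'_i
    Multiple attributes {a1, ..., an} with weights {w1, ..., wn}:
      PrivacyLoss = \sum_{j=1}^{n} w_j \cdot PrivacyLoss(a_j)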

40
Estimation of query-dependent privacy loss
  • Single query Q
  • Q is a function f of attribute set A
  • Domain of f(A): {qv1, qv2, ..., qvk}
  • Multiple queries
  • Query set {q1, q2, ..., qn} with sensitivity
    vector {w1, w2, ..., wn}
  • Pri is the probability that query qi is asked
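Here too the formulas were lost; a reconstruction consistent with slide 39, where weighting each query by both its sensitivity w_i and the probability Pr_i that it is asked is our assumption:

    Single query Q = f(A) with domain {qv1, ..., qvk}:
      PrivacyLoss(Q) = -\sum_{i=1}^{k} P_i \log_2 P_i + \sum_{i=1}^{k} P'_i \log_2 P'_i
      (P_i, P'_i taken over the distribution of f(A))
    Multiple queries {q1, ..., qn}:
      PrivacyLoss = \sum_{i=1}^{n} Pr_i \cdot w_i \cdot PrivacyLoss(q_i)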

41
Estimate privacy damage
  • Assume the user provides one damage function
    d_usage(PrivacyLoss) for each information usage
  • PrivacyDamage(PrivacyLoss, Usage, Receiver) =
    D_max(PrivacyLoss) × (1 - Trust_receiver) +
    d_usage(PrivacyLoss) × Trust_receiver
  • Trust_receiver is a number in [0, 1] representing
    the trustworthiness of the information receiver
  • D_max(PrivacyLoss) = max over all usages of
    d_usage(PrivacyLoss)
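This damage model translated directly into code (a sketch; the per-usage damage functions are user-supplied):

    def privacy_damage(loss, usage, trust_receiver, d):
        # d: dict mapping each usage to its damage function d_usage(loss)
        # trust_receiver in [0, 1]: trustworthiness of the receiver
        d_max = max(fn(loss) for fn in d.values())     # worst case over usages
        return d_max * (1 - trust_receiver) + d[usage](loss) * trust_receiver

An untrusted receiver (trust_receiver = 0) is charged the worst-case damage over all usages; a fully trusted one only the damage of the declared usage.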

42
Estimate trust gain
  • Increasing trust level
  • Adopt research on trust establishment and
    management
  • Benefit function TB(trust_level)
  • Provided by service provider or derived from
    users utility function
  • Trust gain
  • TrustGain = TB(trust_level_new) -
    TB(trust_level_prev)

43
PRETTY Prototype for Experimental Studies
[Architecture diagram, same as slide 20: numbered flows (1)-(4) with sub-steps 2a-2d; (<nr>) denotes an unconditional path, <nr> a conditional path. TERA = Trust-Enhanced Role Assignment.]
44
Information flow for PRETTY
  • User application sends query to server
    application.
  • Server application sends user information to TERA
    server for trust evaluation and role assignment.
  • If a higher trust level is required for the query,
    the TERA server sends a request for more of the
    user's credentials to the privacy negotiator.
  • Based on the server's privacy policies and the
    credential requirements, the server's privacy
    negotiator interacts with the user's privacy
    negotiator to build a higher level of trust.
  • The trust gain and privacy loss evaluator selects
    credentials that will increase trust to the
    required level with the least privacy loss. The
    calculation considers credential requirements and
    credentials disclosed in previous interactions.
  • According to privacy policies and the calculated
    privacy loss, the user's privacy negotiator
    decides whether or not to supply credentials to
    the server.
  • Once the trust level meets the minimum
    requirements, appropriate roles are assigned to
    the user for execution of the query.
  • Based on query results, the user's trust level
    and privacy policies, the data disseminator
    determines (i) whether to distort data and, if
    so, to what degree, and (ii) what privacy
    enforcement metadata should be associated with it.

45
Conclusion
  • This research addresses the tradeoff issues
    between privacy and trust.
  • Tradeoff problems are formally defined.
  • An entropy-based approach is proposed to estimate
    privacy loss.
  • A prototype is under development for experimental
    study.