Privacy and Cyberspace - PowerPoint PPT Presentation



Privacy and Cyberspace
  • Are privacy issues unique to cybertechnology?
  • Four characteristics are worth noting:
  • The amount of personal information that can be
    gathered using cybertechnology.
  • The speed at which personal information can be
    transmitted using cybertechnology.
  • The duration of time in which the information can
    be retained because of cybertechnology.
  • The kind of information that can now be
    transferred because of cybertechnology.

What is Personal Privacy?
  • Privacy is a concept that is neither clearly
    understood nor easily defined.
  • Sometimes we speak of one's privacy as something
    that has been
  • "lost,"
  • "diminished,"
  • "intruded upon,"
  • "invaded,"
  • "violated,"
  • "breached," and so forth.

What is Privacy (continued)?
  • Privacy is sometimes viewed as an
    "all-or-nothing" concept, that is, something
    that one either has (totally) or does not have.
  • At other times, privacy is viewed as something
    that can be diminished.
  • For example, as a repository of personal
    information that can be eroded gradually.

Table 5-1 Three Theories of Privacy
A Comprehensive Account of Privacy
  • Moor (1997) has introduced a theory of privacy
    that incorporates important elements of the
    non-intrusion, non-interference, and
    informational views of privacy.
  • According to Moor
  • an individual has privacy in a situation if in
    that particular situation the individual is
    protected from intrusion, interference, and
    information access by others. [italics added]

Moor's Theory of Privacy (continued)
  • An important aspect in this definition is Moor's
    notion of a situation.
  • A situation is left deliberately broad so that it
    can apply to a range of contexts or "zones."
  • Situations can be "declared private" in a
    normative sense.
  • For example, a situation can be an "activity," a
    "relationship," or the "storage and access of
    information" in a computer or on the Internet.

Moor's Privacy Theory (continued)
  • Moor's distinction between naturally private and
    normatively private situations enables us to
    differentiate between the conditions required for
  • (a) having privacy (in a descriptive sense)
  • (b) having a right to privacy.
  • With this distinction we can differentiate
    between a
  • loss of privacy
  • violation of privacy.

Two Scenarios
  • Scenario 1: Someone walks into the computer lab
    and sees you using a computer.
  • Your privacy is lost but not violated.
  • Scenario 2: Someone peeps through the keyhole of
    your apartment door and sees you using a
    computer.
  • Your privacy is not only lost but is violated.

Why is Privacy Important?
  • What kind of value is privacy?
  • Is it one that is universally valued?
  • Is privacy valued mainly in Western
    industrialized societies, where greater
    importance is placed on individuals?
  • Is privacy something that is valued for its own
    sake, i.e., an intrinsic value?
  • Is it valued as a means to an end, in which case
    it has only instrumental worth?

Privacy as a Universal Value
  • Not valued the same in all cultures.
  • Less valued in non-Western nations and in rural
    areas.
  • Less valued in some democratic societies (such as
    Israel) where security and safety are important.
  • Has at least some value in all societies.

Is Privacy an Intrinsic or Instrumental Value?
  • Not valued for its own sake.
  • But is more than an instrumental value in the
    sense that it is necessary (rather than merely
    contingent) for achieving important human ends.
  • Fried: privacy is necessary for human ends such
    as trust and friendship.
  • Moor: privacy is an expression of the core value
    of security.

Privacy as an Important Social Value
  • Privacy is important for a diversity of
    relationships (from intimate to casual).
  • It is important for democracy.
  • Privacy is an important social, as well as an
    individual, value.
  • Regan (1995): we need to understand the
    importance of privacy as a social value.

Three Ways Privacy is Threatened by Cybertechnology
  • (A) data-gathering techniques used to collect and
    record personal information, often without the
    knowledge and consent of users.
  • (B) data-exchanging techniques used to transfer
    and exchange personal data across and between
    computer databases, typically without the
    knowledge and consent of users.
  • (C) data-mining techniques used to search for
    patterns implicit in large databases in order to
    generate consumer profiles based on behavioral
    patterns discovered in certain groups.

Gathering Personal Data
  • Personal data has been gathered since Roman times
    (census data).
  • Dataveillance: a term coined by Roger Clarke
    to capture two techniques made possible by
    computer technology:
  • (a) surveillance (data-monitoring), and
  • (b) data-recording.

Dataveillance (Continued)
  • Video cameras monitor individuals' physical
    movements when they shop at certain department
    stores.
  • Some motorists are now subject to new schemes of
    highway surveillance while driving in their motor
    vehicles, because of new forms of scanning
    devices such as E-ZPass.
  • Even "clickstreams" (the keystrokes and mouse
    clicks entered by a Web site visitor) can be
    monitored and recorded.

Internet Cookies
  • Cookies are files that Web sites send to and
    retrieve from the computer systems of Web users.
  • Cookies technology enables Web site owners to
    collect certain kinds of data about the users who
    access their sites.
  • Because of "cookies technology," information
    about an individual's on-line browsing
    preferences can be "captured" whenever a person
    visits a Web site.

Cookies (Continued)
  • The data recorded (via cookies) about the user is
    then stored on a file placed on the hard drive of
    the user's computer system.
  • No other data-gathering mechanism actually stores
    the data it collects on the user's computer.
  • The information can then be retrieved from the
    user's system and resubmitted to a Web site the
    next time the user accesses that site.
  • The exchange of data typically occurs without a
    user's knowledge and consent.
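The exchange described above can be sketched with Python's standard library. The cookie name and preference value are hypothetical illustrations; real sites set many more attributes (domain, path, expiry).

```python
# Minimal sketch of the cookie exchange described above, using only
# Python's standard library. The cookie name and value are hypothetical.
from http.cookies import SimpleCookie

# 1. First visit: the Web site sends a Set-Cookie header, and the
#    browser stores the value in a file on the user's own machine.
server = SimpleCookie()
server["browsing_prefs"] = "sports-deals"
header = server.output(header="Set-Cookie:")

# 2. Next visit: the browser reads the stored file and resubmits the
#    value, typically without the user's knowledge or consent.
browser = SimpleCookie()
browser.load("browsing_prefs=sports-deals")
retrieved = browser["browsing_prefs"].value
```

Note that the user's machine, not the site, holds the data between visits, which is exactly the point the slide makes.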

Can Cookies be Defended?
  • Web sites that use cookies maintain that they are
    performing a service for repeat users of a Web
    site by customizing a user's means of information
    retrieval.
  • They also point out that, because of cookies,
    they are able to provide a user with a list of
    preferences for future visits to that Web site.

Arguments Against Cookies
  • Privacy advocates argue that the monitoring and
    recording of an individual's activities while
    visiting a Web site, and the subsequent
    downloading of that information onto a user's PC
    (without informing the user), violate privacy.
  • They also point out that information gathered
    about a user via cookies can eventually be
    acquired by on-line advertising agencies, who
    could then target that user for on-line ads.

Computerized Merging and Matching Operations
  • Computer merging is a technique of extracting
    information from two or more unrelated databases,
    which contain data about some individual or group
    of individuals, and incorporating it into a
    composite file.
  • Computer merging occurs whenever two or more
    disparate pieces of information contained in
    separate databases are combined.
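The definition above can be sketched in a few lines of Python. All names, databases, and record fields here are hypothetical illustrations.

```python
# Minimal sketch of computer merging as defined above: data about one
# individual, held in unrelated databases, is combined into a single
# composite file. All records are hypothetical.
lender_db = {"lee": {"income": 48000, "credit_history": "excellent"}}
insurer_db = {"lee": {"age": 35, "medical_history": "asthma"}}

def merge_records(person, *databases):
    """Combine every database's entry for `person` into one profile."""
    composite = {}
    for db in databases:
        composite.update(db.get(person, {}))
    return composite

# The composite holds a combination of data that the person never
# authorized any single organization to possess.
profile = merge_records("lee", lender_db, insurer_db)
```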

Computer Merging
  • Consider a scenario in which you voluntarily give
    information about yourself to three different
    organizations.
  • First, you give information about your income and
    credit history to a lending institution in order
    to secure a loan.
  • You next give information about your age and
    medical history to an insurance company to
    purchase life insurance.
  • You then give information about your views on
    certain social issues to a political organization
    you wish to join.

Computer Merging (continued)
  • Each organization has a legitimate need for
    information to make decisions about you.
  • Insurance companies have a legitimate need to
    know about your age and medical history before
    agreeing to sell you life insurance.
  • Lending institutions have a legitimate need to
    know information about your income and credit
    history before agreeing to lend you money to
    purchase a house or a car.

Computer Merging (continued)
  • Suppose that, without your knowledge and consent,
    information about you contained in the insurance
    company's database is merged with information
    about you that resided in the lending
    institution's database or in the political
    organization's database.
  • You voluntarily gave certain information about
    yourself to three different organizations.
  • You authorized each organization to have the
    specific information you voluntarily granted.
  • However, it does not follow that you thereby
    authorized any one organization to have some
    combination of that information.

Computer Merging (continued)
  • Case Illustration:
  • DoubleClick, an on-line advertising company,
    attempted to purchase Abacus, Inc., an off-line
    database company.
  • Double-Click would have been able to merge
    on-line and off-line records.

Computer Matching
  • Computer matching is a technique that involves
    cross-checking information in two or more
    databases that are typically unrelated in order
    to produce certain "matching records" or "hits."
  • Matching or cross-referencing records in two or
    more databases in order to generate one or more
    hits is used for the express purpose of creating
    a new file, which typically contains a list of
    potential law violators.
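The cross-referencing step described above amounts to a set intersection. A minimal sketch, with hypothetical identifiers:

```python
# Minimal sketch of computer matching as described above:
# cross-reference two unrelated record sets and report "hits".
# All identifiers are hypothetical.
welfare_recipients = {"alice", "bob", "carol"}
luxury_vehicle_owners = {"bob", "dave"}

# A "hit" is any identifier that appears in both databases; together
# the hits form a new file of potential law violators.
hits = sorted(welfare_recipients & luxury_vehicle_owners)
```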

Computer Matching (continued)
  • In federal and state government applications,
    computerized matching has been used by various
    agencies and departments to identify
  • potential law violators
  • individuals who have actually broken the law or
    who are suspected of having broken the law
    (welfare cheats, deadbeat parents, etc.).

Computer Matching (continued)
  • A scenario could be federal income tax records
    matched against state motor vehicle registration
    records (looking for low income and expensive
    cars).
  • Consider an analogy in physical space in which
    your mail is monitored and secretly matched or
    opened by authorities.

Computer Matching (continued)
  • Those who defend matching argue
  • If you have nothing to hide, you have nothing to
    worry about.
  • Another argument is
  • Privacy is a legal right.
  • Legal rights are not absolute.
  • When one violates the law (i.e., commits a
    crime), one forfeits one's legal rights.
  • Therefore, criminals have forfeited their right
    to privacy.

Computer Matching (continued)
  • Case illustration involving biometrics
  • At Super Bowl XXXV in January 2001, a
    facial-recognition technology was used to scan
    the faces of individuals entering the stadium.
  • The digitized facial images were then instantly
    matched against images contained in a centralized
    database of suspected criminals and terrorists.
  • This practice was, at the time, criticized by
    many civil-liberties proponents.

Data Mining
  • Data mining involves the indirect gathering of
    personal information through an analysis of
    implicit patterns discoverable in data.
  • Data-mining activities can generate new and
    sometimes non-obvious classifications or
    categories.
  • Individuals whose data is mined could become
    identified with or linked to certain newly
    created groups that they might never have
    imagined to exist.

Data Mining (Continued)
  • Current privacy laws offer individuals no
    protection regarding how information about them
    that is acquired through data-mining activities
    is subsequently used.
  • Important decisions can be made about those
    individuals based on the patterns found in the
    mined personal data.
  • So some uses of data-mining technology raise
    special concerns for personal privacy.

Data Mining (Continued)
  • Unlike personal data that resides in explicit
    records in databases, information acquired about
    persons via data mining is often derived from
    implicit patterns in the data.
  • The patterns can suggest "new" facts,
    relationships, or associations about that person,
    such as that person's membership in a newly
    "discovered" category or group.

Data Mining (Continued)
  • Much personal data collected and used in
    data-mining applications is generally considered
    to be neither confidential nor intimate in
    nature.
  • So there is a tendency to presume that such data
    must by default be public data.

Data Mining (Continued)
  • Hypothetical Scenario (Lee):
  • Lee is a 35-year-old junior executive.
  • Lee applies for a car loan.
  • Lee has an impeccable credit history.
  • A data-mining algorithm discovers that Lee
    belongs to a group of individuals likely to start
    their own business and declare bankruptcy.
  • Lee is denied the loan based on data mining.
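The Lee scenario can be sketched as a rule derived from implicit patterns. The records, fields, and the grouping rule below are all hypothetical illustrations of the mechanism, not an actual mining algorithm.

```python
# Minimal sketch of the Lee scenario above: a pattern over ordinary,
# individually innocuous records assigns people to a newly created
# group none of them ever declared membership in. All records and the
# rule itself are hypothetical.
applicants = [
    {"name": "lee", "credit": "impeccable",
     "buys_business_books": True, "recent_job_changes": 2},
    {"name": "kim", "credit": "good",
     "buys_business_books": False, "recent_job_changes": 0},
]

def likely_to_start_business(rows):
    """Derive a non-obvious category from implicit patterns."""
    return [r["name"] for r in rows
            if r["buys_business_books"] and r["recent_job_changes"] >= 2]

risk_group = likely_to_start_business(applicants)
# Lee can now be denied the loan despite an impeccable credit history.
```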

Techniques for Manipulating Personal Data
Data Mining on the Internet
  • Traditionally, data mining is done in large data
    warehouses (off-line).
  • "Intelligent agents" or "softbots" acting on
    behalf of human beings sift through and analyze
    the mounds of data on the Internet.
  • Metasearch engines "crawl" through the Web in
    order to uncover general patterns from
    information retrieved from search-engine requests
    across multiple Web sites.

The Problem of Protecting Privacy in Public
  • Non-Public Personal Information (or NPI) refers
    to sensitive information such as that in one's
    financial and medical records.
  • NPI has some legal protection.
  • Many privacy analysts are now concerned about a
    different kind of personal information: Public
    Personal Information (or PPI).
  • PPI, which is non-confidential and non-intimate
    in character, is also being mined.

  • Why should the collection of PPI, which is
    publicly available information about persons,
    generate controversies involving privacy?
  • It might seem that there is little to worry
    about.
  • For example, suppose someone learns that you are
    a student at Rivier, you frequently attend
    college basketball games, and you are actively
    involved in Rivier's computer science club.
  • In one sense, the information is personal because
    it is about you (as a person), but it is also
    about what you do in the public sphere.

PPI (Continued)
  • In the past, it would have been difficult to make
    a strong case for such legislation protecting
    PPI, because lawmakers and ordinary persons would
    have seen no need to protect that kind of
    personal information.
  • Nissenbaum (1997) believes that our earlier
    assumptions about the need to protect privacy in
    public are no longer tenable because of a
    misleading assumption:
  • There is a realm of public information about
    persons to which no privacy norms apply.

PPI (Continued)
  • Hypothetical Scenario
  • (a) Shopping at Supermart
  • (b) Shopping at
  • These reveal problems of protecting privacy in
    public in an era of information technology and
    data mining.

Search Engines and Personal Information
  • Search facilities can be used to gain personal
    information about individuals (e.g., the Amy
    Boyer example).
  • Your Web activities can be catalogued (Deja News)
    and referenced by search engines.
  • Scenario: using a search engine to locate a
    person.
Accessing Public Records via the Internet
  • What are public records?
  • Why do we have them?
  • Traditionally, they were accessed via hardcopy
    documents that resided in municipal buildings.
  • Recall the Amy Boyer case.
  • Would it have made a difference?

Accessing Public Records via the Internet
  • Some information merchants believe that because
    public records are, by definition, "public," they
    must be made available online.
  • They reason:
  • Public records have always been available to the
    public.
  • Public records have always resided in public
    spaces.
  • The Internet is a public space.
  • Therefore, all public records ought to be made
    available on-line.

Accessing Public Records via the Internet
  • Two Case illustrations
  • State of Oregon (Motor Vehicle Department)
  • Merrimack, NH (tax records for city residents).

Can Technology Be Used to Protect Personal Privacy?
  • Privacy advocates have typically argued for
    stronger privacy laws to protect individuals.
  • Groups representing the e-commerce sector have
    lobbied for voluntary controls and industry
    self-regulation as an alternative to additional
    privacy legislation.
  • Now, some members of each camp support a
    compromise resolution to the on-line privacy
    debate in the form of privacy-enhancing
    technologies (PETs).

  • PETs can be understood as tools that users can
    employ either to
  • (a) protect their personal identity while
    interacting with the Web, or
  • (b) protect the privacy of communications (such
    as e-mail) sent over the Internet.
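One simple technique of type (a) is pseudonymization: replacing a user's identity with a stable alias before data leaves the machine. This is a hedged sketch of the general idea, not a description of any particular PET product; the identity and salt values are hypothetical.

```python
# Sketch of a simple identity-protecting PET: pseudonymization.
# The identity string and salt below are hypothetical.
import hashlib

def pseudonymize(identity: str, salt: str) -> str:
    """Return a stable pseudonym for `identity`. A site-specific salt
    keeps the pseudonym unlinkable across sites that do not share it."""
    digest = hashlib.sha256((salt + identity).encode("utf-8"))
    return digest.hexdigest()[:16]

alias = pseudonymize("lee@example.com", salt="site-specific-secret")
```

Because the salt differs per site, two sites cannot link their records for the same person by comparing pseudonyms.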

PETs (Continued)
  • Three Problems with PETs
  • (1) Educating Users About the Existence of PETs
  • (2) The Principle of Informed Consent
  • (3) Issues of Social Equity.

Educating Users About PETs
  • How are users supposed to find out about PETs?
  • DeCew (1997): there should be a presumption in
    favor of privacy for individuals, who can then
    negotiate how their information is used.
  • With PETs, the default is that users must
    discover their existence and learn how to use
    them.
PETs and the Problem of Informed Consent
  • Users enter into an agreement with Web site
    owners (if they have a privacy policy).
  • They typically have to opt out of having
    information collected. (The default practice is
    that they have opted in, unless they specify
    otherwise.)
  • Policies involving PETs can't guarantee users
    against secondary and future uses of their
    information (e.g., the Toysmart case).

PETs and Social Equity
  • DeCew: the principle of dynamic negotiation.
  • Poorer users have fewer options (and some may
    need to sell their personal information).
  • Two classes: privacy rich and privacy poor.
  • Analogy: poor people in third-world countries
    selling organs for money.

Privacy Legislation and Industry Self-Regulation
  • Can industry regulate privacy without government
    regulation and privacy legislation?
  • The Toysmart case
  • Privacy laws and data-protection principles
  • The EU Directive
  • The US (a patchwork of laws).

Comprehensive Privacy Proposals
  • Clarke argues for a "co-regulatory" model.
  • He believes that a successful on-line-privacy
    policy must include
  • strong legislation
  • a privacy oversight commission
  • industry self-regulation.
  • These must also be accompanied by
    privacy-enhancing technologies.
  • A "privacy watchdog agency" and sanctions are
    also both needed.