1
Agent Technology for e-Commerce
  • Chapter 12 Trust, Security and Legal Issues
  • Maria Fasli
  • http://cswww.essex.ac.uk/staff/mfasli/ATe-Commerce.htm

2
Challenges
  • Despite the excitement and the immense potential
    of software agents, there are serious concerns
    about the associated trust, security and legal
    issues
  • Users need to:
  • Trust that agents do what they say they do
  • Be confident that their privacy is protected and
    that the security risks involved in entrusting
    agents to perform transactions on their behalf
    are minimized
  • Be assured that any legal issues relating to
    agents trading electronically are fully covered,
    as they are in traditional trading practices

3
Perceived risks
  • Agents represent their users in negotiating
    contracts, transactions, etc. and act on their
    behalf; hence there are a number of risks
  • Agents have to interact with other entities apart
    from the user (agents, humans, services), perhaps
    not trustworthy
  • Mobility increases risk: hostile platforms,
    attacks, etc.
  • An agent runs the risk that other entities will
    access, copy, or modify its code and data, either
    by mistake or design
  • Information can be stolen while in transit or in
    storage

4
  • The agent's identity can be hijacked and misused
  • Nonrepudiation
  • An agent may disappear temporarily or permanently,
    causing loss of revenue; valuable results and
    data may be lost
  • Users are uncomfortable with the idea of software
    agents dealing with nonroutine or exceptional
    situations
  • Reputation

5
  • Using agent platforms also presents risks
  • The validity, reliability and trustworthiness of
    an agent's code and data cannot be easily
    determined automatically
  • Malicious agents may succeed in migrating to a
    platform
  • Information on management and access policies may
    be altered
  • Denial of service attacks

6
Trust
  • If there is no risk, the question of trust does
    not arise
  • The act of delegation presupposes trust as it
    allows passing responsibility for a task to
    another entity (agent/human)
  • Personal trust: subjective, formed by an
    individual based on beliefs, observations,
    reasoning, social stereotypes and past
    experiences
  • It develops following positive experiences and is
    reduced otherwise
  • Different dispositions towards trust

7
  • Impersonal trust: derived from information or
    experiences as reported by third parties
  • Trusted third parties
  • Rule-based trust (institutions)
  • Reputation mechanisms
  • Trust in e-commerce
  • Trust in the agent as one's representative
  • Trust in the marketplace infrastructure

8
Trust in agent technology
  • A trusting relationship between user and agent
    must develop
  • There is always reluctance to adopt new
    technologies, especially technologies that carry
    high risks
  • As agents are entrusted with private and
    sensitive information, they need to have built-in
    mechanisms to protect this information
  • Agents as faceless strangers
  • Twice removed from the interface tasks
  • Agents need to have mechanisms to enable them to
    decide who to trust and interact with

9
  • Gradual building of trust: control vs trust
  • An agent's behaviour can be controlled in three
    stages (sketched after this list)
  • Pre-activity
  • Real-time
  • Post-activity
  • As trust increases, control can be relinquished,
    until the user completely trusts the agent
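
A minimal Python sketch of this staged control (all names,
the trust threshold and the update rule are illustrative,
not from the chapter): pre-activity control asks the user
to approve each action, post-activity control adjusts the
trust score, and control is relinquished once trust passes
the threshold; real-time monitoring is omitted for brevity.

class ControlledAgent:
    """Hypothetical wrapper relaxing control as trust grows."""

    def __init__(self, trust_threshold=0.8):
        self.trust = 0.0                  # user's current trust in the agent
        self.trust_threshold = trust_threshold

    def pre_activity_check(self, proposed_action):
        # Pre-activity control: while trust is below the threshold,
        # every proposed action needs explicit user approval.
        if self.trust < self.trust_threshold:
            answer = input(f"Allow the agent to '{proposed_action}'? [y/n] ")
            return answer.strip().lower() == "y"
        return True                       # control relinquished: act autonomously

    def post_activity_review(self, outcome_was_positive):
        # Post-activity control: trust grows after a positive
        # experience and is reduced otherwise.
        if outcome_was_positive:
            self.trust = min(1.0, self.trust + 0.1)
        else:
            self.trust = max(0.0, self.trust - 0.2)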

10
Trust in the marketplace
  • Trust in the protocol, marketplace and other
    participants
  • Trust management and security need to be
    addressed
  • Electronic marketplaces must address how they
    intend to provide trust and security, enforce
    contracts and establish a legal framework
  • Provide safeguards and guarantees against
    breaches of the protocol
  • Impose sanctions on those who deviate from the
    rules

11
Electronic institutions
  • Human societies have dealt with issues of trust
    through developing norms which guide, monitor and
    regulate behaviour
  • Institutions consist of norms and social
    constraints
  • Interaction protocols can be augmented with norms

12
Norms, institutions and organizations
(Figure: the relationship between norms, institutions and
organizations)
13
From norms to institutions
  • Norms that govern electronic institutions can be
    distinguished into:
  • Ontological and communication norms: enable clear
    and unambiguous communication
  • Social interaction norms: dictate interaction
    protocols and describe correct sequences of
    activities
  • Norms that impose restrictions on the behaviour
    of individual agents: normative rules that
    dictate permissible and acceptable behaviour
    within the institution, i.e. describe an agent's
    obligations and rights

14
  • Norms act as deterrents, disincentives or
    preventative measures against unwanted behaviour
  • Norms that indicate prohibitions can be
    translated into regulations
  • Norms that indicate desirable behaviour, whenever
    possible, are translated into restrictions on
    unwanted behaviour
  • Norms that indicate that certain actions can be
    performed under certain conditions can be
    translated into checking that the conditions have
    been satisfied prior to allowing the action

15
  • But there are types of unwanted behaviour that
    cannot be translated into rules which can be
    easily enforced; the institution must react to
    violations of norms
  • For each norm that an institution would like to
    enforce, a rule is required that specifies
    either (see the sketch after this list):
  • The procedure within the institution that
    enforces the norm, or
  • The conditions that constitute violations of the
    norm and the consequences, or sanctions, imposed
    by the institution
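
A minimal Python sketch of this enforcement pattern,
assuming an invented auction norm and context fields: each
norm carries a precondition checked before the action is
allowed (preventative enforcement), plus a violation
condition and the sanction the institution imposes
(reactive enforcement).

from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Norm:
    description: str
    precondition: Callable[[dict], bool]  # checked before the action is allowed
    violation: Callable[[dict], bool]     # checked after the fact
    sanction: str                         # consequence imposed by the institution

# Hypothetical norm of an auction institution.
payment_norm = Norm(
    description="A winning bidder must pay within 24 hours",
    precondition=lambda ctx: ctx["bidder_registered"],
    violation=lambda ctx: ctx["hours_since_award"] > 24 and not ctx["paid"],
    sanction="suspend the agent's trading rights",
)

def allow_action(norm: Norm, context: dict) -> bool:
    # Preventative enforcement: allow the action only if the
    # norm's conditions have been satisfied.
    return norm.precondition(context)

def react_to_violation(norm: Norm, context: dict) -> Optional[str]:
    # Reactive enforcement: detect a violation and return the sanction.
    return norm.sanction if norm.violation(context) else None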

16
Acting within electronic institutions
  • Agents assume roles and commit to adhere to the
    institution's norms, namely its policies
  • Commitments may take the form of contracts
  • Private contracts: drawn between two or more
    agents
  • Social contracts: a commitment on behalf of an
    agent towards the institution

17
  • Agents can be designed to conform to norms
  • They may be able to perceive and reason about the
    norms of each different institution and act
    accordingly
  • They may be designed so that they conform to a
    wide range of principles and not exhibit
    malicious behaviour, without being aware of the
    norms that different institutions impose
  • Inevitably, institutions and norms restrict an
    agent's autonomy

18
Reputation systems
  • Reputation encapsulates the distributed knowledge
    of a set of entities about another and is used to
    predict future behaviour
  • Reputation systems attempt to create the 'shadow
    of the future'
  • A reputation system collects, aggregates and
    distributes information about the participants'
    past behaviour
  • Example: eBay's feedback forum

19
  • For a reputation system to operate effectively:
  • Participants must be long-lived entities
  • The cost of submitting and distributing feedback
    must be low
  • Feedback information must be aggregated and
    presented in a way that enables and guides
    trusting decisions
  • Clear guidelines on how the rating system
    operates and how conflicts are resolved
  • The reputation system itself must be reputable
    and trustworthy

20
Issues in reputation systems
  • Eliciting feedback
  • Users may be reluctant to provide feedback
  • Honest reporting is difficult to ensure
  • Unfair ratings
  • Unfairly high ratings (ballot stuffing)
  • Unfairly low ratings (bad-mouthing)
  • Sellers' discriminatory behaviour towards buyers
  • Negative discrimination
  • Positive discrimination
  • Difficult to elicit negative feedback

21
  • Aggregating feedback
  • How is feedback aggregated so that it is useful
    to participants? (see the sketch after this list)
  • Simple numerical ratings usually fail to convey
    important information about the reported
    transactions
  • Did the feedback come from low- or high-value
    transactions?
  • Were the evaluators themselves trustworthy?
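
One hedged way to address both questions is to weight each
rating by the value of the transaction it came from and by
the rater's own trustworthiness. The Python sketch below
uses invented report fields and an illustrative weighting
scheme, not any particular deployed system.

def aggregate_reputation(reports):
    """reports: iterable of dicts with 'rating' in [0, 1],
    'transaction_value' (> 0) and 'rater_trust' in [0, 1]."""
    weighted_sum = 0.0
    total_weight = 0.0
    for r in reports:
        # Ratings from high-value transactions and trustworthy
        # evaluators count for more.
        weight = r["transaction_value"] * r["rater_trust"]
        weighted_sum += weight * r["rating"]
        total_weight += weight
    return weighted_sum / total_weight if total_weight else None

reports = [
    {"rating": 1.0, "transaction_value": 10.0,  "rater_trust": 0.9},  # small sale
    {"rating": 0.2, "transaction_value": 500.0, "rater_trust": 0.8},  # large sale
]
print(aggregate_reputation(reports))  # dominated by the high-value report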

22
  • Distributing feedback
  • Who has access to the reputation ratings?
  • Problem with the portability of ratings
  • Name changes (or the use of pseudonyms) present a
    problem
  • Ratings and reports can be easily falsified in
    centralized reputation systems, motivating the
    use of decentralized systems

23
Security
  • Security encompasses mechanisms to ensure:
  • Confidentiality
  • Integrity
  • Authentication
  • Access control
  • Availability
  • Nonrepudiation
  • Logging and auditing

24
Cryptography
  • Modern-day cryptography: the science of
    information security
  • Cryptology: finding ways to encrypt a piece of
    information into a ciphertext in a secure way
  • Cryptanalysis: discovering either the plaintext,
    the algorithm for encrypting it, or the secret
    key from the ciphertext
  • Cryptography is used in communications, digital
    signatures, electronic voting, digital cash
  • Cryptosystem: a package of protocols and
    cryptographic algorithms including the
    instructions for encoding and decoding messages

25
Symmetric cryptosystems
  • Rely on the use of a secret key to encrypt and
    decrypt messages
  • A system remains secure provided that:
  • The encryption algorithm is hard to break
  • The key is kept secret
  • An encryption scheme is said to be
    computationally secure if the ciphertext
    generated meets one or both of the following
    conditions:
  • The cost of breaking the cipher exceeds the value
    of the encrypted information
  • The time required to break the cipher exceeds the
    useful lifetime of the information
  • The larger the key, the better its ability to
    withstand an exhaustive key search (brute-force)
    attack (see the sketch below)
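
As a minimal illustration (not from the chapter), symmetric
encryption and decryption with a shared secret key using
the Fernet recipe of the third-party Python 'cryptography'
package (assumed installed via pip install cryptography).
Note that both parties must already hold the same key,
which is exactly the distribution problem discussed on the
next slide.

from cryptography.fernet import Fernet

key = Fernet.generate_key()   # the shared secret key
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"accept bids up to 100 EUR")
plaintext = cipher.decrypt(ciphertext)

assert plaintext == b"accept bids up to 100 EUR"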

26
  • How do you transport the secret key from the
    sender to the recipient in a secure way?
  • A key could be selected by one party and then
    physically delivered to the other one
  • A trusted third party could select the key and
    physically deliver it to the other two interested
    parties
  • If the two parties each have an encrypted
    connection to a trusted third party, then that
    party could act as the intermediary
  • Use a public key cryptosystem

27
Asymmetric cryptosystems
  • Also known as public key cryptosystems
  • No prior access to a secret key is required
  • Depends on mathematical one-way functions:
    computations are easy to do but very hard to
    reverse, e.g. factoring
  • Two separate keys
  • Private key: kept secret
  • Public key: widely distributed

28
  • Each entity generates a pair of keys
  • One of the two keys is placed in a public
    registry or is sent to the other party; this now
    becomes the public key
  • The second key remains private
  • If B wants to communicate a message to A, it uses
    A's public key to encrypt the message and sends
    it over. A is the only one who can decrypt the
    message, as only A holds the private key (see the
    sketch below)
  • Depending on the application, the sender uses
    either its own private key, the recipient's, or
    both, to perform some type of cryptographic
    function
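
A minimal sketch of this exchange using RSA with OAEP
padding from the third-party Python 'cryptography' package
(assumed installed); the key size and message are purely
illustrative. A generates the key pair and publishes the
public key; B encrypts with it, and only A can decrypt.

from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# A's side: generate the key pair; the public half goes in a registry.
a_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
a_public_key = a_private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# B's side: encrypt the message with A's public key.
ciphertext = a_public_key.encrypt(b"deliver goods to warehouse 7", oaep)

# A's side: only the holder of the private key can decrypt.
assert a_private_key.decrypt(ciphertext, oaep) == b"deliver goods to warehouse 7"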

29
  • Applications
  • Encryption/decryption
  • Digital signatures
  • Key exchange
  • Cryptography for confidentiality and privacy
  • Cryptography for authentication, data integrity
    and nonrepudiation
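
Continuing the sketch above, a digital signature example
with RSA-PSS from the same 'cryptography' package: the
sender signs with its own private key and anyone holding
the corresponding public key can verify, which supports
authentication, data integrity and nonrepudiation.

from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

sender_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

message = b"I agree to pay 100 EUR for item 42"
signature = sender_key.sign(message, pss, hashes.SHA256())

# Verification with the sender's public key; tampering with the
# message or the signature raises InvalidSignature.
try:
    sender_key.public_key().verify(signature, message, pss, hashes.SHA256())
    print("signature valid")
except InvalidSignature:
    print("signature invalid")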

30
Agents and privacy
  • Privacy denotes a condition or state in which a
    natural or legal person is more or less
    inaccessible to others, on the physical,
    psychological or informational plane
  • Here, privacy denotes a state of limited
    accessibility on the informational plane
  • Agent technology plays a mixed role
  • Agents can render their users more vulnerable to
    loss of privacy
  • They can also be used as a means to safeguard
    privacy

31
Anonymity
  • Anonymity offers a form of privacy and is
    characterized by the fact that other parties do
    not know one's identity
  • Anonymity may be desirable in certain situations
  • Types of anonymity
  • Traceable anonymity
  • Untraceable anonymity
  • Untraceable pseudonymity
  • Traceable pseudonymity
  • Successful anonymization may be difficult to
    achieve and usually relies on third parties

32
  • Techniques that facilitate anonymity may
    incidentally facilitate and support illegal and
    criminal activities, and even encourage dishonest
    and antisocial behaviour

33
Protecting privacy
  • Protecting sensitive, confidential and private
    information is imperative
  • A number of approaches
  • Place only minimal confidential information in an
    agent
  • Use cryptographic techniques
  • Use secure protocols when interacting with others
  • Provide the agent with appropriate protective
    strategies
  • Use access control policies to restrict an
    agent's access to resources and information while
    at a host (see the sketch below)
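
A minimal sketch of such a host-side access control policy
(roles, operations and resource names are invented for
illustration): each role maps to the resources it may read
or write, and every request from a visiting agent is
checked against the policy before access is granted.

POLICY = {
    "buyer_agent":  {"read": {"catalogue", "own_orders"},
                     "write": {"own_orders"}},
    "seller_agent": {"read": {"catalogue", "incoming_orders"},
                     "write": {"catalogue"}},
}

def is_allowed(role, operation, resource):
    # Default-deny: anything not explicitly granted is refused.
    return resource in POLICY.get(role, {}).get(operation, set())

# A visiting buyer agent may read the catalogue but not
# other agents' incoming orders.
assert is_allowed("buyer_agent", "read", "catalogue")
assert not is_allowed("buyer_agent", "read", "incoming_orders")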

34
  • Place no confidential information in the agent
  • Use anonymizer servers that may help prevent
    traffic analysis of the user's actions and
    requests (and those of mobile agents)
  • Use a new agent for each task
  • Fit agents with privacy enhancing technologies
    such as an identity protector

35
Agents and the law
  • Agents in e-commerce engage in a number of
    activities that are significant from a legal
    perspective
  • They access computer systems, networks and data
  • They retrieve and distribute information
  • They mediate personal and business relations
  • They negotiate for, buy and sell goods and
    services
  • The use of agents that conduct business on behalf
    of natural or legal persons raises issues with
    regard to accountability and liability

36
Interested parties
  • Parties that are involved in the use of agents
    and whose interests are affected by any legal
    considerations
  • Agent designer/developer
  • Agent supplier or provider
  • User
  • Third parties

37
Issues
  • Providing a legal framework for guiding and
    regulating agent-based interactions and exchanges
    is important to the success and wider adoption of
    the technology
  • Who can be held liable and on what grounds?
  • Tort or wrongful acts
  • Privacy and data protection
  • Intellectual property rights
  • Product liability
  • Contractual liabilities
  • Criminal responsibility
  • The international nature of transactions also
    poses a problem

38
Current legal frameworks
  • United States
  • UCITA, UETA and E-sign
  • Canada
  • UECA
  • Agents do not feature in any European initiatives
    or current legislation

39
Agents as legal persons
  • Legal systems recognize two kinds of persons:
    natural and legal
  • Contracts are formed between these types of
    persons
  • But agents do not have a legal status; this
    poses a problem
  • Can agents close contracts?
  • Only legal or natural persons can close contracts
    (they need to have will and understand
    intentions)
  • To act as one's representative an agent needs to
    have contractual capability
  • Unless the will or intention to perform an action
    is distinguished from the action itself

40
Software agents as e-persons
  • Can they be considered legal personas, i.e.
    e-persons?
  • This would solve some problems regarding the
    validity of the declarations and contracts
  • But
  • First we would have to answer what constitutes an
    agent
  • Agents must have a residence or domicile (their
    users?)
  • Agent registers could be set up and agents could
    have a unique ID; this raises some privacy
    issues: who has access to the registry?
  • Their users could also grant them a patrimony
  • Their operations could be covered by insurance
    policies