Software Agents - PowerPoint PPT Presentation

Title: Software Agents. Author: Chris Brooks. Created: January 22, 2003. Format: On-screen Show. Slides: 39. Learn more at: https://www.cs.usfca.edu

Transcript and Presenter's Notes

1
Software Agents
  • CS 486
  • January 29, 2003

2
What is an agent?
  • "Agent" is one of the more ubiquitous buzzwords
    in computer science today.
  • It's used for almost any piece of software.
  • "I know an agent when I see one" (and the
    paperclip is not one).

3
Examples
  • News-filtering agents
  • Shopbots/price comparison agents
  • Bidding agents
  • Recommender agents
  • Personal Assistants
  • Middle agents/brokers
  • Etc.

4
Real-world agents
  • Secret Agents
  • Travel Agents
  • Real Estate Agents
  • Sports/Showbiz Agents
  • Purchasing Agents
  • What do these jobs have in common?

5
What is an agent?
  • An agent is a program with:
  • Sensors (inputs)
  • Effectors (outputs)
  • An environment
  • The ability to map inputs to outputs
  • But then what program isn't an agent?
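The sensors/effectors/mapping definition above can be sketched in a few lines of Python. The thermostat domain and every name below are illustrative assumptions, not from the slides:

```python
# A minimal sketch of the definition above: an agent with a sensor
# (the temperature reading it is given), effectors (the actions it
# returns), and a mapping from inputs to outputs. The thermostat
# domain and all names here are illustrative, not from the slides.

class ThermostatAgent:
    """Maps a sensed temperature (input) to an action (output)."""

    def __init__(self, target):
        self.target = target  # the agent's goal temperature

    def act(self, sensed_temp):
        # The input-to-output mapping.
        if sensed_temp < self.target - 1:
            return "heat_on"
        if sensed_temp > self.target + 1:
            return "heat_off"
        return "no_op"

agent = ThermostatAgent(target=20)
print(agent.act(17))  # heat_on
print(agent.act(22))  # heat_off
```

Which illustrates the next slide's worry: by this definition alone, almost any input/output program qualifies.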

6
What is an agent?
  • Can perform domain-oriented reasoning.
  • Domain-oriented: the program has specific
    knowledge about a particular area.
  • Not completely general.
  • Reasoning: what does this mean?
  • Does the program have to plan ahead?
  • Can it be reactive?
  • Must it be declarative?

7
What is an agent?
  • Agents must be able to:
  • Communicate
  • Negotiate
  • But what do these terms mean? Language?
  • Are pictures and GUIs communication?
  • How sophisticated is negotiation?
  • Communication should degrade gracefully.

8
What is an agent?
  • Lives in a complex, dynamic environment
  • Getting at the notion of a complicated problem
  • Has a set of goals
  • An agent must have something it intends to do
    (we'll return to this idea).
  • Persistent state
  • Distinguishes agents from subroutines, servlets,
    etc.

9
What is an agent?
  • Autonomy/Autonomous execution
  • Webster's:
  • Autonomy: the quality or state of being
    self-governing.
  • More generally, being able to make decisions
    without direct guidance.
  • Authority, responsibility

10
Autonomy
  • Autonomy is typically limited or restricted to a
    particular area.
  • Locus of decision making
  • Within a prescribed range, an agent is able to
    decide for itself what to do.
  • "Find me a flight from SF to NYC on Monday."
  • Note that I didn't say what to optimize; I'm
    allowing the agent to make tradeoffs.

11
What is an agent?
  • Not black and white
  • Like "object", it's more a useful
    characterization than a strict category.
  • It makes sense to refer to something as an agent
    if it helps the designer to understand it.
  • Some general characteristics
  • Autonomous, goal-oriented, flexible, adaptive,
    communicative, self-starting

12
Objects vs. Agents
  • So how are agents different from objects?
  • Objects: passive, noun-oriented, receivers of
    action.
  • Agents: active, task-oriented, able to take
    action without receiving a message.

13
Examples of agent technology, revisited.
  • Ebay bidding agents
  • Very simple: they watch an auction and increment
    the price for you.
  • Shopping agents (Dealtime, evenBetter)
  • Take a description of an item and search shopping
    sites.
  • Are these agents?
  • Recommender systems (Firefly, Amazon, Launch,
    Netflix)
  • Users rate some movies/music/things, and the
    agent suggests things they might like.
  • Are these agents?

14
More examples of agents
  • Brokering
  • Constructing links between
  • Merchants
  • Certificate Authorities
  • Customers
  • Agents
  • Auction agents
  • Negotiate payment and terms.
  • Conversational/NPC agents (Julia)
  • Remote Agent (NASA)

15
The Intentional Stance
  • We often speak of programs as if they are
    intelligent, sentient beings:
  • "The compiler can't find the linker."
  • "The database wants the schema to be in a
    different format."
  • "My program doesn't like that input. It expects
    the last name first."
  • Treating a program as if it is intelligent is
    called the intentional stance.
  • It doesn't matter whether the program really is
    intelligent; it's helpful to us as programmers to
    think as if it is.

16
The Knowledge Level
  • The intentional stance leads us to program agents
    at the knowledge level (Newell).
  • Reasoning about programs in terms of
  • Facts
  • Goals
  • Desires/needs/wants/preferences
  • Beliefs
  • This is often referred to as declarative
    programming.
  • We can think of this as an abstraction, just like
    object-oriented programming.
  • Agent-oriented programming

17
Example
  • Consider an agent that will find books that I'm
    interested in.
  • States: a declarative representation of outcomes.
  • hasBook(Moby Dick)
  • Facts: categories of books, bookseller websites,
    etc.
  • Preferences: a ranking over states.
  • hasBook(Neuromancer) > hasBook(Moby Dick)
  • hasBook(b1) ∧ category(b1) = SciFi >
  • hasBook(b2) ∧ category(b2) = Mystery

18
Example
  • Goals: find a book that satisfies my preferences.
  • Take actions that improve the world state.
  • Beliefs: used to deal with uncertainty.
  • May(likes(Chris, Harry Potter))
  • Prob(likes(Chris, Harry Potter)) = 0.10
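The book agent's knowledge-level state on these two slides can be sketched directly. The dict/tuple encoding below, and the "Classic" category, are illustrative assumptions:

```python
# A sketch of the book agent's knowledge-level state from the last two
# slides: facts, a preference ranking over outcomes, and probabilistic
# beliefs. The dict/tuple encoding is an illustrative assumption.

facts = {"Neuromancer": "SciFi", "Moby Dick": "Classic"}

# Preferences as a ranking over categories: earlier means preferred.
preference_order = ["SciFi", "Classic", "Mystery"]

def prefers(book_a, book_b):
    """True if hasBook(book_a) is ranked above hasBook(book_b)."""
    return (preference_order.index(facts[book_a])
            < preference_order.index(facts[book_b]))

# Beliefs with attached probabilities,
# e.g. Prob(likes(Chris, Harry Potter)) = 0.10
beliefs = {("likes", "Chris", "Harry Potter"): 0.10}

print(prefers("Neuromancer", "Moby Dick"))  # True
```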

19
Rational Machines
  • How do we determine the right thing for an agent
    to do?
  • If the agent's internal state can be described at
    the knowledge level, we can describe the
    relationship between its knowledge and its goals.
  • Newell's Principle of Rationality:
  • If an agent has the knowledge that an action will
    lead to the accomplishment of one of its goals,
    then it will select that action.

20
Preferences and Utility
  • Agents will typically have preferences
  • This is declarative knowledge about the relative
    value of different states of the world.
  • I prefer ice cream to spinach.
  • Often, the value of an outcome can be quantified
    (perhaps in monetary terms).
  • This allows the agent to compare the utility (or
    expected utility) of different actions.
  • A rational agent is one that maximizes expected
    utility.

21
Example
  • Again, consider our book agent.
  • If I can tell it how much value I place on
    different books, it can use this to decide what
    actions to take.
  • prefer(SciFi, Fantasy)
  • prefer(SciFi, Mystery)
  • like(Fantasy), like(Mystery)
  • like(Book) ∧ not_buying(OtherBook) ->
    buy(Book).
  • How do we choose whether to buy Fantasy or
    Mystery?

22
Example
  • If my agent knows the value I assign to each
    book, it can pick the one that will maximize my
    utility (value - price).
  • V(fantasy) = 10, p(fantasy) = 7
  • V(mystery) = 6, p(mystery) = 4
  • Buy fantasy.
  • V(fantasy) = 10, p(fantasy) = 7
  • V(mystery) = 6, p(mystery) = 1
  • Buy mystery.
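The decision rule on this slide is a one-liner. A sketch, with the dict encoding as an illustrative assumption:

```python
# A sketch of the decision rule on this slide: buy the book that
# maximizes utility = value - price. The dict encoding is illustrative.

def best_buy(books):
    """books maps name -> (value, price); returns the utility-maximizing name."""
    return max(books, key=lambda b: books[b][0] - books[b][1])

# First case from the slide: fantasy nets 10 - 7 = 3, mystery 6 - 4 = 2.
print(best_buy({"fantasy": (10, 7), "mystery": (6, 4)}))  # fantasy
# Second case: mystery's price drops to 1, so it nets 5 and wins.
print(best_buy({"fantasy": (10, 7), "mystery": (6, 1)}))  # mystery
```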

23
Utility example
  • Game costs $1 to play.
  • Choose Red, win $2.
  • Choose Black, win $3.
  • A utility-maximizing agent will pick Black.
  • Game costs $1 to play.
  • Choose Red, win 50 cents.
  • Choose Black, win 25 cents.
  • A utility-maximizing agent will choose not to
    play. (If it must play, it picks Red.)

24
Utility example
  • But actions are rarely certain.
  • Game costs $1.
  • Red: win $1.
  • Black: 30% chance of winning $10, 70% chance of
    winning $0.
  • A risk-neutral agent will pick Black.
  • What if the amounts are in millions?
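The risk-neutral choice on this slide can be checked numerically. A sketch; representing a game as a list of (probability, payoff) pairs is an assumption of the example:

```python
# Expected utility of the uncertain game above, for a risk-neutral
# agent. A game is a list of (probability, payoff) pairs; the $1 entry
# fee is subtracted. The representation is an illustrative assumption.

def expected_utility(outcomes, cost=1.0):
    return sum(p * payoff for p, payoff in outcomes) - cost

red = expected_utility([(1.0, 1.0)])                 # 1.0 - 1 = 0.0
black = expected_utility([(0.3, 10.0), (0.7, 0.0)])  # 3.0 - 1 = 2.0
print("black" if black > red else "red")  # black
```

A risk-averse agent (e.g. with the amounts in millions) would apply a concave utility function to the payoffs before averaging, which can flip the choice to Red.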

25
Rationality as a Design Principle
  • Provides an abstract description of behavior.
  • Declarative: avoids specifying how a decision is
    reached.
  • Leaves flexibility:
  • Doesn't enumerate inputs and outputs.
  • Scales; allows for diverse environments.
  • Doesn't specify failure or unsafe states.
  • Leads to accomplishment of goals, not avoidance
    of failure.

26
Agents in open systems
  • Open systems are those in which no one implements
    all the participants.
  • E.g. the Internet.
  • System designers construct a protocol:
  • Anyone who follows this protocol can participate
    (e.g. HTTP, TCP).
  • How to build a protocol that leads to desirable
    behavior?
  • What is desirable?

27
Protocol design
  • By treating participants as rational agents, we
    can exploit techniques from game theory and
    economics.
  • Assume everyone will act to maximize their own
    payoff: how do we change the rules of the game
    so that this behavior leads to a desired
    outcome?
  • We'll return to this idea when we talk about
    auctions.

28
Agents and Mechanisms
  • System designers can treat external programs as
    if they were rational agents.
  • That is, treat external programs as if they have
    their own beliefs, goals and agenda to achieve.
  • For example, an auction can treat bidding agents
    as if they are actually trying to maximize their
    own profit.

29
Agents and Mechanisms
  • In many cases, a system designer cannot directly
    control agent behavior.
  • In an auction, the auctioneer can't tell people
    what to bid.
  • The auctioneer can control the mechanism:
  • The rules of the game.
  • Design goal: construct mechanisms that lead to
    self-interested agents doing the right thing.

30
Mechanism Example
  • Imagine a communication network G with two
    special nodes x and y.
  • Edges between nodes are agents that can forward
    messages.
  • Each agent has a private cost t to pass a message
    along its edge.
  • If agents reveal their t's truthfully, we can
    compute the shortest path between x and y.
  • How do we get a self-interested agent to reveal
    its t?

31
Solution
  • Each agent reveals a t, and the shortest path is
    computed.
  • Costs are accumulated.
  • If an agent is not on the path, it is paid 0.
  • If an agent is on the path, it is paid the cost of
    the shortest path that doesn't include it, minus
    the cost of the path without its own t:
  • P = g_next - (g_best - t)
  • For example, if I bid 10 and am on a path with
    cost 40, and the best solution without me is 60,
    I get paid 60 - (40 - 10) = 30.
  • The agent is compensated for its contribution to
    the solution.

32
Analysis
  • If an agent lies:
  • Was on the shortest path, still on the shortest
    path:
  • The payment is no higher, so there is no benefit
    to lying.
  • Was on the shortest path, now not on the shortest
    path:
  • This means the overbid exceeded g_next - g_best.
  • But I would rather get a positive amount than 0!
  • Not on the shortest path, but now is:
  • Underbidding leads to being paid at the lower
    amount, but still incurring the higher cost.
  • Truth would be better!

33
Example
  • Cost = 2, SP = 5, NextSP = 8.
  • My payment if I bid truthfully:
  • 8 - (5 - 2) = 5. Net = 5 - 2 = 3.
  • If I misreport but stay on the path, I get the
    same utility: e.g. if I bid 3, the reported path
    cost rises to 6, I get 8 - (6 - 3) = 5, and my
    net is still 5 - 2 = 3.
  • If I overbid enough to drop off the path, I get 0.
  • Therefore, truth-telling is a dominant strategy.
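The payment rule from the preceding slides can be sketched and checked against the numbers on this slide. The function and argument names are illustrative:

```python
# A sketch of the payment rule from the preceding slides:
# P = g_next - (g_best - t), where g_best is the shortest-path cost
# computed from the reported costs and g_next is the cost of the best
# path avoiding this agent's edge. Numbers match the slide's example.

def payment(bid, g_best, g_next):
    """Payment to an on-path agent that reported cost `bid`."""
    return g_next - (g_best - bid)

# Truthful report: true cost 2, SP = 5, NextSP = 8.
print(payment(2, 5, 8))  # 5, so net utility = 5 - 2 = 3

# Overbid to 3: the reported path cost rises to 6, but the payment
# (and hence the net utility 5 - 2 = 3) is unchanged.
print(payment(3, 6, 8))  # 5
```

The design choice this illustrates: because g_best includes the agent's own bid, the bid cancels out of the payment while the agent stays on the path, which is why truth-telling is dominant.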

34
Adaptation and Learning
  • Often, it's not possible to program an agent to
    deal with every contingency:
  • Uncertainty
  • Changing domain
  • Too complicated
  • Agents often need to adapt to changing
    environments or learn more about their
    environment.

35
Adaptation
  • Adaptation involves changing an agent's
    model/behavior in response to a perceived change
    in the world.
  • Reactive:
  • Agents don't anticipate the future; they just
    update.

36
Learning
  • Learning involves constructing and updating a
    hypothesis.
  • An agent typically tries to build and improve
    some representation of the world.
  • Proactive:
  • Tries to anticipate the future.
  • Most agents will use both learning and adaptation.

37
Agents in e-commerce
  • Agents play a fairly limited role in today's
    e-commerce.
  • Most are still in research labs.
  • Large potential both in B2C and B2B
  • Assisting in personalization
  • Automating payment

38
Challenges for agents
  • Uniform protocol/language
  • Web services? XML?
  • Lightweight, simple to use, robust.
  • Always a challenge
  • Critical mass
  • Enough people need to adopt
  • Killer app
  • What will the e-mail/IM of agents be?