Introduction to Artificial Intelligence Local Search and Constraint Satisfaction - PowerPoint PPT Presentation

1
Introduction to Artificial Intelligence: Local Search and Constraint Satisfaction
  • Henry Kautz

2
Local Search in Continuous Spaces
(Figure: gradient of f; take a negative step along the gradient to minimize f, a positive step to maximize f)
3
Local Search in Discrete State Spaces
  • state ← choose_start_state()
  • while ! GoalTest(state) do
  • state ← arg min { h(s) : s ∈ Neighbors(state) }
  • end
  • return state
  • Terminology
  • "neighbors" instead of "children"
  • heuristic h(s) is the "objective function"; it need not be admissible
  • No guarantee of finding a solution
  • sometimes a probabilistic guarantee
  • Best for goal-finding, not path-finding
  • Many variations
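The loop above can be written out directly. A minimal sketch; the argument names (`goal_test`, `neighbors`, `h`) and the toy example are mine, not from the slides:

```python
def greedy_local_search(start, goal_test, neighbors, h, max_steps=1000):
    """Greedy descent as in the pseudocode: repeatedly jump to the
    neighbor with the lowest objective value h. May return a non-goal
    state if it stalls in a local minimum or runs out of steps."""
    state = start
    for _ in range(max_steps):
        if goal_test(state):
            return state
        state = min(neighbors(state), key=h)  # arg min over Neighbors(state)
    return state

# Toy illustration: walk an integer toward 0 by minimizing h(x) = x^2.
result = greedy_local_search(
    start=7,
    goal_test=lambda x: x == 0,
    neighbors=lambda x: [x - 1, x + 1],
    h=lambda x: x * x,
)
```

Note that the state, neighborhood, and objective are all supplied by the caller; the search itself is problem-independent.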

4
Local Search versus Systematic Search
  • Systematic Search
  • BFS, DFS, IDS, Best-First, A*
  • Keeps some history of visited nodes
  • Always complete for finite search spaces; some versions complete for infinite spaces
  • Good for building up solutions incrementally
  • State = partial solution
  • Action = extend the partial solution

5
Local Search versus Systematic Search
  • Local Search
  • Gradient descent, greedy local search, simulated annealing, genetic algorithms
  • Does not keep a history of visited nodes
  • Not complete; may be able to argue it terminates with high probability
  • Good for iterative repair of candidate solutions
  • State = complete candidate solution that may not satisfy all constraints
  • Action = make a small change to the candidate solution

6
N-Queens Problem
7
N-Queens Systematic Search
  • state ← choose_start_state()
  • add state to Fringe
  • while ! GoalTest(state) do
  • choose state from Fringe according to h(state)
  • Fringe ← Fringe ∪ Children(state)
  • end
  • return state
  • start: empty board
  • GoalTest: N queens are on the board
  • h = N − (number of queens on the board)
  • children: all ways of adding one queen without creating any attacks

8
N-Queens Local Search, V1
  • state ← choose_start_state()
  • while ! GoalTest(state) do
  • state ← arg min { h(s) : s ∈ Neighbors(state) }
  • end
  • return state
  • start: put down N queens randomly
  • GoalTest: board has no attacking pairs
  • h = number of attacking pairs
  • neighbors: move one queen to a different square on the board
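A sketch of V1 in Python. One simplifying assumption is mine, not the slide's: each column holds exactly one queen, so a neighbor moves one queen within its own column.

```python
import random

def attacking_pairs(rows):
    """h: number of attacking queen pairs (same row or same diagonal).
    rows[c] = row of the queen in column c, one queen per column."""
    n, count = len(rows), 0
    for c1 in range(n):
        for c2 in range(c1 + 1, n):
            if rows[c1] == rows[c2] or abs(rows[c1] - rows[c2]) == c2 - c1:
                count += 1
    return count

def nqueens_greedy(n, max_steps=200):
    """V1-style search: start randomly, then repeatedly make the single
    queen move that minimizes h. Returns (state, remaining conflicts);
    greedy search can stall at a local minimum, so h may be nonzero."""
    rows = [random.randrange(n) for _ in range(n)]
    for _ in range(max_steps):
        if attacking_pairs(rows) == 0:
            break
        # Neighbors: move one queen to a different row in its column.
        best = min(
            ((c, r) for c in range(n) for r in range(n) if r != rows[c]),
            key=lambda cr: attacking_pairs(rows[:cr[0]] + [cr[1]] + rows[cr[0] + 1:]),
        )
        rows[best[0]] = best[1]
    return rows, attacking_pairs(rows)

solution, conflicts = nqueens_greedy(6)
```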

9
N-Queens Local Search, V2
  • state ← choose_start_state()
  • while ! GoalTest(state) do
  • state ← arg min { h(s) : s ∈ Neighbors(state) }
  • end
  • return state
  • start: put a queen on each square with 50% probability
  • GoalTest: board has N queens and no attacking pairs
  • h = (number of attacking pairs) + max(0, N − number of queens)
  • neighbors: add or delete one queen

10
N Queens Demo
11
States Where Greedy Search Must Succeed
(Figure: objective function)
12
States Where Greedy Search Might Succeed
(Figure: objective function)
13
Local Search Landscape
(Figure: objective function)
14
Variations of Greedy Search
  • Where to start?
  • RANDOM STATE
  • PRETTY GOOD STATE
  • What to do when a local minimum is reached?
  • STOP
  • KEEP GOING
  • Which neighbor to move to?
  • BEST neighbor
  • Any BETTER neighbor (Hill Climbing)
  • How to make local search more robust?

15
Restarts
  • for run ← 1 to max_runs do
  • state ← choose_start_state()
  • flips ← 0
  • while ! GoalTest(state) ∧ flips < max_flips do
  • state ← arg min { h(s) : s ∈ Neighbors(state) }
  • flips ← flips + 1
  • end
  • if GoalTest(state) return state
  • end
  • return FAIL
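The restart wrapper above maps directly onto code. A minimal sketch with argument names of my own choosing; the toy example just finds 0 from a random integer start:

```python
import random

def restarting_search(choose_start, goal_test, neighbors, h,
                      max_runs=20, max_flips=100):
    """Greedy descent wrapped in random restarts, as in the pseudocode:
    each run does at most max_flips greedy moves, and failed runs are
    simply thrown away and restarted from a fresh random state."""
    for _ in range(max_runs):
        state = choose_start()
        for _ in range(max_flips):
            if goal_test(state):
                return state
            state = min(neighbors(state), key=h)
        if goal_test(state):
            return state
    return None  # FAIL

random.seed(0)
found = restarting_search(
    choose_start=lambda: random.randrange(-50, 50),
    goal_test=lambda x: x == 0,
    neighbors=lambda x: [x - 1, x + 1],
    h=lambda x: abs(x),
)
```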

16
Uphill Moves: Random Noise
  • state ← choose_start_state()
  • while ! GoalTest(state) do
  • with probability noise do
  • state ← random member of Neighbors(state)
  • else
  • state ← arg min { h(s) : s ∈ Neighbors(state) }
  • end
  • end
  • return state

17
Uphill Moves: Simulated Annealing (Constant Temperature)
  • state ← start
  • while ! GoalTest(state) do
  • next ← random member of Neighbors(state)
  • deltaE ← h(next) − h(state)
  • if deltaE ≤ 0 then
  • state ← next
  • else
  • with probability e^(−deltaE / temperature) do
  • state ← next
  • end
  • endif
  • end
  • return state

The book reverses the sign of deltaE because it searches for a maximum-h state.
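The constant-temperature loop above, sketched in Python for the minimizing-h convention of these slides. The toy objective (a barrier on the way down to 0) is my own illustration of why uphill moves help:

```python
import math
import random

def simulated_annealing(start, goal_test, neighbors, h,
                        temperature=1.0, max_steps=100000):
    """Constant-temperature annealing, MINIMIZING h: always accept
    downhill moves, accept uphill moves with probability
    e^(-deltaE / temperature), mirroring the pseudocode above."""
    state = start
    for _ in range(max_steps):
        if goal_test(state):
            return state
        nxt = random.choice(neighbors(state))
        delta_e = h(nxt) - h(state)
        if delta_e <= 0 or random.random() < math.exp(-delta_e / temperature):
            state = nxt
    return state

random.seed(1)
# Toy example: |x| with an extra barrier on [2, 4] creates a local
# minimum at x = 5 that greedy descent could never escape.
bumpy = lambda x: abs(x) + (3 if 2 <= x <= 4 else 0)
result = simulated_annealing(5, lambda x: x == 0,
                             lambda x: [x - 1, x + 1], bumpy, temperature=2.0)
```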
18
(No Transcript)
19
Uphill Moves: Simulated Annealing (Geometric Cooling Schedule)
  • temperature ← start_temperature
  • state ← choose_start_state()
  • while ! GoalTest(state) do
  • next ← random member of Neighbors(state)
  • deltaE ← h(next) − h(state)
  • if deltaE ≤ 0 then
  • state ← next
  • else
  • with probability e^(−deltaE / temperature) do
  • state ← next
  • end
  • endif
  • temperature ← cooling_rate × temperature
  • end
  • return state

20
Simulated Annealing
  • For any finite problem with a fully connected state space, simulated annealing provably converges to the optimum as the length of the cooling schedule increases
  • But the formal bound requires exponential search time
  • In many practical applications, problems can be solved with a faster, non-guaranteed schedule

21
Other Local Search Strategies
  • Tabu Search
  • Keep a history of the last K visited states
  • Revisiting a state on the history list is tabu
  • Genetic algorithms
  • Population = set of K search points
  • Neighborhood = population ∪ mutations ∪ crossovers
  • Mutation = random change in a state
  • Crossover = random mix of assignments from two states
  • Typically only a portion of the neighborhood is generated
  • Search step: new population = K best members of the neighborhood

22
Myopic Local Search
  • The local search methods we have discussed so far are myopic: they only look at the immediate neighborhood of a single state at any one time
  • Simple parallelism: run many searches in parallel with different random seeds
  • Prob(Success) = 1 − Prob(Run Fails)^k
  • E.g. Prob(Run Fails) = 90%, k = 10 ⇒ Prob(Success) ≈ 65%
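The independent-runs calculation on this slide is a one-liner: if a single run fails with probability p, then k independent runs all fail with probability p^k.

```python
def prob_success(p_fail, k):
    """Probability that at least one of k independent runs succeeds,
    when each run independently fails with probability p_fail."""
    return 1 - p_fail ** k

# The slide's example: p_fail = 0.90, k = 10 gives roughly 65%.
print(round(prob_success(0.90, 10), 2))
```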

23
Multi-Point Local Search
  • We can (sometimes) do better by considering
    several points simultaneously and exchanging
    information between the search points
  • Two biological metaphors
  • Genetic algorithms
  • Swarm algorithms

24
Genetic algorithms
  • A successor state is generated by combining two
    parent states
  • Start with k randomly generated states
    (population)
  • A state is represented as a string over a finite
    alphabet (often a string of 0s and 1s)
  • Evaluation function (fitness function). Depending
    on problem, may want to MAXIMIZE or MINIMIZE.
  • Produce the next generation of states by
    selection, crossover, and mutation

25
Example 8-Queens
  • Fitness function: number of non-attacking pairs of queens (min = 0, max = 8 × 7 / 2 = 28)
  • 24 / (24 + 23 + 20 + 11) = 31%
  • 23 / (24 + 23 + 20 + 11) = 29%, etc.

Normalized Fitness
26
The GA Cycle of Reproduction
(Figure: population → selection of parents → reproduction → children → modification → modified children → evaluation → evaluated children → inserted into population; deleted members are discarded)
27
Population
  • Chromosomes could be
  • Bit strings (0101 ... 1100)
  • Real numbers (43.2 -33.1 ... 0.0 89.2)
  • Permutations of elements (E11 E3 E7 ... E1 E15)
  • Lists of rules (R1 R2 R3 ... R22 R23)
  • Program elements (genetic programming)
  • ... any data structure ...

28
Reproduction
(Figure: population → parents → children)
Parents are selected at random, with selection chances biased in relation to chromosome evaluations.
29
Chromosome Modification
  • Modifications are stochastically triggered
  • Operator types are
  • Mutation
  • Crossover (recombination)

(Figure: children → modification → modified children)
30
Crossover mechanisms
31
(No Transcript)
32
Evaluation
  • The evaluator decodes a chromosome and assigns it
    a fitness measure
  • The evaluator is the only link between a
    classical GA and the problem it is solving

(Figure: modified children → evaluation → evaluated children)
33
Deletion
  • Generational GA: entire population replaced with each iteration
  • Steady-state GA: a few members replaced each generation
34
Example Traveling Salesman Problem
  • Find a tour of a given set of cities so that
  • each city is visited exactly once
  • the total distance traveled is minimized

35
Representation
  • Representation is an ordered list of city numbers, known as an order-based GA.
  • 1) Berlin  3) Stuttgart  5) Cologne  7) Dusseldorf
  • 2) Munich  4) Wiesbaden  6) Hanover  8) Bremen
  • CityList1: (3 5 7 2 1 6 4 8)
  • CityList2: (2 5 7 6 8 1 3 4)

36
Mutating Permutations
  • Changing just one entry in the permutation would
    give an inadmissible solution
  • Alternative
  • Pick two allele values at random
  • Move the second to follow the first, shifting
    the rest along to accommodate
  • Note that this preserves most of the order and
    the adjacency information
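The insert mutation described above keeps the result a valid permutation. A minimal sketch (the function name is mine):

```python
import random

def insert_mutation(perm):
    """Pick two positions at random and move the second element to
    follow the first, shifting the rest along to accommodate. The
    result is always a permutation of the input."""
    p = list(perm)
    i, j = sorted(random.sample(range(len(p)), 2))
    p.insert(i + 1, p.pop(j))  # pop the later element, reinsert after the earlier one
    return p

mutant = insert_mutation([1, 2, 3, 4, 5])
```

Since only one element changes position, most order and adjacency information survives, exactly as the slide notes.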

37
Crossover operators for permutations
  • Normal crossover operators will often lead to
    inadmissible solutions
  • Many specialised operators have been devised
    which focus on combining order or adjacency
    information from the two parents

38
Order 1 Crossover
  • The idea is to preserve the relative order in which elements occur
  • Informal procedure
  • 1. Choose an arbitrary part from the first parent
  • 2. Copy this part to the first child
  • 3. Copy the numbers that are not in the first
    part, to the first child
  • starting right from cut point of the copied part,
  • using the order of the second parent
  • and wrapping around at the end
  • 4. Analogous for the second child, with parent
    roles reversed
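The informal procedure above can be sketched as follows. This is my own rendering of one child's construction; the optional `cut` parameter (not in the slides) fixes the copied slice so the behavior can be checked deterministically:

```python
import random

def order1_crossover(p1, p2, cut=None):
    """Order 1 crossover: copy a slice from p1 into the child, then fill
    the remaining positions with p2's unused elements, in p2's order,
    starting just after the copied slice and wrapping around."""
    n = len(p1)
    a, b = cut if cut else sorted(random.sample(range(n), 2))
    child = [None] * n
    child[a:b + 1] = p1[a:b + 1]            # step 1-2: copy the chosen part
    copied = set(child[a:b + 1])
    # Step 3: p2's elements, read starting right after the cut point, wrapping.
    fill = [x for x in p2[b + 1:] + p2[:b + 1] if x not in copied]
    for offset, x in enumerate(fill):
        child[(b + 1 + offset) % n] = x     # fill positions after the slice, wrapping
    return child
```

The second child is produced analogously with the parent roles reversed.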

39
TSP Example 30 Cities
40
Solution i (Distance = 941)
41
Solution j (Distance = 800)
42
Solution k (Distance = 652)
43
Best Solution (Distance = 420)
44
Overview of Performance
45
Hardware Software Optimization
  • GAs have been particularly successful for finding small circuits or code snippets that implement common functions
  • E.g. sorting network: a parallel circuit that sorts a fixed number of inputs using compare/exchange operators

46
Representation
  • For design tasks like this, the genotype is not just a low-level bit-string, but a high-level data structure with meaningful sub-structure
  • Software: parse tree
  • Circuit: treat as a program
  • gates = functions
  • wires = variables

47
Swarm Algorithms, Briefly
  • Idea
  • Each insect in a swarm is a local search process
  • At each step, each insect
  • Looks around its neighborhood
  • Decides which direction looks best
  • Communicates what it found out to (some of) the other insects
  • According to a random coin flip, it either
  • Moves in the direction that looks best locally,
  • Moves in the best direction it hears about, or
  • Moves in some weighted combination of the above

48
Example Particle Swarm Optimization
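The swarm idea of the previous slide can be sketched as bare-bones particle swarm optimization. The update weights (w, c1, c2) and the sphere test function are conventional but arbitrary choices of mine, not from the slides:

```python
import random

def pso_minimize(f, dim=2, n_particles=20, iters=200,
                 w=0.7, c1=1.5, c2=1.5, lo=-10.0, hi=10.0):
    """Each particle blends momentum, attraction to its own best-known
    point (pbest), and attraction to the swarm's best (gbest): the
    'weighted combination' of local and communicated directions."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):       # personal best improved
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):    # swarm best improved
                    gbest = pbest[i][:]
    return gbest

random.seed(0)
sphere = lambda x: sum(v * v for v in x)  # minimum 0 at the origin
best = pso_minimize(sphere)
```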
49
Constraint Satisfaction
50
Guessing versus Reasoning
  • A central feature of all search algorithms is
    making guesses
  • Which node to expand?
  • Which child to explore?
  • We can reduce the amount of guesswork by doing
    more reasoning about the consequences of past
    decisions
  • Reasoning about past choices can prune away many
    future choices

51
Discrete Constraint Satisfaction Problem
  • X is a set of variables {x1, x2, ..., xn}
  • D is a set of finite domains {D1, D2, ..., Dn}
  • C is a set of constraints {C1, C2, ..., Cm}
  • Each constraint restricts the joint values of a subset of the variables
  • Example: 3-Coloring
  • Xi = countries
  • Di = {Red, Blue, Green}
  • For each adjacent pair xi, xj there is a constraint Ck(xi, xj) ∈ {(R,G), (R,B), (G,R), (G,B), (B,R), (B,G)}
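The (X, D, C) definition above translates directly into code. A minimal sketch using a made-up four-region map (the regions and adjacency list are my own toy example); brute-force enumeration stands in for real search at this size:

```python
from itertools import product

# Variables = regions; domains = colors; constraints = adjacent regions differ.
domains = {"A": {"R", "G", "B"}, "B": {"R", "G", "B"},
           "C": {"R", "G", "B"}, "D": {"R", "G", "B"}}
adjacent = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")]

def satisfies(assignment):
    """Check every binary constraint against a full assignment."""
    return all(assignment[x] != assignment[y] for x, y in adjacent)

# Enumerate the whole joint domain and keep the satisfying assignments.
solutions = [dict(zip(domains, vals))
             for vals in product(*domains.values())
             if satisfies(dict(zip(domains, vals)))]
```

A, B, C form a triangle (6 proper colorings) and D only has to differ from C (2 choices), so there are 12 solutions in all.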

52
Kinds of Problems
  • Find a solution that satisfies all constraints
  • Find all solutions
  • Find a tightest form for each constraint
  • (x1, x2) ∈ {(R,G), (R,B), (G,R), (G,B), (B,R), (B,G)}
  • ⇒
  • (x1, x2) ∈ {(R,G), (R,B)}
  • Find a solution that minimizes some additional
    objective function

53
Solving CSP by Depth-First Search
  • Interleave inference and guessing
  • At each internal node:
  • Select an unassigned variable
  • Select a value in its domain
  • Backtracking = try another value
  • At each node:
  • Propagate constraints

Where are the guesses?
54
4 Queens
  • Q4 ∈ {1,2,3,4}
  • Q3 ∈ {1,2,3,4}
  • Q2 ∈ {1,2,3,4}
  • Q1 ∈ {1,2,3,4}

(Figure: 4×4 board, rows 4-1, columns 1-4)
55
Constraint Checking
Takes 5 guesses to determine that the first guess was wrong
56
Forward Checking
When a variable is set, immediately remove inconsistent values from the domains of the other variables
Takes 3 guesses to determine that the first guess was wrong
57
Arc Consistency
  • Iterated forward checking

(Figure: 4×4 board)
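The "iterated forward checking" idea is essentially AC-3. A minimal sketch with my own representation (a dict of domains and a dict mapping each directed arc to a predicate on value pairs); the toy example enforces X < Y < Z over {1, 2, 3}:

```python
from collections import deque

def ac3(domains, constraints):
    """Arc consistency: remove every value with no supporting value in a
    neighboring domain, and requeue affected arcs until nothing changes.
    constraints[(x, y)] is a predicate on a pair of values (vx, vy)."""
    queue = deque(constraints)
    while queue:
        x, y = queue.popleft()
        supported = {vx for vx in domains[x]
                     if any(constraints[(x, y)](vx, vy) for vy in domains[y])}
        if supported != domains[x]:
            domains[x] = supported
            # x's domain shrank, so recheck every arc pointing at x.
            queue.extend(arc for arc in constraints if arc[1] == x)
    return domains

doms = {v: {1, 2, 3} for v in "XYZ"}
lt = lambda a, b: a < b
gt = lambda a, b: a > b
cons = {("X", "Y"): lt, ("Y", "X"): gt, ("Y", "Z"): lt, ("Z", "Y"): gt}
ac3(doms, cons)
```

Here propagation alone pins every variable with no guessing at all, which is exactly the effect the 4-queens slides below illustrate.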
58
Arc Consistency
  • Reduce the domain of Q1
59
Arc Consistency
  • Reduce the domain of Q2
60
Arc Consistency
  • Reduce the domain of Q3
61
Arc Consistency
  • Reduce the domain of Q4
62
Arc Consistency
  • Note that Q3 = 2 is determined by arc (Q2, Q3)
63
Arc Consistency
  • Propagating Q3 = 2 eliminates all rows in (Q3, Q4)
  • Violated constraint!
64
Arc Consistency
  • First choice Q1 = 1 is shown bad with no further guessing!
  • Violated constraint!
65
Path Consistency
  • Path consistency (3-consistency): iterated check of every triple of variables
  • K-consistency generalizes this to every k-tuple
  • N-consistency ⇒ backtrack-free search

66
Variable and Value Selection
  • Select the variable with the smallest domain
  • Minimizes the branching factor
  • Most likely to propagate and reduce guessing
  • "Most constrained variable" heuristic
  • Which value to try first?
  • The value most likely to appear in a solution
  • The one whose forward checking prunes other domains the least
  • "Least constraining value" heuristic

67
CSP Applications
  • Scheduling
  • NASA: Shuttle repair, Hubble telescope scheduling, ...
  • College basketball scheduling
  • Configuration
  • 5ESS Switching System: >200 options, highly complex interactions
  • Reduced configuration time from 2 months to 2 hours

68
Router Design
  • Dynamic wavelength router design
  • each channel cannot be repeated in the same
    input port (row constraints)
  • each channel cannot be repeated in the same
    output port (column constraints)

(Barry and Humblet 93, Cheung et al. 90, Green
92, Kumar et al. 99)