1
Neural Network to solve Traveling Salesman Problem
  • Amit Goyal 01005009
  • Koustubh Vachhani 01005021
  • Ankur Jain 01D05007

2
Roadmap
  • Hopfield Neural Network
  • Solving TSP using Hopfield Network
  • Modification of Hopfield Neural Network
  • Solving TSP using Concurrent Neural Network
  • Comparison between Neural Network and SOM for
    solving TSP

3
Background
  • Neural Networks
  • Computing devices composed of processing elements called neurons
  • Processing power comes from the interconnections between neurons
  • Various models include Hopfield, back-propagation, perceptron, Kohonen net, etc.

4
  • Produces, for any input pattern, a similar stored pattern
  • Retrieval by part of the data
  • Noisy inputs can also be recognized

(Figure: original, degraded, and reconstructed patterns)
5
Hopfield Network
  • Recurrent network
  • Feedback from output to input
  • Fully connected
  • Every neuron connected to every other neuron

6
Hopfield Network
  • Symmetric connections
  • Connection weights from unit i to unit j and
    from unit j to unit i are identical for all i
    and j
  • No self connection, so weight matrix is
    0-diagonal and symmetric
  • Logic levels are +1 and -1

7
Computation
  • For any neuron i, the input at instant t is $\sum_{j=1,\, j \ne i}^{n} w_{ij}\, s_j(t)$
  • $s_j(t)$ is the activation of the j-th neuron
  • Threshold function $\theta = 0$
  • Activation $s_i(t+1) = \operatorname{sgn}\!\left(\sum_{j=1,\, j \ne i}^{n} w_{ij}\, s_j(t)\right)$
  • where $\operatorname{sgn}(x) = 1$ if $x > 0$ and $\operatorname{sgn}(x) = -1$ if $x < 0$
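A minimal sketch of this update rule (Python and numpy are my choice here; the function name is illustrative, not from the presentation):

```python
import numpy as np

def hopfield_step(W, s, i):
    """Asynchronous update of neuron i: s_i(t+1) = sgn(sum_j w_ij * s_j(t)).

    W is the symmetric, zero-diagonal weight matrix; s is the +/-1 state
    vector.  A net input of exactly 0 leaves the neuron's state unchanged.
    """
    net = W[i] @ s          # w_ii = 0, so the j != i restriction is automatic
    if net > 0:
        s[i] = 1
    elif net < 0:
        s[i] = -1
    return s
```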
8
Modes of operation
  • Synchronous
  • All neurons are updated simultaneously
  • Asynchronous
  • Simple: only one unit is randomly selected at each step
  • General: neurons update themselves independently and randomly, based on a probability distribution over time

9
Stability
  • The issue of stability arises since there is feedback in the Hopfield network
  • It may lead to a fixed point, a limit cycle, or chaos
  • Fixed point: a unique point attractor
  • Limit cycle: the state space repeats itself in periodic cycles
  • Chaos: aperiodic behavior, a strange attractor

10
Procedure
  • Store and stabilize the vector which has to be part of memory.
  • Find the values of the weights $w_{ij}$, for all i, j, such that
  • $\langle s_1, s_2, s_3, \ldots, s_N \rangle$ is stable in a Hopfield network of N neurons.

11
Weight learning
  • Weight learning is given by
  • $w_{ij} = \frac{1}{N-1}\, s_i s_j$
  • $\frac{1}{N-1}$ is a normalizing factor
  • $s_i s_j$ derives from Hebb's rule
  • If two connected neurons are ON, then the weight of the connection is such that mutual excitation is sustained.
  • Similarly, if two neurons inhibit each other, then the connection should sustain the mutual inhibition.
12
Multiple Vectors
  • If multiple vectors need to be stored in memory, like
  • $\langle s_1^1, s_2^1, s_3^1, \ldots, s_N^1 \rangle$
  • $\langle s_1^2, s_2^2, s_3^2, \ldots, s_N^2 \rangle$
  • $\ldots$
  • $\langle s_1^p, s_2^p, s_3^p, \ldots, s_N^p \rangle$
  • then the weights are given by
  • $w_{ij} = \frac{1}{N-1} \sum_{m=1}^{p} s_i^m s_j^m$
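A small sketch of this storage rule, under the same assumptions as the earlier snippet (numpy, ±1 patterns; names are illustrative):

```python
import numpy as np

def hebbian_weights(patterns):
    """Store patterns via Hebb's rule: w_ij = 1/(N-1) * sum_m s_i^m s_j^m.

    `patterns` is a (p, N) array of +/-1 vectors; the diagonal is zeroed
    so the network has no self-connections.
    """
    p, N = patterns.shape
    W = patterns.T @ patterns / (N - 1)   # outer-product sum over the p patterns
    np.fill_diagonal(W, 0)                # enforce w_ii = 0
    return W
```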

13
Energy
  • Energy is associated with the state of the system.
  • Some patterns need to be made stable; these correspond to minimum-energy states of the system.

14
Energy function
  • Energy at state $s = \langle s_1, s_2, s_3, \ldots, s_N \rangle$:
  • $E(s) = -\frac{1}{2} \sum_i \sum_{j \ne i} w_{ij}\, s_i s_j$
  • Let the p-th neuron change its state from $s_p^{\text{initial}}$ to $s_p^{\text{final}}$, so
  • $E_{\text{initial}} = -\frac{1}{2} \sum_{j \ne p} w_{pj}\, s_p^{\text{initial}} s_j + T$
  • $E_{\text{final}} = -\frac{1}{2} \sum_{j \ne p} w_{pj}\, s_p^{\text{final}} s_j + T$
  • $\Delta E = E_{\text{final}} - E_{\text{initial}}$
  • T is the part of the energy independent of $s_p$
15
Continued
  • $\Delta E = -\frac{1}{2}\, (s_p^{\text{final}} - s_p^{\text{initial}}) \sum_{j \ne p} w_{pj}\, s_j$
  • i.e. $\Delta E = -\frac{1}{2}\, \Delta s_p \sum_{j \ne p} w_{pj}\, s_j$
  • Thus $\Delta E = -\frac{1}{2}\, \Delta s_p \times (\text{netinput}_p)$
  • If p changes from 1 to -1, then $\Delta s_p$ is negative and $\text{netinput}_p$ is negative, and vice versa.
  • So $\Delta E$ is always negative. Thus the energy always decreases when a neuron changes state.
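The monotone energy decrease argued above can be checked numerically. This sketch reuses the hypothetical `hebbian_weights` and `hopfield_step` helpers from the earlier snippets:

```python
import numpy as np

def energy(W, s):
    """E(s) = -1/2 * sum_i sum_j w_ij s_i s_j (w_ii = 0 excludes i = j terms)."""
    return -0.5 * s @ W @ s

# Example: energy never increases under asynchronous updates.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(2, 8))
W = hebbian_weights(patterns)
s = rng.choice([-1, 1], size=8)
for _ in range(50):
    e_before = energy(W, s)
    s = hopfield_step(W, s, rng.integers(8))
    assert energy(W, s) <= e_before + 1e-12
```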

16
Applications of Hopfield Nets
  • Hopfield nets are applied to optimization problems.
  • Optimization problems maximize or minimize a function.
  • In a Hopfield network, the energy gets minimized.

17
Traveling Salesman Problem
  • Given a set of cities and the distances between
    them, determine the shortest closed path passing
    through all the cities exactly once.

18
Traveling Salesman Problem
  • One of the classic and highly researched problems in the field of computer science.
  • Decision problem: "Is there a tour with length less than k?" is NP-complete
  • Optimization problem: "What is the shortest tour?" is NP-hard

19
Hopfield Net for TSP
  • N cities are represented by an N × N matrix of neurons
  • Each row has exactly one 1
  • Each column has exactly one 1
  • The matrix has exactly N 1s

$s_{kj} = 1$ if city k is in position j; $s_{kj} = 0$ otherwise
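A quick illustration of this encoding (assuming 0/1 neuron outputs and numpy, as before; the helper name is mine):

```python
import numpy as np

def is_valid_tour(S):
    """Check the TSP encoding S[k, j] = s_kj = 1 iff city k is at position j.

    A valid tour matrix is a permutation matrix: exactly one 1 per row and
    one 1 per column (which forces exactly N 1s in total).
    """
    return bool((S.sum(axis=1) == 1).all() and (S.sum(axis=0) == 1).all())

# Tour 2 -> 0 -> 3 -> 1 for N = 4 cities:
S = np.zeros((4, 4), dtype=int)
for position, city in enumerate([2, 0, 3, 1]):
    S[city, position] = 1
assert is_valid_tour(S)
```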
20
Hopfield Net for TSP
  • For each element of the matrix, take a neuron and fully connect the assembly with symmetric weights
  • Then find a suitable energy function E

21
Determination of Energy Function
  • The E function for TSP has four components, satisfying four constraints
  • Each city can have no more than one position, i.e. each row can have no more than one activated neuron
  • $E_1 = \frac{A}{2} \sum_k \sum_i \sum_{j \ne i} s_{ki}\, s_{kj}$, with A a constant

22
Energy Function (Contd..)
  • Each position contains no more than one city, i.e. each column contains no more than one activated neuron
  • $E_2 = \frac{B}{2} \sum_j \sum_k \sum_{r \ne k} s_{kj}\, s_{rj}$, with B a constant

23
Energy Function (Contd..)
  • There are exactly N entries in the output matrix, i.e. there are N 1s in the output matrix
  • $E_3 = \frac{C}{2} \left(n - \sum_k \sum_i s_{ki}\right)^2$, with C a constant

24
Energy Function (cont..)
  • The fourth term incorporates the requirement of the shortest path
  • $E_4 = \frac{D}{2} \sum_k \sum_{r \ne k} \sum_j d_{kr}\, s_{kj} (s_{r,j+1} + s_{r,j-1})$
  • where $d_{kr}$ is the distance between city k and city r
  • $E_{\text{total}} = E_1 + E_2 + E_3 + E_4$
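Putting the four terms together, a sketch of the total energy for a given state matrix. One assumption here: positions wrap modulo N, since the tour is closed. The vectorized forms of E1 and E2 use the identity $\sum_{j \ne i} s_i s_j = (\sum_i s_i)^2 - \sum_i s_i^2$:

```python
import numpy as np

def tsp_energy(S, dist, A, B, C, D):
    """E_total = E1 + E2 + E3 + E4 for a 0/1 state matrix S (S[k, j] = s_kj).

    E1 penalizes a city appearing in more than one position (row constraint),
    E2 penalizes two cities in the same position (column constraint),
    E3 penalizes having other than N active neurons, and E4 is the tour
    length term; positions wrap modulo N, i.e. the tour is closed.
    """
    N = S.shape[0]
    row, col = S.sum(axis=1), S.sum(axis=0)
    E1 = A / 2 * np.sum(row**2 - (S**2).sum(axis=1))  # sum over j != i pairs per row
    E2 = B / 2 * np.sum(col**2 - (S**2).sum(axis=0))  # sum over r != k pairs per column
    E3 = C / 2 * (N - S.sum())**2
    neighbors = np.roll(S, -1, axis=1) + np.roll(S, 1, axis=1)  # s_{r,j+1} + s_{r,j-1}
    off = dist * (1 - np.eye(N))          # exclude r = k terms
    E4 = D / 2 * np.sum(off * (S @ neighbors.T))
    return E1 + E2 + E3 + E4
```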

25
Energy Function (cont..)
  • The energy equation is also given by
  • $E = -\frac{1}{2} \sum_{ki} \sum_{rj} w_{(ki)(rj)}\, s_{ki}\, s_{rj}$
  • $s_{ki}$: city k at position i
  • $s_{rj}$: city r at position j
  • Output function for $s_{ki}$:
  • $s_{ki} = \frac{1}{2} \left( 1 + \tanh\left(\frac{u_{ki}}{u_0}\right) \right)$
  • $u_0$ is a constant
  • $u_{ki}$ is the net input

26
Weight Value
  • Comparing the above equations with the energy equation obtained previously,
  • $w_{(ki)(rj)} = -A\, \delta_{kr} (1 - \delta_{ij}) - B\, \delta_{ij} (1 - \delta_{kr}) - C - D\, d_{kr} (\delta_{j,i+1} + \delta_{j,i-1})$
  • Kronecker symbol $\delta_{kr}$:
  • $\delta_{kr} = 1$ when $k = r$
  • $\delta_{kr} = 0$ when $k \ne r$
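A direct, unoptimized construction of this weight tensor (a sketch under stated assumptions: positions wrap modulo N, and the O(N⁴) loops are written for clarity, not speed):

```python
import numpy as np

def tsp_weights(dist, A, B, C, D):
    """Build w_(ki)(rj) per the formula above, as an (N, N, N, N) tensor
    indexed [k, i, r, j]; positions wrap modulo N so the tour is closed."""
    N = dist.shape[0]
    W = np.zeros((N, N, N, N))
    for k in range(N):
        for i in range(N):
            for r in range(N):
                for j in range(N):
                    d_kr = 1.0 if k == r else 0.0      # Kronecker delta_kr
                    d_ij = 1.0 if i == j else 0.0      # Kronecker delta_ij
                    near = (j == (i + 1) % N) + (j == (i - 1) % N)
                    W[k, i, r, j] = (-A * d_kr * (1 - d_ij)
                                     - B * d_ij * (1 - d_kr)
                                     - C
                                     - D * dist[k, r] * near)
    return W
```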

27
Observation
  • The choice of constants A, B, C and D that provide a good solution varies between two extremes:
  • Always obtaining legitimate loops (D is small relative to A, B and C)
  • Giving heavier weight to the distances (D is large relative to A, B and C)

28
Observation (cont..)
  • Local minima
  • The energy function is full of dips, valleys and local minima
  • Speed
  • Fast, due to the rapid computational capacity of the network

29
Concurrent Neural Network
  • Proposed by N. Toomarian in 1988
  • It requires N·log(N) neurons to compute a TSP of N cities.
  • It also has a much higher probability of reaching a valid tour.

30
Objective Function
  • The aim is to minimize the distance between city k at position i and city r at position i+1
  • $E_i = \sum_{k \ne r} \sum_r \sum_i \delta_{ki}\, \delta_{r(i+1)}\, d_{kr}$
  • where $\delta$ is the Kronecker symbol

31
Continued
  • $E_i = \frac{1}{N^2} \sum_{k \ne r} \sum_r \sum_i d_{kr} \prod_{l=1}^{\ln N} \left[1 + (2\lambda_l - 1)\, s_{kl}\right]\left[1 + (2\mu_l - 1)\, s_{rl}\right]$
  • where $\lambda_l$ and $\mu_l$ are the binary digits of positions i and i+1, respectively
  • Also, to ensure that two cities don't occupy the same position:
  • $E_{\text{error}} = \sum_{k \ne r} \sum_r \delta_{kr}$

32
Solution
  • $E_{\text{error}}$ will have the value 0 for any valid tour.
  • So we have a constrained optimization problem to solve:
  • $E = E_i + \lambda E_{\text{error}}$
  • $\lambda$ is a Lagrange multiplier, to be calculated from the solution.

33
Minimization of energy function
  • Minimize the energy function, which is in terms of $s_{ki}$
  • The algorithm is an iterative procedure of the kind usually used for minimization of quadratic functions
  • The iteration steps are carried out in the direction of steepest descent with respect to the energy function E

34
Minimization of energy function
  • Differentiating the energy:
  • $\frac{dU_{ki}}{dt} = -\frac{\partial E}{\partial s_{ki}} = -\frac{\partial E_i}{\partial s_{ki}} - \lambda \frac{\partial E_{\text{error}}}{\partial s_{ki}}$
  • $\frac{d\lambda}{dt} = +\frac{\partial E}{\partial \lambda} = E_{\text{error}}$
  • $s_{ki} = \tanh(\alpha U_{ki})$, with $\alpha$ a constant

35
Implementation
  • The initial input matrix and the value of $\lambda$ are randomly selected and specified
  • At each iteration, new values of $s_{ki}$ and $\lambda$ are calculated in the direction of steepest descent of the energy function
  • Iterations stop either when convergence is achieved or when the number of iterations exceeds a user-specified number
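A schematic of this iterative procedure. The closed-form derivatives of Toomarian's energy are not reproduced here; `energy_grad` is a hypothetical callback returning $\partial E/\partial s$ and $E_{\text{error}}$, and all step sizes are illustrative:

```python
import numpy as np

def minimize(energy_grad, U0, lam0, alpha=1.0, eta=0.01,
             max_iter=10000, tol=1e-6):
    """Steepest-descent iteration sketched on the slides above.

    U are the net inputs, s = tanh(alpha * U) the neuron outputs, lam the
    Lagrange multiplier.  energy_grad(s, lam) must return (dE/ds, E_error).
    """
    U, lam = U0.copy(), lam0
    for _ in range(max_iter):
        s = np.tanh(alpha * U)
        dE_ds, E_error = energy_grad(s, lam)
        U -= eta * dE_ds        # dU/dt = -dE/ds: descent on the neurons
        lam += eta * E_error    # dlam/dt = +dE/dlam = E_error: ascent on lam
        if np.abs(dE_ds).max() < tol and abs(E_error) < tol:
            break               # converged
    return np.tanh(alpha * U), lam
```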

36
Comparison Hopfield vs Concurrent NN
  • Converges faster than the Hopfield network
  • The probability of achieving a valid tour is higher than with the Hopfield network
  • Hopfield doesn't have a systematic way to determine the constant terms (A, B, C, D)

37
Comparison SOM and Concurrent NN
  • The data set consists of 52 cities in Germany and a 15-city subset of it.
  • Both algorithms were run 80 times on the 15-city data set.
  • The 52-city data set could be analyzed only using SOM; the concurrent neural network failed to analyze this data set.

38
Result
  • The concurrent neural network always converged and never missed any city, whereas SOM is capable of missing cities.
  • The concurrent neural network is very erratic in behavior, whereas SOM has higher reliability in detecting every link of the shortest path.
  • Overall, the concurrent neural network performed poorly compared to SOM.

39
Shortest path generated
(Figures: Concurrent Neural Network, 2127 km; Self-Organizing Map, 1311 km)
40
Behavior in terms of probability
(Figures: Concurrent Neural Network and Self-Organizing Map)
41
Conclusion
  • The Hopfield network can also be used for optimization problems.
  • The concurrent neural network performs better than the Hopfield network and uses fewer neurons.
  • Both the concurrent and Hopfield neural networks are less efficient than SOM for solving TSP.

42
References
  • N. K. Bose and P. Liang, "Neural Network Fundamentals with Graphs, Algorithms and Applications", Tata McGraw-Hill, 1996
  • P. D. Wasserman, "Neural Computing: Theory and Practice", Van Nostrand Reinhold Co., 1989
  • N. Toomarian, "A Concurrent Neural Network Algorithm for the Traveling Salesman Problem", ACM Proceedings of the Third Conference on Hypercube Concurrent Computers and Applications, pp. 1483-1490, 1988

43
References
  • R. Reilly, "Neural Network Approach to Solving the Traveling Salesman Problem", Journal of Computing Sciences in Colleges, pp. 41-61, October 2003
  • Wolfram Research Inc., "Tutorial on Neural Networks", http://documents.wolfram.com/applications/neuralnetworks/NeuralNetworkTheory/2.7.0.html, 2004
  • Prof. P. Bhattacharyya, "Introduction to Computing with Neural Nets", http://www.cse.iitb.ac.in/pb/Teaching.html

45
NP-complete NP-hard
  • When the decision version of a combinatorial optimization problem is proved to belong to the class of NP-complete problems, which includes well-known problems such as satisfiability, traveling salesman, the bin packing problem, etc., then the optimization version is NP-hard.

46
NP-complete NP-hard
  • "Is there a tour with length less than k?" is NP-complete
  • It is easy to determine if a proposed certificate has length less than k
  • The optimization problem, "What is the shortest tour?", is NP-hard, since there is no easy way to determine if a certificate is the shortest

47
Path lengths
(Figures: Concurrent Neural Network and Self-Organizing Map)