1
Particle Swarm Optimization (PSO) Algorithm and
Its Application in Engineering Design Optimization
By
Sushanta Kumar Mandal, Research Scholar
  • School of Information Technology
  • Indian Institute of Technology Kharagpur
  • September 9, 2005

2
Outline
  • Introduction to Optimization
  • Optimization Procedure
  • Different Optimization Algorithms
  • Different Global Optimization Algorithms
  • Particle Swarm Optimization (PSO) Algorithm
  • Application of PSO in Design Optimization Problems

3
Optimization
As ageless as time
4
Calculus
The maximum and minimum of a smooth function are
reached at a stationary point, where its gradient
vanishes.
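
In symbols, this is the first-order condition; the one-variable example below is added here for illustration and is not from the original slide:

```latex
\[
  \nabla f(x^{*}) = 0,
  \qquad\text{e.g. } f(x) = (x-2)^{2},\quad f'(x) = 2(x-2) = 0 \;\Rightarrow\; x^{*} = 2 .
\]
```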
5
Optimization is Everywhere
  • The more we know about something, the more we see
    where optimization can be applied.
  • Some personal decision making:
  • - Finding the fastest route home or to class
  • - Optimal allocation of time for homework
  • - Optimal budgeting

6
Goal of Optimization
Find values of the variables that minimize or
maximize the objective function while satisfying
the constraints.
7
Components of an Optimization Problem
  • Objective Function: an objective function that
    we want to minimize or maximize.
  • For example, in a manufacturing process, we might
    want to maximize the profit or minimize the cost.
  • In fitting experimental data to a user-defined
    model, we might minimize the total deviation of
    observed data from predictions based on the
    model.
  • In designing an inductor, we might want to
    maximize the Quality Factor and minimize the area.

8
Components of an Optimization Problem
  • Design Variables: a set of unknowns or
    variables that affect the value of the objective
    function.
  • In the manufacturing problem, the variables might
    include the amounts of different resources used
    or the time spent on each activity.
  • In the data-fitting problem, the unknowns are the
    parameters that define the model.
  • In the inductor design problem, the variables
    define the layout geometry of the inductor.

9
Components of an Optimization Problem
  • Constraints: a set of constraints that allow the
    unknowns to take on certain values but exclude
    others.
  • For the manufacturing problem, it does not make
    sense to spend a negative amount of time on any
    activity, so we constrain all the "time"
    variables to be non-negative.
  • In the inductor design problem, we would probably
    want to limit the upper and lower value of layout
    parameters and to target an inductance value
    within the tolerance level.

10
Are all these ingredients necessary?
  • Almost all optimization problems have an objective
    function.
  • No objective function. In some cases (for
    example, design of integrated circuit layouts),
    the goal is to find a set of variables that
    satisfies the constraints of the model. The user
    does not particularly want to optimize anything,
    so there is no reason to define an objective
    function. This type of problem is usually called
    a feasibility problem.

11
Are all these ingredients necessary?
  • Variables are essential. If there are no
    variables, we cannot define the objective
    function and the problem constraints.
  • Constraints are not essential. In fact, the field
    of unconstrained optimization is a large and
    important one for which a lot of algorithms and
    software are available. It's been argued that
    almost all problems really do have constraints.

12
What We Need for Optimization
  • Models Modeling is the process of identifying
    objective function, variables and constraints.
    The goal of models is insight not the numbers.
    A good mathematical model of the optimization
    problem is needed.
  • Algorithms Typically, an interesting model is
    too complicated to be able to solve in with paper
    and pencil. An effective and reliable numerical
    algorithm is needed to solve the problem. There
    is no universal optimization algorithm. Algorithm
    should have robustness (good performance for a
    wide class of problems), efficiency (not too much
    computer time) and accuracy (can identify the
    error)

13
Flowchart of Optimal Design Procedure
Need for optimization
Choose design variables
Formulate constraints
Formulate objective function
Set up variable bounds
Select an optimization algorithm
Obtain solution(s)
14
Mathematical Formulation of Optimization Problems
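The formulation itself appears as an image on the original slide; a standard statement of a constrained design-optimization problem of this kind, added here for reference, is:

```latex
\[
\begin{aligned}
\text{minimize}\quad   & f(\mathbf{x}), \qquad \mathbf{x} = (x_{1},\dots,x_{n}) \\
\text{subject to}\quad & g_{j}(\mathbf{x}) \le 0, \quad j = 1,\dots,J \\
                       & h_{k}(\mathbf{x}) = 0,   \quad k = 1,\dots,K \\
                       & x_{i}^{(L)} \le x_{i} \le x_{i}^{(U)}, \quad i = 1,\dots,n
\end{aligned}
\]
```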
15
Constraints
  • Inequality constraints, e.g. x1² + x2² ≤ 0
  • Equality constraints, e.g. x1 = 2

16
Variable Bounds
  • Maximum and minimum bounds on each design
    variable.
  • Without variable bounds the constraints
    completely surround the feasible region.
  • Variable bounds are used to confine the search
    algorithm within these bounds.
  • Ex: xi(L) ≤ xi ≤ xi(U) for each design variable xi

17
Classification of Optimization Methods
  • Single variable
  • Multi-variable
  • Constrained
  • Unconstrained
  • Single objective
  • Multi-objective
  • Linear
  • Non-linear

18
(No Transcript)
19
Classifications of Optimization Methods
20
Local and Global Optimizers
  • A local minimizer, xB, of the region B, is
    defined so that f(xB) ≤ f(x), ∀x ∈ B.
  • Ex: gradient-based search methods, Newton-Raphson
    algorithms, Steepest Descent, Conjugate-Gradient
    algorithms, the Levenberg-Marquardt algorithm, etc.
  • Shortcomings: 1) One requires an initial guess
    to start with. 2) Convergence to an optimal
    solution depends on the chosen initial guess.
    3) Most algorithms tend to get stuck at a
    sub-optimal solution. 4) An algorithm efficient
    in solving one optimization problem may not be
    efficient in solving another. 5) These are
    useful over a relatively narrow range.

21
Local and Global Optimizers
  • The global optimizer, x*, is defined so that
    f(x*) ≤ f(x), ∀x ∈ S, where S is the search space.
  • Ex: the Simulated Annealing algorithm, Genetic
    Algorithms, Ant Colony Optimization, Geometric
    Programming, Particle Swarm Optimization, etc.

22
Local and Global Optimizers
23
Particle Swarm Optimization
  • Evolutionary computational technique based on the
    movement and intelligence of swarms looking for
    the most fertile feeding location
  • It was developed in 1995 by James Kennedy and
    Russell Eberhart
  • Simple algorithm, easy to implement, and few
    parameters to adjust (mainly the velocity)
  • A swarm is an apparently disorganized
    collection (population) of moving individuals
    that tend to cluster together while each
    individual seems to be moving in a random
    direction

24
Continued
  • It uses a number of agents (particles) that
    constitute a swarm moving around in the search
    space looking for the best solution.
  • Each particle is treated as a point in a
    D-dimensional space which adjusts its flying
    according to its own flying experience as well as
    the flying experience of other particles
  • Each particle keeps track of its coordinates in
    the problem space, which are associated with the
    best solution (fitness) that it has achieved so far.
    This value is called pbest.

25
Continued
  • Another best value that is tracked by the PSO is
    the best value obtained so far by any particle in
    the neighbors of the particle. This value is
    called gbest.
  • The PSO concept consists of changing the
    velocity of (accelerating) each particle toward
    its pbest and gbest positions at each time
    step.

26
Continued
  • Each particle tries to modify its current
    position and velocity according to the distance
    between its current position and pbest, and the
    distance between its current position and gbest.

(The velocity update equation is shown as an image on the original slide; its terms are defined below.)
v(n+1) : velocity of the particle at the (n+1)th iteration
v(n) : velocity of the particle at the nth iteration
c1 : acceleration factor related to gbest
c2 : acceleration factor related to lbest
rand1( ) : random number between 0 and 1
rand2( ) : random number between 0 and 1
gbest : gbest position of the swarm
pbest : pbest position of the particle

CurrentPosition(n+1) = CurrentPosition(n) + v(n+1)

CurrentPosition(n+1) : position of the particle at the (n+1)th iteration
CurrentPosition(n) : position of the particle at the nth iteration
v(n+1) : particle velocity at the (n+1)th iteration
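
For reference, a standard form of the velocity and position updates consistent with these definitions is sketched below; pairing c1 with the pbest term and c2 with the gbest term follows the usual convention and is an assumption here, not taken from the slide:

```latex
\[
\begin{aligned}
v_{n+1} &= v_{n}
  + c_{1}\,\mathrm{rand}_{1}()\,\bigl(\textit{pbest} - \mathrm{CurrentPosition}_{n}\bigr)
  + c_{2}\,\mathrm{rand}_{2}()\,\bigl(\textit{gbest} - \mathrm{CurrentPosition}_{n}\bigr) \\
\mathrm{CurrentPosition}_{n+1} &= \mathrm{CurrentPosition}_{n} + v_{n+1}
\end{aligned}
\]
```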
27
PSO Algorithm
For each particle
    Initialize the particle with a feasible random position
End

Do
    For each particle
        Calculate the fitness value
        If the fitness value is better than the best fitness value (pbest) in history
            Set the current value as the new pbest
    End

    Choose the particle with the best fitness value of all the particles as the gbest

    For each particle
        Calculate the particle velocity according to the velocity update equation
        Update the particle position according to the position update equation
    End
While maximum iterations or a minimum error criterion is not attained

A runnable sketch of this loop is given below.
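
For concreteness, here is a minimal Python/NumPy sketch of the loop above. The sphere objective, swarm size, acceleration factor of 2 (matching the gbest equation on the next slide), the absence of an inertia weight, and the vmax clamp are illustrative assumptions rather than settings taken from the presentation.

```python
import numpy as np

def pso(objective, dim, bounds, n_particles=30, n_iter=200, vmax=0.5, seed=0):
    """Minimal global-best (star topology) PSO sketch."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))        # particle positions
    v = rng.uniform(-vmax, vmax, (n_particles, dim))   # particle velocities
    pbest = x.copy()                                   # best position found by each particle
    pbest_val = np.array([objective(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()         # best position found by the swarm
    gbest_val = float(pbest_val.min())

    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Velocity update: accelerate each particle toward its pbest and the gbest.
        v = v + 2.0 * r1 * (pbest - x) + 2.0 * r2 * (gbest - x)
        v = np.clip(v, -vmax, vmax)                    # clamp velocity on each dimension
        x = np.clip(x + v, lo, hi)                     # position update within variable bounds
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val                    # update pbest where fitness improved
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        if vals.min() < gbest_val:                     # update gbest if any particle beat it
            gbest_val = float(vals.min())
            gbest = x[np.argmin(vals)].copy()
    return gbest, gbest_val

# Example: minimize the sphere function f(x) = sum(x_i^2) over [-5, 5]^3.
best_x, best_f = pso(lambda p: float(np.sum(p**2)), dim=3, bounds=(-5.0, 5.0))
print(best_x, best_f)
```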

28
gbest and lbest
  • global version
  • vx = vx + 2*rand()*(pbestx - presentx)
         + 2*rand()*(pbestx[gbest] - presentx)
  • local version
  • vx = vx + 2*rand()*(pbestx - presentx)
         + 2*rand()*(pbestx[lbest] - presentx)

29
Particle Swarm Optimization: Swarm Topology
  • In PSO, two basic topologies have been used in
    the literature:
  • Ring Topology (neighborhood of 3; see the sketch
    below)
  • Star Topology (global neighborhood)
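
As a rough illustration of the ring (lbest) topology, the sketch below picks, for each particle, the best pbest among itself and its two immediate index neighbors, wrapping around; this indexing convention is an assumption, not something specified in the presentation.

```python
import numpy as np

def ring_lbest(pbest, pbest_val):
    """Return, for each particle, the best pbest in its ring neighborhood of 3
    (the particle itself plus its predecessor and successor, indices wrapping)."""
    n = len(pbest_val)
    lbest = np.empty_like(pbest)
    for i in range(n):
        neighborhood = [(i - 1) % n, i, (i + 1) % n]
        best_j = min(neighborhood, key=lambda j: pbest_val[j])
        lbest[i] = pbest[best_j]
    return lbest
```

In the star (gbest) topology the neighborhood is the whole swarm, so every particle is pulled toward the same gbest, as in the sketch on the previous slide.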

30
PSO Parameters: Velocity
  • An important parameter in PSO; typically the
    only one adjusted
  • Clamps particles' velocities on each dimension
    (see the sketch below)
  • Determines the fineness with which regions are
    searched
  • If too high, particles can fly past optimal
    solutions
  • If too low, particles can get stuck in local minima
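
A minimal illustration of the clamp, assuming NumPy arrays of velocities and a scalar vmax (both names are illustrative):

```python
import numpy as np

def clamp_velocity(v, vmax):
    # Limit every velocity component to [-vmax, vmax]: too large a vmax lets
    # particles fly past optimal solutions, too small a vmax can leave them
    # stuck in local minima.
    return np.clip(v, -vmax, vmax)
```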

31
Flow Chart for Extraction Procedure using PSO
32
Comparison of Genetic Algorithm and PSO
Tested in a MATLAB program (P4 1.7 GHz CPU, 256 MB RAM).
No. of particles / population size = 100
No. of simulation runs = 10000
Crossover probability = 0.9, Mutation probability = 0.01
33
Model Fitting
34
Model Fitting
35
Inductor Optimization
36
Constrained PSO Optimization
37
(No Transcript)
38
Thank You