Short Term Scheduling - PowerPoint PPT Presentation

Provided by: Hyuns6
1
Short Term Scheduling
2
Characteristics
  • Planning horizon is short
  • Multiple unique jobs (tasks) with varying
    processing times and due dates
  • Multiple unique jobs sharing the same set of
    resources (machines)
  • Time is treated as continuous (not discretized
    into periods)
  • Varying objective functions

3
Characteristics (Continued)
  • Common in make-to-order environments with high
    product variety
  • Common as a support tool for MRP in generating
    detailed schedules once orders have been released

4
Example
  • Two jobs, A and B
  • Two machines M1 and M2
  • Jobs are processed on M1 and then on M2
  • Job A: 9 minutes on M1 and 2 minutes on M2
  • Job B: 4 minutes on M1 and 9 minutes on M2
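The makespans of the two possible sequences can be checked with the standard two-machine flow-shop completion-time recursion (the helper function below is an illustrative sketch, not part of the original slides):

```python
def flow_shop_makespan(seq, p1, p2):
    # Each job goes through M1 then M2; M2 starts a job only after
    # M1 has finished it and M2 has cleared its previous job.
    c1 = c2 = 0
    for j in seq:
        c1 += p1[j]
        c2 = max(c1, c2) + p2[j]
    return c2

p1 = {"A": 9, "B": 4}  # minutes on M1
p2 = {"A": 2, "B": 9}  # minutes on M2
print(flow_shop_makespan(["A", "B"], p1, p2))  # 22
print(flow_shop_makespan(["B", "A"], p1, p2))  # 15
```

Sequencing B first yields the shorter makespan because B's short M1 time gets M2 working sooner.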

5
Example
6
Example (Continued)
7
Challenge
As the number of jobs increases, complete
enumeration becomes difficult: 3! = 6, 4! = 24,
5! = 120, 6! = 720, 10! = 3,628,800, while
13! = 6,227,020,800 and 25! =
15,511,210,043,330,985,984,000,000
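The growth of the enumeration space can be reproduced with the standard library (a small illustrative sketch):

```python
import math

# The number of possible sequences of n jobs is n!
for n in (3, 4, 5, 6, 10, 13, 25):
    print(f"{n}! = {math.factorial(n):,}")
```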
8
Classification of Scheduling Problems
  • Number of jobs
  • Number of machines
  • Type of production facility
  • Single machine
  • Flow shop
  • Parallel machines
  • Job shop
  • Job arrivals
  • Static
  • Dynamic
  • Performance measures

9
A Single Machine Example

Job                 1    2    3    4    5    6
Processing time pj 12    8    3   10    4   18
Release time rj   -20  -15   12  -10    3    2
Due date dj        10    2   72   -8    8   60
10
The Single Machine Problem
  • Single machine scheduling problems seek an
    optimal sequence (for a given criterion) in which
    to complete a given collection of jobs on a
    single machine that can accommodate only one job
    at a time.

11
Decision Variables
  • xj = time job j is started (relative to time 0 =
    now), xj ≥ max(0, rj) for all values of j.

12
Sequencing constraints
  • (start time of j) + (processing time of j) <
    (start time of j′)

  • or
  • (start time of j′) + (processing time of j′) <
    (start time of j)

13
Sequencing constraints
  • (start time of j) + (processing time of j) ≤
    (start time of j′)

  • or
  • (start time of j′) + (processing time of j′) ≤
    (start time of j)
  • xj + pj ≤ xj′ or xj′ + pj′ ≤ xj

14
Disjunctive variables
  • Introduce disjunctive variables yjj′, where yjj′ =
    1 if job j is scheduled before job j′ and yjj′ =
    0 otherwise.
  • xj + pj ≤ xj′ + M(1 - yjj′),
  • xj′ + pj′ ≤ xj + M·yjj′,
  • for all pairs of j and j′ (for every j and every
    j′ > j); M is a large positive constant
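A minimal sketch of how the big-M pair of constraints enforces the disjunction; the checker function and the value of M are illustrative, not part of the slides:

```python
M = 10_000  # big-M constant: any value larger than the schedule horizon

def precedence_ok(xj, pj, xk, pk, y):
    # y = 1 schedules job j before job k; y = 0 the reverse.
    # For each value of y, exactly one of the two constraints is binding;
    # the other is slack by M and therefore never restricts the schedule.
    return (xj + pj <= xk + M * (1 - y) and
            xk + pk <= xj + M * y)

print(precedence_ok(0, 5, 5, 3, 1))  # True: k starts when j finishes
print(precedence_ok(0, 5, 3, 3, 1))  # False: k would start while j runs
```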

15
Example
Formulate sequencing constraints for a problem
with three jobs, j = 1, ..., 3, with processing
times 14, 3, and 7.
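One illustrative way to write out the answer is to generate the disjunctive constraint pair for each of the three job pairs (the printed format is a sketch mirroring the big-M constraints above):

```python
# Processing times from the three-job example
p = {1: 14, 2: 3, 3: 7}

for j in p:
    for k in p:
        if k > j:
            # One big-M constraint pair per unordered job pair
            print(f"x{j} + {p[j]} <= x{k} + M(1 - y{j}{k})")
            print(f"x{k} + {p[k]} <= x{j} + M*y{j}{k}")
```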
16
Due date constraints
xj + pj ≤ dj, for all values of j
17
Examples of Performance measures
18
Example
Job              1    2    3
Processing time 15    6    9
Release time     5   10    0
Due date        20   25   36
Start time       9   24    0
19
Objective functions
20
Objective functions
21
Formulation: Minimizing Makespan (Maximum
Completion Time)
22
A Formulation with a Linear Objective Function
23
  • Similar formulations can be constructed with
    other min-max objective functions, such as
    minimizing maximum lateness or maximum tardiness.
  • Other objective functions involving minimizing
    means (other than mean tardiness) are already
    linear.

24
The Job Shop Scheduling Problem
  • N jobs
  • M machines
  • Each job j visits a subset of the machines in a
    specified sequence

25
Notation
  • pjm = processing time of job j on machine m,
  • xjm = start time of job j on machine m,
  • yj,j′,m = 1 if job j is scheduled before job j′
    on machine m,
  • M(j) = the subset of the machines visited by job
    j,
  • SS(m, j) = the set of machines that job j visits
    after visiting machine m

26
Formulation
27
Solution Methods
  • Small to medium problems can be solved exactly
    (to optimality) using techniques such as branch
    and bound and dynamic programming
  • Structural results and polynomial (fast)
    algorithms for certain special cases
  • Large problems in general cannot be solved within
    a reasonable amount of time (the problem belongs
    to a class of combinatorial optimization problems
    called NP-hard)
  • Large problems can be solved approximately using
    heuristic approaches

28
Single Machine Results
  • Makespan
  • Not affected by sequence
  • Average Flow Time
  • Minimized by performing jobs in shortest
    processing time (SPT) order
  • Average Lateness
  • Minimized by performing jobs in shortest
    processing time (SPT) order
  • Maximum Lateness (or Tardiness)
  • Minimized by performing jobs in earliest due
    date (EDD) order.
  • If there exists a sequence with no tardy jobs,
    EDD will find one
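The SPT result can be verified by brute-force enumeration on the processing times from the single-machine example, assuming for this sketch that all jobs are available at time 0 (release times ignored):

```python
from itertools import permutations

p = [12, 8, 3, 10, 4, 18]  # processing times; all jobs available at time 0

def avg_flow_time(seq):
    t = total = 0
    for j in seq:
        t += p[j]   # completion time of job j
        total += t  # flow time of j equals its completion time here
    return total / len(p)

spt = sorted(range(len(p)), key=lambda j: p[j])
best = min(map(avg_flow_time, permutations(range(len(p)))))
print(avg_flow_time(spt) == best)  # True: SPT matches the enumerated optimum
```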

29
Single Machine Results (Continued)
  • Average Weighted Flow Time
  • Minimized by performing jobs in order of the
    smallest ratio of processing time to weight
    (weighted SPT)
  • Average Tardiness
  • No simple sequencing rule will work

30
Two Machine Results
  • Given a set of jobs that must go through a
    sequence of two machines, what sequence will
    yield the minimum makespan?

31
Johnson's Algorithm
  • A simple algorithm (Johnson, 1954)
  • 1. Sort the processing times of the jobs on the
    two machines in two lists.
  • 2. Find the shortest processing time in either
    list and remove the corresponding job from both
    lists.
  • If the job came from the first list, place it in
    the first available position in the sequence.
  • If the job came from the second list, place it in
    the last available position in the sequence.
  • 3. Repeat until all jobs have been sequenced.
  • The resulting sequence minimizes makespan.
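A sketch of the algorithm in code, checked against the two-job example from the earlier slides (jobs A and B appear as indices 0 and 1; this is an illustration, not code from the slides):

```python
def johnson(p1, p2):
    # Johnson's rule: take the job with the globally shortest time;
    # if that time is on M1, schedule it as early as possible,
    # otherwise as late as possible.
    front, back = [], []
    for j in sorted(range(len(p1)), key=lambda j: min(p1[j], p2[j])):
        if p1[j] <= p2[j]:
            front.append(j)    # short M1 time: "load up" quickly
        else:
            back.insert(0, j)  # short M2 time: "clear out" quickly
    return front + back

def makespan(seq, p1, p2):
    c1 = c2 = 0
    for j in seq:
        c1 += p1[j]
        c2 = max(c1, c2) + p2[j]
    return c2

p1, p2 = [9, 4], [2, 9]  # jobs A (index 0) and B (index 1)
seq = johnson(p1, p2)
print(seq, makespan(seq, p1, p2))  # [1, 0] 15
```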

32
  • Data

33
Johnson's Algorithm Example
  • Data
  • Iteration 1: min time is 4 (job 1 on M1); place
    this job first and remove it from both lists

34
  • Data

35
Johnson's Algorithm Example (Continued)
  • Iteration 2: min time is 5 (job 3 on M2); place
    this job last and remove it from both lists
  • Iteration 3: the only job left is job 2; place it
    in the remaining (middle) position.
  • Final sequence: 1-2-3
  • Makespan: 28

36
Gantt Chart for Johnson's Algorithm Example
Short task on M2 to "clear out" quickly.
Short task on M1 to "load up" quickly.
37
Three Machine Results
  • Johnson's algorithm can be extended to three
    machines by creating two composite machines (M1′ =
    M1 + M2) and (M2′ = M2 + M3) and then applying
    Johnson's algorithm to these two machines
  • Optimality is guaranteed only when certain
    conditions are met
  • smallest processing time on M1 is greater than or
    equal to the largest processing time on M2, or
  • smallest processing time on M3 is greater than or
    equal to the largest processing time on M2

38
Multi-Machine Results
  • Generate M-1 pairs of composite (dummy) machines
  • Example: with 4 machines, we have the following
    three pairs: (M1, M4), (M1+M2, M3+M4), (M1+M2+M3,
    M2+M3+M4)
  • Apply Johnson's algorithm to each pair and select
    the best resulting schedule out of the M-1
    schedules generated
  • Optimality is not guaranteed.

39
Dispatching Rules
  • In general, simple sequencing rules (dispatching
    rules) do not lead to optimal schedules. However,
    they are often used to solve complex scheduling
    problems approximately (heuristically).
  • Basic Approach
  • Decompose a multi-machine problem (e.g., a job
    shop scheduling problem) into sub-problems each
    involving a single machine.
  • Use a simple dispatching rule to sequence jobs on
    each of these machines.

40
Example Dispatching Rules
  • FIFO: simplest, seems fair.
  • SPT: actually works quite well with tight due
    dates.
  • EDD: works well when jobs are mostly the same
    size.
  • Critical ratio (time until due date / work
    remaining): works well for tardiness measures.
  • Many (100s) others.
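The critical ratio rule can be sketched as follows (the job data below is hypothetical):

```python
def critical_ratio(now, due_date, work_remaining):
    # CR < 1: behind schedule; CR > 1: ahead of schedule.
    # Dispatch the job with the smallest CR first.
    return (due_date - now) / work_remaining

# Hypothetical jobs: (due date, work remaining), both in hours
jobs = {"X": (20, 15), "Y": (25, 6)}
order = sorted(jobs, key=lambda j: critical_ratio(0, *jobs[j]))
print(order)  # X first: its ratio 20/15 is tighter than Y's 25/6
```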

41
Heuristic Algorithms
  • Construction heuristics
  • Use a procedure (a set of rules) to construct
    from scratch a good (but not necessarily optimal)
    schedule
  • Improvement heuristics
  • Starting from a feasible schedule (possibly
    obtained using a construction heuristic), use a
    procedure to further improve the schedule

42
Example A Single Machine with Setups
  • N jobs to be scheduled on a single machine with a
    sequence-dependent setup preceding the processing
    of each job.
  • The objective is to identify a sequence that
    minimizes makespan.
  • The problem is an instance of the Traveling
    Salesman Problem (TSP).
  • The problem is NP-hard (the number of
    computational steps required to solve the problem
    grows exponentially with the number of jobs).

43
A Heuristic Algorithm
44
A Heuristic Algorithm
  • Greedy heuristic: Start with an arbitrary job
    from the set of N jobs. Schedule jobs
    subsequently based on the next shortest setup
    time.

45
A Heuristic Algorithm
  • Greedy heuristic: Start with an arbitrary job
    from the set of N jobs. Schedule jobs
    subsequently based on the next shortest setup
    time.
  • Improved greedy heuristic: Evaluate sequences
    with all possible starting jobs (N different
    schedules). Choose the schedule with the shortest
    makespan.

46
A Heuristic Algorithm
  • Greedy heuristic: Start with an arbitrary job
    from the set of N jobs. Schedule jobs
    subsequently based on the next shortest setup
    time.
  • Improved greedy heuristic: Evaluate sequences
    with all possible starting jobs (N different
    schedules). Choose the schedule with the shortest
    makespan.
  • Improved heuristic: Starting from the improved
    greedy heuristic solution, carry out a series of
    pair-wise interchanges in the job sequence. Stop
    when the solution stops improving.
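The three variants can be sketched as follows. The setup-time matrix at the bottom is hypothetical; minimizing total setup time is equivalent to minimizing makespan here because the processing times are fixed:

```python
def total_setup(seq, setup):
    # Total changeover time along the sequence
    return sum(setup[a][b] for a, b in zip(seq, seq[1:]))

def greedy_sequence(setup, start):
    # Greedy heuristic: always pick the job with the next-shortest setup
    seq, remaining = [start], set(range(len(setup))) - {start}
    while remaining:
        nxt = min(remaining, key=lambda j: setup[seq[-1]][j])
        seq.append(nxt)
        remaining.remove(nxt)
    return seq

def improved_greedy(setup):
    # Improved greedy: try every starting job, keep the best of N schedules
    return min((greedy_sequence(setup, s) for s in range(len(setup))),
               key=lambda seq: total_setup(seq, setup))

def pairwise_improve(seq, setup):
    # Improvement step: repeat pair-wise interchanges until no swap helps
    seq = seq[:]
    improved = True
    while improved:
        improved = False
        for i in range(len(seq)):
            for k in range(i + 1, len(seq)):
                cand = seq[:]
                cand[i], cand[k] = cand[k], cand[i]
                if total_setup(cand, setup) < total_setup(seq, setup):
                    seq, improved = cand, True
    return seq

# Hypothetical setup-time matrix for four jobs
setup = [[0, 2, 9, 10],
         [1, 0, 6, 4],
         [15, 7, 0, 8],
         [6, 3, 12, 0]]
seq = pairwise_improve(improved_greedy(setup), setup)
print(seq, total_setup(seq, setup))
```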

47
A Problem Instance
  • 16 jobs
  • Each job takes 1 hour on the single machine (the
    bottleneck resource)
  • 4 hours of setup to change over from one job
    family to another
  • Fixed due dates
  • Find a solution that minimizes tardiness

48
EDD Sequence
  • Average Tardiness = 10.375

49
A Greedy Search
  • Consider all pair-wise interchanges
  • Choose the one that reduces average tardiness
    the most
  • Continue until no further improvement is possible

50
First Interchange: Exchange Jobs 4 and 5
  • Average Tardiness = 5.0 (a reduction of 5.375!)

51
Greedy Search Final Sequence
Average Tardiness = 0.5 (9.875 lower than EDD)
52
A Better (Due-Date Feasible) Sequence
  • Average Tardiness = 0

53
Computational Times
  • Current situation: computers can examine
    1,000,000 sequences per second, and we wish to
    build a scheduling system that has a response
    time of no longer than one minute. How many jobs
    can we sequence optimally (using a brute-force
    approach)?

54
Effect of Faster Computers
  • Future situation: new computers will be 1,000
    times faster, i.e., they can examine 1 billion
    sequences per second. How many jobs can we
    sequence optimally now?
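Both questions reduce to finding the largest n whose n! sequences fit in the one-minute search budget (a sketch at the stated rates):

```python
import math

def max_jobs(sequences_per_second, seconds=60):
    # Largest n such that all n! sequences can be examined in the budget
    budget = sequences_per_second * seconds
    n = 1
    while math.factorial(n + 1) <= budget:
        n += 1
    return n

print(max_jobs(1_000_000))      # 11: 11! = 39,916,800 fits in 60 million
print(max_jobs(1_000_000_000))  # 13: 13! ~ 6.2 billion fits in 60 billion
```

A 1,000-fold speedup buys only two more jobs, which is the point of the next slide.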

55
Implications for Real Problems
  • Computation: NP-hard problems have no known
    polynomial-time algorithms, so exact methods are
    slow to use
  • No Technology Fix: faster computers do not help
    much on NP-hard problems
  • Exact Algorithms: need for specialized algorithms
    that take advantage of the structure of the
    problem to reduce the search space
  • Heuristics: likely to continue to be the dominant
    approach to solving large problems in practice
    (e.g., multi-step exchange algorithms, Genetic
    Algorithms, Simulated Annealing, Tabu Search,
    among others)

56
Implications for Real Problems (Continued)
  • Robustness: NP-hard problems have many solutions,
    and presumably many good ones.
  • Example: a 25-job sequencing problem. Suppose
    that only one in a trillion of the possible
    sequences is good. This still leaves about 15
    trillion good solutions. Our task is to find one
    of them.
  • Focus on Bottleneck: we can often concentrate on
    scheduling the bottleneck process, which
    simplifies the problem closer to the single
    machine case.