1
UET Multiprocessor Scheduling Problems
  • Nan Zang
  • nzang@cs.ucsd.edu

2
Overview of the paper
  • Introduction
  • Classification
  • Complexity results for Pm | prec, pj = 1 | Cmax
  • Algorithms for Pk | prec, pj = 1 | Cmax
  • Future research directions and conclusions

3
Scheduling
  • Scheduling concerns optimal allocation of scarce
    resources to activities.
  • For example
  • Class scheduling problems
  • Courses
  • Classrooms
  • Teachers

4
Problem notation (1)
  • A set of n jobs J = {J1, J2, …, Jn}
  • The execution time of job Jj is p(Jj).
  • A set of processors P = {P1, P2, …}
  • Schedule
  • Specifies which job should be executed by which processor at what time.
  • Objective
  • Optimize one or more performance criteria.

5
Problem Notation (2)
Three-field notation α | β | γ (Graham et al.)
  • α describes the processor environment
  • number of processors, speed, …
  • β provides the job characteristics
  • release times, precedence constraints, preemption, …
  • γ represents the objective function to be optimized
  • the finishing time of the last job (makespan)
  • the total waiting time of the jobs

6
Job Characteristics
  • Release time (rj)
  • The earliest time at which job Jj can start processing; a job is available once its release time has passed.
  • Preemption (prmp)
  • Jobs can be interrupted during processing.
  • Precedence constraints (prec)
  • Before certain jobs are allowed to start processing, one or more other jobs first have to be completed; a job is ready once all of its predecessors have completed.

7
Precedence constraints (prec)
Before certain jobs are allowed to start
processing, one or more jobs first have to be
completed.
  • Definition
  • Successor
  • Predecessor
  • Immediate successor
  • Immediate predecessor
  • Transitive Reduction

10
Special precedence constraints (1)
  • In-tree (Out-tree)
  • In-forest (Out-forest)
  • Opposing forest
  • Interval orders
  • Quasi-interval orders
  • Over-interval orders
  • Series-parallel orders
  • Level orders

11
Special precedence constraints (2)
[Figure: an example out-forest.]
12
UET scheduling problem formal definition
  • Pm | prec, pj = 1 | Cmax (m ≥ 1)
  • Processor environment
  • m identical processors are in the system.
  • Job characteristics
  • Precedence constraints are given by a precedence graph.
  • Preemption is not allowed.
  • The release time of every job is 0.
  • Objective function
  • Cmax: the time the last job finishes execution.
  • (If cj denotes the finishing time of Jj in a schedule S, then Cmax = max{ cj : 1 ≤ j ≤ n }.)

13
Gantt Chart
A Gantt chart indicates the time each job spends
in execution, as well as the processor on which
it executes
14
Overview of the paper
  • Introduction
  • Classification
  • Complexity results for Pm | prec, pj = 1 | Cmax
  • Algorithms for Pk | prec, pj = 1 | Cmax
  • Future research directions and conclusions

15
Classification
  • Based on the number of processors
  • Number of processors is a variable (m)
  • Pm | prec, pj = 1 | Cmax
  • Number of processors is a constant (k)
  • Pk | prec, pj = 1 | Cmax

17
Pm | prec, pj = 1 | Cmax (1)
  • Theorem 1
  • Pm | prec, pj = 1 | Cmax is NP-complete.
  • 1. Ullman (1976)
  • 3SAT reduces to Pm | prec, pj = 1 | Cmax
  • 2. Lenstra and Rinnooy Kan (1978)
  • k-clique reduces to Pm | prec, pj = 1 | Cmax

Corollary 1.1 The problem of deciding whether a schedule with Cmax ≤ 3 exists for Pm | prec, pj = 1 | Cmax is NP-complete.
18
Pm | prec, pj = 1 | Cmax (2)
  • Mayr (1985)
  • Theorem 2
  • Pm | pj = 1, SP | Cmax is NP-complete.
  • SP = series-parallel
  • Theorem 3
  • Pm | pj = 1, OF | Cmax is NP-complete.
  • OF = opposing forest

19
SP and OF
  • Series-parallel orders
  • Do NOT contain a substructure isomorphic to Fig 1.
  • Opposing-forest orders
  • Are disjoint unions of in-tree orders and out-tree orders.

[Fig 1: the forbidden substructure. Fig 2: an example opposing forest.]
20
Conclusion on Pm | prec, pj = 1 | Cmax
  • 3SAT is reducible to the corresponding scheduling problem.
  • m is a function of the number of clauses in the 3SAT instance.
  • Results and techniques do not hold for the case Pk | prec, pj = 1 | Cmax.

21
Classification
  • Number of processors is a variable (m)
  • Pm | prec, pj = 1 | Cmax
  • Number of processors is a constant (k)
  • Pk | prec, pj = 1 | Cmax

22
Optimal Schedule for Pk prec, pj 1 Cmax
  • The complexity of Pk | prec, pj = 1 | Cmax is open.
  • It is the 8th problem in Garey and Johnson's open problems list (1979).
  • One of the three problems on that list that remain unsolved.
  • If k = 2, P2 | prec, pj = 1 | Cmax is solvable in polynomial time.
  • Fujii, Kasami and Ninomiya (1969)
  • Coffman and Graham (1972)
  • For any fixed k, when the precedence graph is restricted to certain special forms, Pk | prec, pj = 1 | Cmax turns out to be solvable in polynomial time.
  • In-tree, out-tree, opposing forest, interval orders

23
Special precedence constraints
  • In-tree (Out-tree)
  • In-forest (Out-forest)
  • Opposing forest
  • Interval orders
  • Quasi-interval orders
  • Over-interval orders
  • Level orders

24
Algorithms for Pk | prec, pj = 1 | Cmax
  • List scheduling policies
  • Graham's list algorithm
  • HLF algorithm
  • MSF algorithm
  • CG algorithm
  • FKN algorithm (Matching algorithm)
  • Merge algorithm

26
List scheduling policies (1)
  • Set up a priority list L of jobs.
  • Whenever a processor is idle, assign the first ready job in L to that processor and remove the job from L.
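To make the policy concrete, here is a minimal Python sketch of list scheduling for unit execution time jobs. The predecessor-dictionary representation and the job names are illustrative assumptions, not part of the original slides.

```python
from typing import Dict, List, Set

def list_schedule(preds: Dict[str, Set[str]], priority_list: List[str], k: int) -> List[List[str]]:
    """Greedy list scheduling of unit-time jobs on k identical processors.

    preds maps each job to the set of jobs that must finish before it
    (assumed acyclic); priority_list orders the jobs from highest to lowest
    priority.  Returns the schedule as a list of time slots."""
    done: Set[str] = set()
    remaining = list(priority_list)
    schedule: List[List[str]] = []
    while remaining:
        # A job is ready if all of its predecessors have already completed.
        ready = [j for j in remaining if preds.get(j, set()) <= done]
        slot = ready[:k]              # highest-priority ready jobs fill this slot
        schedule.append(slot)
        done.update(slot)
        remaining = [j for j in remaining if j not in slot]
    return schedule

# Tiny example: J2 and J3 both require J1; J4 is independent; two processors.
preds = {"J1": set(), "J2": {"J1"}, "J3": {"J1"}, "J4": set()}
print(list_schedule(preds, ["J1", "J4", "J2", "J3"], k=2))
# [['J1', 'J4'], ['J2', 'J3']]  -> Cmax = 2
```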

27
Algorithms for Pk | prec, pj = 1 | Cmax
  • List scheduling policies
  • Graham's list algorithm
  • HLF algorithm
  • MSF algorithm
  • CG algorithm
  • FKN algorithm (Matching algorithm)
  • Merge algorithm

28
Graham's list algorithm
  • Graham first analyzed the performance of the simplest list scheduling algorithm.
  • A list scheduling algorithm with an arbitrary job list is called Graham's list algorithm.
  • Approximation ratio for Pk | prec, pj = 1 | Cmax:
  • δ = 2 - 1/k. (Tight!)
  • The approximation ratio is δ if, for every input instance, the makespan produced by the algorithm is at most δ times the optimal makespan.
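As a small illustration of how an arbitrary list can reach the 2 - 1/k = 3/2 factor for k = 2, the example below (my own, reusing the hypothetical list_schedule sketch from the earlier slide) has optimal makespan 2, while a poorly ordered list yields 3.

```python
# y precedes z; x and w are independent.  Optimal: slot 1 = {y, x}, slot 2 = {z, w}.
preds = {"x": set(), "w": set(), "y": set(), "z": {"y"}}
print(list_schedule(preds, ["y", "x", "w", "z"], k=2))  # 2 slots (optimal)
print(list_schedule(preds, ["x", "w", "y", "z"], k=2))  # 3 slots: ratio 3/2 = 2 - 1/2
```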

29
Algorithms for Pk | prec, pj = 1 | Cmax
  • List scheduling policies
  • Graham's list algorithm
  • HLF algorithm
  • MSF algorithm
  • CG algorithm
  • FKN algorithm (Matching algorithm)
  • Merge algorithm

30
HLF algorithm (1)
  • T. C. Hu (1961)
  • Also called the Critical Path algorithm or Hu's algorithm.
  • Algorithm
  • Assign a level h(j) to each job Jj:
  • if Jj has no successors, h(j) equals 1;
  • otherwise, h(j) equals one plus the maximum level of its immediate successors.
  • Set up a priority list L in nonincreasing order of the jobs' levels.
  • Execute the list scheduling policy on this level-based priority list L.
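A minimal sketch of the level computation and the resulting priority list, assuming the graph is given as successor sets (succs maps each job to its immediate successors; this representation is an assumption, not from the slides).

```python
from typing import Dict, List, Set

def levels(succs: Dict[str, Set[str]]) -> Dict[str, int]:
    """h(j) = 1 for jobs with no successors, otherwise 1 + max level of the
    immediate successors.  Every job must appear as a key; graph is acyclic."""
    memo: Dict[str, int] = {}
    def h(j: str) -> int:
        if j not in memo:
            memo[j] = 1 + max((h(s) for s in succs.get(j, set())), default=0)
        return memo[j]
    return {j: h(j) for j in succs}

def hlf_priority_list(succs: Dict[str, Set[str]]) -> List[str]:
    lv = levels(succs)
    # Highest level first; ties broken arbitrarily (here alphabetically).
    return sorted(succs, key=lambda j: (-lv[j], j))
```

Feeding hlf_priority_list(succs) to the list_schedule sketch above yields an HLF schedule.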

31
HLF algorithm (2)
[Figure: an example precedence graph of 13 jobs with their levels; four jobs at level 3, two at level 2 and seven at level 1.]
32
HLF algorithm (3)
  • Time complexity
  • O(VE) (V is the number of jobs and E is the number of edges in the precedence graph)
  • Theorem 4 (Hu, 1961)
  • The HLF algorithm is optimal for Pk | pj = 1, in-tree (out-tree) | Cmax.
  • Corollary 4.1
  • The HLF algorithm is optimal for Pk | pj = 1, in-forest (out-forest) | Cmax.

33
HLF algorithm (4)
  • N. F. Chen and C. L. Liu (1975)
  • The approximation ratio of the HLF algorithm for the problem with general precedence constraints:
  • If k = 2, δHLF = 4/3.
  • If k ≥ 3, δHLF = 2 - 1/(k-1).

(Tight!)
34
Algorithms for Pk | prec, pj = 1 | Cmax
  • List scheduling policies
  • Graham's list algorithm
  • HLF algorithm
  • MSF algorithm
  • CG algorithm
  • FKN algorithm (Matching algorithm)
  • Merge algorithm

35
MSF algorithm (1)
  • Algorithm (MSF = most successors first)
  • Set up a priority list L in nonincreasing order of the jobs' numbers of successors
  • (i.e. a job having more successors has a higher priority in L than a job having fewer successors).
  • Execute the list scheduling policy on this priority list L.
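A sketch of the corresponding priority list, counting all (not only immediate) successors; it reuses the successor-set representation assumed in the HLF sketch.

```python
from typing import Dict, List, Set

def successor_counts(succs: Dict[str, Set[str]]) -> Dict[str, int]:
    """Number of (not necessarily immediate) successors of each job (acyclic graph)."""
    memo: Dict[str, Set[str]] = {}
    def all_succ(j: str) -> Set[str]:
        if j not in memo:
            s: Set[str] = set()
            for t in succs.get(j, set()):
                s.add(t)
                s |= all_succ(t)
            memo[j] = s
        return memo[j]
    return {j: len(all_succ(j)) for j in succs}

def msf_priority_list(succs: Dict[str, Set[str]]) -> List[str]:
    cnt = successor_counts(succs)
    return sorted(succs, key=lambda j: (-cnt[j], j))   # most successors first
```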

36
MSF algorithm (2)
[Figure: an example precedence graph of 13 jobs annotated with successor counts 7, 6, 2, 2, 2, 1 and 0; the resulting priority list is 7, 6, 2, 2, 2, 1, 0, 0, 0, 0, 0, 0, 0.]
37
MSF algorithm (3)
  • Time complexity
  • O(VE)
  • Theorem 5 (Papadimitriou and Yannakakis, 1979)
  • The MSF algorithm is optimal for Pk | pj = 1, interval | Cmax.
  • Theorem 6 (Moukrim, 1999)
  • The MSF algorithm is optimal for Pk | pj = 1, quasi-interval | Cmax.

38
Special precedence constraints
  • Interval orders
  • Do NOT have a substructure isomorphic to Fig 1.
  • Quasi-interval orders
  • Do NOT have a substructure isomorphic to Type I, II or III.

[Figures: Fig 1 and the forbidden substructures Type I, Type II and Type III.]
39
MSF algorithm (4)
  • Ibarra and Kim (1976)
  • The performance of the MSF algorithm:
  • If k = 2, δMSF = 4/3, and this bound is tight.
  • If k ≥ 3, no tight bound is known;
  • δMSF is at least 2 - 1/(k+1).

40
MSF algorithm (5)
41
Algorithms for Pk | prec, pj = 1 | Cmax
  • List scheduling policies
  • Graham's list algorithm
  • HLF algorithm
  • MSF algorithm
  • CG algorithm
  • FKN algorithm (Matching algorithm)
  • Merge algorithm

42
CG algorithm (1)
  • Coffman and Graham (1972)
  • An optimal algorithm for P2 | prec, pj = 1 | Cmax.
  • The best approximation algorithm known for Pk | prec, pj = 1 | Cmax, where k ≥ 3.

43
CG algorithm (2)
  • Definitions
  • Let label(Jj) be an integer label assigned to Jj.
  • Let IS(Jj) denote the set of immediate successors of Jj.
  • A job is ready to label if all of its immediate successors are labeled and it has not been labeled yet.
  • N(Jj) denotes the decreasing sequence of integers formed by ordering the set { label(Ji) : Ji ∈ IS(Jj) }.

44
CG algorithm (3)
  • 1. Assign a label to each job:
  • Choose an arbitrary task Jk such that IS(Jk) = ∅ and define label(Jk) = 1.
  • for i ← 2 to n do
  • Let R be the set of jobs that are ready to label.
  • Let J* be the task in R such that N(J*) is lexicographically smaller than N(J') for all other J' in R.
  • label(J*) ← i
  • end for
  • 2. Construct a list of jobs L = (Jn, Jn-1, …, J2, J1) according to the decreasing order of the labels of the jobs.
  • 3. Execute the list scheduling policy on this priority list L.
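The labeling step translates almost directly into code. The sketch below reuses the successor-set representation assumed earlier and breaks ties between lexicographically equal N() values arbitrarily.

```python
from typing import Dict, List, Set

def cg_priority_list(succs: Dict[str, Set[str]]) -> List[str]:
    """Coffman-Graham labeling: returns the priority list in decreasing label order."""
    label: Dict[str, int] = {}
    for i in range(1, len(succs) + 1):
        # Jobs whose immediate successors are all labeled and which are unlabeled.
        ready = [j for j in succs
                 if j not in label and all(s in label for s in succs[j])]
        # N(j): decreasing sequence of the labels of j's immediate successors.
        def N(j: str) -> List[int]:
            return sorted((label[s] for s in succs[j]), reverse=True)
        label[min(ready, key=N)] = i   # lexicographically smallest N gets label i
    return sorted(succs, key=lambda j: -label[j])
```

Running list_schedule on this list reproduces a CG schedule.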

45
CG algorithm (4)
[Figure: an example precedence graph on 13 jobs; the labeling gives N(J10) = N(J11) = N(J12) = (8), N(J8) = (7, 6, 5, 4, 3, 2) and N(J9) = (1).]
46
CG algorithm (4)
[Figure: the same example with all labels assigned; the resulting priority list is 13, 12, 11, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1.]
47
CG algorithm (5)
  • Time complexity
  • O(VE)
  • Theorem 7 (Coffman and Graham, 1972)
  • The CG algorithm is optimal for P2 | prec, pj = 1 | Cmax.
  • Theorem 8 (Moukrim, 2005)
  • The CG algorithm is optimal for Pk | pj = 1, over-interval | Cmax.

48
Special precedence constraints
  • Quasi-interval orders
  • Do NOT have a substructure isomorphic to Type I, II or III.
  • Over-interval orders
  • Do NOT have a substructure isomorphic to Type I or II.

[Figures: the forbidden substructures Type I, Type II and Type III.]
49
CG algorithm (6)
  • The performance of the CG algorithm when k ≥ 3
  • Lam and Sethi (1978)
  • δCG = 2 - 2/k
  • Braschi and Trystram (1994)
  • Cmax(S) ≤ (2 - 2/k) Cmax(S*) - (k - 2 + odd(k))/k (tight!)
  • Note: S is a CG schedule.
  • S* is an optimal schedule.
  • If k is odd, odd(k) = 1; otherwise, odd(k) = 0.

50
List Scheduling Policy Conclusions
  • List scheduling policies
  • Graham's list algorithm
  • HLF algorithm
  • MSF algorithm
  • CG algorithm
  • Properties
  • Easy to implement
  • Extend to the problem Pm | prec, pj = 1 | Cmax
  • Research directions
  • Allow priority lists to depend on the number k of processors.

51
Algorithms for Pk | prec, pj = 1 | Cmax
  • List scheduling policies
  • Graham's list algorithm
  • HLF algorithm
  • MSF algorithm
  • CG algorithm
  • FKN algorithm (Matching algorithm)
  • Merge algorithm

52
FKN algorithm (1)
  • Fujii, Kasami and Ninomiya (1969)
  • The first optimal algorithm for P2 | prec, pj = 1 | Cmax.
  • Basic idea
  • Find a minimum partition of the jobs such that
  • there are at most two jobs in each set, and
  • the pair of jobs in the same set can be executed together (neither precedes the other).
  • Make a valid schedule according to a particular order of the partition. (Some clever swap work is needed!)
  • The length of the resulting schedule = the size of the minimum partition.
  • The minimum partition can be found by a maximum matching algorithm.
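As a hedged sketch of the matching step (value only; building the actual schedule needs the swap argument above), the code below uses networkx, an assumed dependency. Two jobs are compatible when neither precedes the other in the transitive closure, and the optimal two-processor makespan equals n minus the size of a maximum matching in that compatibility graph.

```python
import networkx as nx  # assumed dependency

def fkn_makespan(succs):
    """Optimal P2 | prec, pj = 1 | Cmax value via the FKN matching idea (sketch).
    succs: job -> set of immediate successors (acyclic)."""
    dag = nx.DiGraph()
    dag.add_nodes_from(succs)
    dag.add_edges_from((j, s) for j, ss in succs.items() for s in ss)
    order = nx.transitive_closure(dag)            # full precedence relation
    comp = nx.Graph()                             # compatibility (incomparability) graph
    comp.add_nodes_from(succs)
    jobs = sorted(succs)
    for i, a in enumerate(jobs):
        for b in jobs[i + 1:]:
            if not order.has_edge(a, b) and not order.has_edge(b, a):
                comp.add_edge(a, b)
    matching = nx.max_weight_matching(comp, maxcardinality=True)
    return len(jobs) - len(matching)              # each matched pair saves one time slot
```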

53
FKN algorithm (2)
  • Hard to extend!
  • The FKN algorithm cannot be extended to k ≥ 3 directly.

For the example precedence graph on this slide, the minimal partition is {J1, J5, J6}, {J4, J2, J3}, of size 2. However, the optimal Cmax corresponds to the partition {J1, J4}, {J2, J3, J5}, {J6}.
54
Algorithms for Pk | prec, pj = 1 | Cmax
  • List scheduling policies
  • Graham's list algorithm
  • HLF algorithm
  • MSF algorithm
  • CG algorithm
  • FKN algorithm (Matching algorithm)
  • Merge algorithm

55
Merge Algorithm (1)
  • Dolev and Warmuth (1985)
  • Required input
  • An optimal schedule S for the high-graph H(G).
  • The Merge algorithm shows how to merge the known optimal schedule S with the remaining jobs,
  • producing an optimal schedule for the whole job set G.

56
Merge Algorithm (2)
  • Definitions
  • height h(G)
  • the highest level of the vertices in G
  • median µ(G)
  • the height of the k-th highest component, plus 1
  • high-graph H(G)
  • the subgraph of G made up of all the components that are strictly higher than the median
  • low-graph L(G)
  • the remaining subgraph of G, i.e. G without H(G)
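These definitions translate into a short sketch. It reuses the levels() helper from the HLF sketch and networkx for connected components (both assumptions), with the graph again given as successor sets.

```python
import networkx as nx  # assumed dependency, used only for connected components

def split_high_low(succs, k):
    """Compute H(G), L(G) and mu(G) for a precedence graph given as successor sets.
    Assumes G has at least k weakly connected components; otherwise H(G) is
    taken to be all of G (a simplifying assumption of this sketch)."""
    dag = nx.DiGraph()
    dag.add_nodes_from(succs)
    dag.add_edges_from((j, s) for j, ss in succs.items() for s in ss)
    lv = levels(succs)                            # levels helper from the HLF sketch
    comps = sorted(nx.weakly_connected_components(dag),
                   key=lambda c: -max(lv[j] for j in c))
    if len(comps) < k:
        return set(succs), set(), None
    median = max(lv[j] for j in comps[k - 1]) + 1  # height of k-th highest component + 1
    high = set()
    for c in comps:
        if max(lv[j] for j in c) > median:         # strictly higher than the median
            high |= c
    low = set(succs) - high
    return high, low, median
```

On the example of the next slide (k = 3, component heights 5, 3, 3, 2) this gives µ(G) = 4 and H(G) = C1.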

57
Merge Algorithm (3)
[Figure: an example with k = 3; four components C1-C4 over jobs J1-J16 with heights h(C1) = 5, h(C2) = 3, h(C3) = 3 and h(C4) = 2. The 3rd highest component has height 3, so µ(G) = 4; H(G) consists of C1, and L(G) consists of C2, C3 and C4.]
58
Merge Algorithm (4)
  • Idea of the Merge algorithm
  • If there is an idle period in S, fill it with a highest initial vertex of L(G) and remove that vertex from L(G).
  • Similar to the HLF algorithm
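A minimal sketch of this greedy filling step, under the same assumptions as the earlier sketches (predecessor sets and a levels map); the actual Dolev and Warmuth Merge algorithm is more involved, and this only illustrates the idea stated above.

```python
def merge(schedule_H, low_preds, lv, k):
    """Fill idle slots of an optimal schedule for H(G) with available L(G)
    jobs, highest level first (simplified sketch of the Merge idea).
    schedule_H: list of time slots (lists of jobs); low_preds: predecessors of
    each L(G) job restricted to L(G); lv: job -> level; k: processor count."""
    done, remaining, result = set(), set(low_preds), []
    for slot in schedule_H:
        slot = list(slot)
        while len(slot) < k:                      # idle processor in this time slot
            ready = [j for j in remaining if low_preds[j] <= done]
            if not ready:
                break
            j = max(ready, key=lambda x: lv[x])   # highest initial vertex of L(G)
            slot.append(j)
            remaining.discard(j)
        result.append(slot)
        done.update(slot)
    while remaining:                              # leftover L(G) jobs: HLF afterwards
        ready = sorted((j for j in remaining if low_preds[j] <= done),
                       key=lambda x: -lv[x])[:k]
        result.append(ready)
        done.update(ready)
        remaining -= set(ready)
    return result
```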

59
Merge Algorithm (5)
[Figure: the optimal schedule S for H(G), the jobs of L(G) (J8-J16), and the schedule S' produced by the Merge algorithm.]
60
Merge Algorithm (6)
  • Theorem 10 (Reduction theorem)
  • Let G be a precedence graph and let S be an optimal schedule for H(G). Then the Merge algorithm finds an optimal schedule for the whole graph G in time and space O(VE).
  • Corollary 10.1
  • If H(G) is empty, then HLF is optimal for G.

61
Merge Algorithm (7)
  • Why is the Merge algorithm useful?
  • 1. One only needs to find an optimal schedule for the subgraph H(G).
  • 2. H(G) contains at most k - 1 components.

Dolev and Warmuth:
Precedence constraints | Time complexity
Level orders           | O(n^(k-1))
Opposing forest        | O(n^(2k-2) log n)
Bounded height h       | O(n^(h(k-1)))

How to use it?  1. If H(G) is easy to solve.  2. If every closed subgraph of G can be classified into a polynomial number of classes.

62
Algorithms for Pk | prec, pj = 1 | Cmax
  • List scheduling policies
  • Graham's list algorithm
  • HLF algorithm
  • MSF algorithm
  • CG algorithm
  • FKN algorithm (Matching algorithm)
  • Merge algorithm

63
Main results known
Approximation ratios (k = 2 vs. k ≥ 3, by precedence class):

Algorithm | k = 2 | In-tree | OF  | Interval | Quasi-interval | Over-interval | Arbitrary
List      | 3/2   |         |     |          |                |               | 2 - 1/k
HLF       | 4/3   | opt     |     |          |                |               | 2 - 1/(k-1)
MSF       | 4/3   |         |     | opt      | opt            |               | 2 - 1/(k+1)
CG        | opt   | opt     |     | opt      | opt            | opt           | 2 - 2/k
FKN       | opt   |         |     |          |                |               |
Merge     |       |         | opt |          |                |               |

(List, HLF, MSF and CG are list scheduling policies. opt = we can get an optimal solution in polynomial time; all columns other than k = 2 refer to k ≥ 3.)
64
Overview of the paper
  • Introduction
  • Classification
  • Complexity results for Pm | prec, pj = 1 | Cmax
  • Algorithms for Pk | prec, pj = 1 | Cmax
  • Future research directions and conclusions

65
Future research directions (1)
  • Finding new classes of orders that can be solved by known algorithms or their generalizations.
  • over-interval orders (Moukrim, 2005)
  • Finding an algorithm with a better approximation ratio for the UET scheduling problem.
  • CG algorithm (1972)
  • Ranade (2003)
  • special precedence constraints (loosely connected task graphs)
  • δ = 1.875

66
Future research directions (2)
  • Use the UET multiprocessor scheduling algorithms to solve other related scheduling problems
  • The CG algorithm is optimal for P2 | rj, pj = 1 | Cmax.
  • The HLF algorithm is optimal for P | rj, pj = 1, outtree | Cmax.
  • (Huo and Leung, 2005)
  • The MSF algorithm is optimal for P | pj = 1, c = 1, quasi-interval | Cmax.

  • (Moukrim, 2003)
  • The CG algorithm is optimal for P2 | prmp, pj = 1 | Cmax and for ΣCj.
  • (Coffman, Sethuraman and Timkovsky, 2003)
  • The most challenging research task
  • Solve the famous open problem Pk | prec, pj = 1 | Cmax (k ≥ 3).

67
Thank You!