Title: Algorithm Design
1. Outline
- Algorithm Design
- Algorithm Design Techniques
- Practice Problems
3. Algorithm Design
- Definition
- Requirements
- Strategy
- Techniques
4. Algorithm Design: Definition
- Algorithm: a method to solve a problem, described as a sequence of steps to be performed in a specific logical order.
- Several algorithms for solving the same problem may exist, based on very different ideas and differing in performance (efficiency, cost, and speed).
5. Algorithm Design: Requirements
- Requirements to satisfy:
- unambiguousness
- generality
- correctness
- finiteness
6. Satisfied Requirements
- unambiguousness
- an unambiguous algorithm is easier to understand and to program, so it contains fewer bugs
- sometimes simpler algorithms are more efficient than more complicated ones
7. Satisfied Requirements
- generality
- it is easier to design an algorithm in more general terms
- it should handle the range of inputs that is natural for the problem
8. Satisfied Requirements
- correctness must be proved by
- 1) mathematical induction
- 2) testing with sample input data that could expose a failure or a wrong answer
9. Satisfied Requirements
- finiteness must be concerned with
- 1) execution in a finite number of steps
- 2) termination in finite time
10. Algorithm Design Strategy
- Computational Device
- Solving Decision
- Data Structure
- Efficiency
11. Algorithm Design Strategy
- Computational Device
- von Neumann machine
- sequential algorithm
- multiprocessor
- parallel algorithm
- network algorithm
- distributed algorithm
12. Algorithm Design Strategy
- Solving Decision
- choose between solving the problem with an approximation or an exact algorithm
- approximation algorithm examples:
- square roots, integrals
- shortest path
13. Algorithm Design Strategy
- Solving Decision (cont.)
- choose between solving the problem with a non-recursive or a recursive algorithm
- recursion is easy to program, but uses a large number of function calls, which hurts execution efficiency
14. Algorithm Design Strategy
- Data Structure
- to solve problems more easily, we need an appropriate data structure:
- student id: int or string?
- 10x5 matrix: a 2D array or 50 variables?
- graph or tree: array or linked list?
15. Algorithm Design Strategy
- Efficiency
- time: how fast the algorithm runs
- space: how much extra memory the algorithm needs
- worst / best / average case
- e.g., sequential search: n / 1 / n/2
16. Algorithm Design Strategy
- Efficiency (cont.)
- order of growth for input size:
- when the input size is large, how does the running time grow?
- order of growth: O (big oh)
- input size: n
17. Algorithm Design Strategy
- Efficiency (cont.)
- O(n^2): n = 10 -> running time 100; n = 100 -> running time 10,000
- O(2^n): n = 10 -> running time 1,024
- O(log2 n): n = 10 -> running time 3.3; n = 100 -> running time 6.6
18. Algorithm Design Techniques
- to provide guidance for designing algorithms for new problems
- to make it possible to classify algorithms according to their design idea
19. Algorithm Design Techniques
- no single general technique can solve all problems
- e.g., the binary search algorithm cannot be used on unsorted data
20. Algorithm Design Techniques
- An algorithm design technique (or strategy, or paradigm) is a general approach to solving problems algorithmically that is applicable to a variety of problems from different areas of computing.
21. Algorithm Design Techniques
- Greedy Method
- Divide and Conquer
- Decrease and Conquer / Prune-and-Search
- Transform and Conquer
22. Algorithm Design Techniques
- Dynamic Programming
- Randomized Algorithms
- Backtracking Algorithms
24. Greedy Method
- take what you can get now
- make a decision that appears to be good (close to the optimal solution)
- proceed by searching a sequence of choices iteratively, deciding on the seemingly best option at each step
25. Greedy Method
- problems solved with the greedy method:
- Coin Changing
- Fractional Knapsack
- Bin Packing
- Task Scheduling
26. Greedy Method
- problems solved with the greedy method:
- Prim's (graph)
- Kruskal's (graph)
- Dijkstra's (graph)
- Huffman Code (tree)
27. Greedy Method
- Coin Changing
- Suppose we have coins of 10 baht, 5 baht, and 1 baht, and want to make change for 89 baht. The answer is 8 ten-baht coins, 1 five-baht coin, and 4 one-baht coins.
- Greedy steps: for 89 baht, first take as many 10-baht coins as possible (8 coins, leaving 9 baht); from the remaining 9 baht take one 5-baht coin (leaving 4 baht); cover the final 4 baht with 4 one-baht coins.
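The greedy coin-changing steps above can be sketched in Python (the function name is mine, not from the slides):

```python
def coin_change_greedy(amount, denominations):
    """Greedy coin changing: repeatedly take as many of the
    largest remaining denomination as still fits."""
    counts = {}
    for coin in sorted(denominations, reverse=True):
        counts[coin], amount = divmod(amount, coin)
    return counts

# 89 baht with coins of 10, 5, and 1 baht
print(coin_change_greedy(89, [10, 5, 1]))  # {10: 8, 5: 1, 1: 4}
```

Note that this greedy rule is optimal for these denominations, but not for every coin system.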
28. Greedy Method
- Fractional Knapsack
- given n items, where item i can be taken in any fraction xi, has benefit bi, and weighs wi
- goal: choose items (or fractions of items) that maximize the total benefit within knapsack capacity W
- greedy rule: take items in decreasing order of value index vi = bi/wi until the capacity W is filled
29. Greedy Method
- example with 4 items:
- item 1: b1 = 10 , w1 = 0.6 (v1 = 16.7)
- item 2: b2 = 7 , w2 = 0.4 (v2 = 17.5)
- item 3: b3 = 5 , w3 = 0.5 (v3 = 10)
- item 4: b4 = 3 , w4 = 0.2 (v4 = 15)
- greedy order by value index: item 2, then item 1, then item 4, then item 3
30. Greedy Method
- Bin Packing
- given N items of sizes s1, s2, ..., sN, where 0 < si < 1
- find a solution that packs these items into the fewest number of bins (each of capacity 1)
- 2 algorithm versions:
- on-line: an item must be placed in a bin before the next item is read
- off-line: the whole item list is read before packing
31. Optimal Bin Packing Solution
- given an item list with sizes 0.2, 0.5, 0.4, 0.7, 0.1, 0.3, 0.8
- Bin 1: 0.2, 0.8 ; Bin 2: 0.7, 0.3 ; Bin 3: 0.5, 0.4, 0.1 (3 bins, each exactly full)
32. Bin Packing Strategy
- Next fit: fill items into the current bin until the next item cannot fit, then open a new bin (never look back)
- items 0.2, 0.5, 0.4, 0.7, 0.1, 0.3, 0.8 give 5 bins:
- Bin 1: 0.2, 0.5 ; Bin 2: 0.4 ; Bin 3: 0.7, 0.1 ; Bin 4: 0.3 ; Bin 5: 0.8
33. Bin Packing Strategy
- First fit: scan the existing bins in order and place the item in the first bin that can hold it; open a new bin only when no bin fits
- items 0.2, 0.5, 0.4, 0.7, 0.1, 0.3, 0.8 give 4 bins:
- Bin 1: 0.2, 0.5, 0.1 ; Bin 2: 0.4, 0.3 ; Bin 3: 0.7 ; Bin 4: 0.8
34. Bin Packing Strategy
- Best fit: place the new item in the bin that would be left with the smallest remaining space
- items 0.2, 0.5, 0.4, 0.7, 0.1, 0.3, 0.8 give 4 bins:
- Bin 1: 0.2, 0.5, 0.1 ; Bin 2: 0.4 ; Bin 3: 0.7, 0.3 ; Bin 4: 0.8
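The first-fit strategy can be sketched in a few lines of Python (function name mine; bin capacity 1, matching the slides):

```python
def first_fit(sizes, capacity=1.0):
    """First fit: put each item into the first existing bin with
    enough room, opening a new bin only when no bin fits."""
    bins = []                     # each bin is a list of item sizes
    for s in sizes:
        for b in bins:
            if sum(b) + s <= capacity:
                b.append(s)
                break
        else:                     # no existing bin could hold s
            bins.append([s])
    return bins

bins = first_fit([0.2, 0.5, 0.4, 0.7, 0.1, 0.3, 0.8])
print(len(bins), bins)            # 4 bins, as on the slide
```

Next fit and best fit differ only in which candidate bin is tried: next fit checks just the last bin, best fit picks the fullest bin that still has room.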
35. Class Exercise
- Pack items whose sizes are 1, 1/2, or 1/4 into the fewest bins.
- Example: 5 items of sizes 1, 1/2, 1/4, 1/2, 1/4 can be packed into 3 bins.
- Compare the results of the next-fit, first-fit, and best-fit strategies.
36. Greedy Method
- Task Scheduling
- given n tasks, each with a (start, finish) time, assign all tasks to as few machines as possible so that no two tasks on the same machine overlap
- example: tasks (1,3) , (1,4) , (2,5) , (3,7) , (4,7) , (6,9) , (7,8) can be scheduled on 3 machines:
- machine 1: (1,3) , (3,7) , (7,8) ; machine 2: (1,4) , (4,7) ; machine 3: (2,5) , (6,9)
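One way to realize this greedy schedule is interval partitioning: sort tasks by start time and reuse whichever machine frees up earliest. A Python sketch (names mine, not from the slides):

```python
import heapq

def schedule_tasks(tasks):
    """Greedy interval partitioning: sort tasks by start time and
    assign each to the machine that becomes free earliest,
    opening a new machine only when all are busy."""
    machines = []   # machines[i] = list of (start, finish) tasks on machine i
    free = []       # min-heap of (finish_time, machine_index)
    for start, finish in sorted(tasks):
        if free and free[0][0] <= start:          # some machine is free in time
            _, i = heapq.heappop(free)
            machines[i].append((start, finish))
        else:                                     # need a new machine
            i = len(machines)
            machines.append([(start, finish)])
        heapq.heappush(free, (finish, i))
    return machines

tasks = [(1, 3), (1, 4), (2, 5), (3, 7), (4, 7), (6, 9), (7, 8)]
for i, m in enumerate(schedule_tasks(tasks), 1):
    print(f"machine {i}: {m}")
```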
37. Greedy Method
- Job Scheduling (Uniprocessor)
- jobs: j1 = 15 , j2 = 8 , j3 = 3 , j4 = 10 (time units)
- First-come-first-serve (j1, j2, j3, j4): completion times 15, 23, 26, 36; avg. completion time 25, avg. waiting time 16
- Shortest job first (j3, j2, j4, j1): completion times 3, 11, 21, 36; avg. completion time 17.75, avg. waiting time 8.75
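The averages above can be verified with a small sketch (function name mine):

```python
def schedule_stats(times):
    """Average completion time and average waiting time when jobs
    run back-to-back in the given order on one processor."""
    t, completion, waiting = 0, [], []
    for job in times:
        waiting.append(t)        # job waits until the processor is free
        t += job
        completion.append(t)     # job finishes at the running total
    n = len(times)
    return sum(completion) / n, sum(waiting) / n

fcfs = [15, 8, 3, 10]
sjf = sorted(fcfs)               # shortest job first
print(schedule_stats(fcfs))      # (25.0, 16.0)
print(schedule_stats(sjf))       # (17.75, 8.75)
```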
38. Greedy Method
- Job Scheduling (Multiprocessor), FCFS on 3 processors
- jobs: j1 = 3 , j2 = 5 , j3 = 6 , j4 = 10 , j5 = 11 , j6 = 14 , j7 = 15 , j8 = 18 , j9 = 20
- processor 1: j1, j4, j7 (completions 3, 13, 28)
- processor 2: j2, j5, j8 (completions 5, 16, 34)
- processor 3: j3, j6, j9 (completions 6, 20, 40)
39. Greedy Method
- Job Scheduling (Multiprocessor), Optimal 1
- jobs: j1 = 3 , j2 = 5 , j3 = 6 , j4 = 10 , j5 = 11 , j6 = 14 , j7 = 15 , j8 = 18 , j9 = 20
- processor 1: j1, j6, j7 (completions 3, 17, 32)
- processor 2: j2, j5, j8 (completions 5, 16, 34)
- processor 3: j3, j4, j9 (completions 6, 16, 36)
40. Greedy Method
- Job Scheduling (Multiprocessor), Optimal 2: minimize the final completion time
- jobs: j1 = 3 , j2 = 5 , j3 = 6 , j4 = 10 , j5 = 11 , j6 = 14 , j7 = 15 , j8 = 18 , j9 = 20
- processor 1: j2, j5, j8 (completions 5, 16, 34)
- processor 2: j6, j9 (completions 14, 34)
- processor 3: j1, j3, j4, j7 (completions 3, 9, 19, 34)
41. Divide and Conquer
- divide: break a given problem into subproblems
- recur: solve each subproblem recursively
- conquer: derive the final solution from the subproblem solutions
42. Divide and Conquer
- a problem of size n is divided into subproblems 1..m, each of size n/m; the solutions to the subproblems are then combined into a solution to the original problem
43. Divide and Conquer
- Merge sort
- Quick sort
- Binary Tree Traversal
- Closest-Pair and Convex-Hull
- Selection problem
44. Divide and Conquer
- Factorial
- Fibonacci
- Binary search
- Strassen's Matrix Multiplication
- Big Integer Multiplication
45. Divide and Conquer
- Factorial
- n! = n * (n-1)!
- (n-1)! = (n-1) * (n-2)!
- ...
- 1! = 1 * 0!
- 0! = 1
- example: 4! = ?
- 4! = 4 * 3! , 3! = 3 * 2! , 2! = 2 * 1! , 1! = 1 * 0! , 0! = 1
- the recursion can be described with a tree structure
46. Divide and Conquer
- Fibonacci numbers: 1, 1, 2, 3, 5, 8, 13, ...
- fibo(n) = fibo(n-1) + fibo(n-2)
- fibo(n-1) = fibo(n-2) + fibo(n-3)
- fibo(n-2) = fibo(n-3) + fibo(n-4)
- ...
- fibo(3) = fibo(2) + fibo(1)
- fibo(2) = 1
- fibo(1) = 1
47. Divide and Conquer
- binary search for 37 in 12 15 18 23 26 37 39 41 43 48:
- step 1: middle element 26 < 37, so search the right half 37 39 41 43 48
- step 2: middle element 41 > 37, so search the left part 37 39
- step 3: 37 is found
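The search above can be written as the classic iterative binary search:

```python
def binary_search(data, target):
    """Halve the search range each step; data must be sorted."""
    lo, hi = 0, len(data) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if data[mid] == target:
            return mid            # index of the target
        if data[mid] < target:
            lo = mid + 1          # discard the left half
        else:
            hi = mid - 1          # discard the right half
    return -1                     # not found

data = [12, 15, 18, 23, 26, 37, 39, 41, 43, 48]
print(binary_search(data, 37))    # 5
```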
48. Divide and Conquer
- Strassen's Matrix Multiplication
- Z = X x Y , matrices of size n x n
- partition each matrix into four n/2 x n/2 blocks: X = [A B; C D] , Y = [E F; G H] , Z = [I J; K L]
- I = AxE + BxG , J = AxF + BxH
- K = CxE + DxG , L = CxF + DxH
- Strassen's method computes these blocks with 7 block multiplications instead of 8
49. Divide and Conquer
- Big Integer Multiplication
- multiply two N-digit numbers X and Y, split into halves XL, XR and YL, YR
- XY = XL*YL*10^N + (XL*YR + XR*YL)*10^(N/2) + XR*YR
- XL*YR + XR*YL = (XL-XR)(YR-YL) + XL*YL + XR*YR
- this requires 2 subtractions and only 3 multiplications:
- D1 = XL-XR , D2 = YR-YL
- XL*YL , XR*YR , D1*D2
- D3 = D1*D2 + XL*YL + XR*YR
50. Divide and Conquer
- Big Integer Multiplication
- X = 61,438,521 , Y = 94,736,407
- XL = 6143 , XR = 8521
- YL = 9473 , YR = 6407
- D1 = XL-XR = -2378 , D2 = YR-YL = -3066
- XL*YL = 58,192,639 , XR*YR = 54,594,047
- D1*D2 = 7,290,948
- D3 = D1*D2 + XL*YL + XR*YR = 120,077,634
- XY = XL*YL*10^8 + D3*10^4 + XR*YR
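This 3-multiplication scheme is Karatsuba's algorithm. A recursive Python sketch (kept simple by assuming the digit count n is a power of 2):

```python
def karatsuba(x, y, n):
    """Multiply two n-digit numbers with 3 recursive multiplications
    instead of 4, following XY = P1*10^n + D3*10^(n/2) + P2."""
    if n <= 4:                          # small enough: multiply directly
        return x * y
    half = n // 2
    xl, xr = divmod(x, 10 ** half)      # split into high/low halves
    yl, yr = divmod(y, 10 ** half)
    p1 = karatsuba(xl, yl, half)                        # XL*YL
    p2 = karatsuba(xr, yr, half)                        # XR*YR
    d3 = karatsuba(xl - xr, yr - yl, half) + p1 + p2    # XL*YR + XR*YL
    return p1 * 10 ** n + d3 * 10 ** half + p2

# the slide's example: 8-digit operands
print(karatsuba(61438521, 94736407, 8))
```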
52. Decrease and Conquer
- based on exploiting the relationship between a solution to a given instance of a problem and a solution to a smaller instance of the same problem
- it can be exploited either top down (recursively) or bottom up (without recursion)
53. Decrease and Conquer
- 3 major variations:
- decrease-by-a-constant
- decrease-by-a-constant-factor
- variable-size-decrease
- (from Anany Levitin)
54. Decrease and Conquer
- decrease-by-a-constant: the size of the instance is reduced by the same constant on each iteration
- decrease-by-a-constant-factor: the size of the instance is reduced by the same constant factor on each iteration
55. Decrease-by-a-Constant
- Insertion sort
- Depth-first search and breadth-first search (graph)
- Topological sorting (graph)
- Generating combinatorial objects
56. Decrease-by-a-Constant
- Insertion sort
- uses decrease-by-one
- the array is split into a sorted side and an unsorted side
- the size of the unsorted side is reduced by 1 on each loop
57. Insertion Sort
- 43 22 80 17 36 16 29 (initial)
- 22 43 80 17 36 16 29 (insert 22)
- 22 43 80 17 36 16 29 (80 already in place)
- 17 22 43 80 36 16 29 (insert 17)
- 17 22 36 43 80 16 29 (insert 36)
- 16 17 22 36 43 80 29 (insert 16)
- 16 17 22 29 36 43 80 (insert 29)
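The passes above correspond to the standard in-place insertion sort:

```python
def insertion_sort(a):
    """Decrease-by-one: grow the sorted prefix by one element per pass,
    inserting the next item into its place among the sorted ones."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]       # shift larger items one slot right
            j -= 1
        a[j + 1] = key            # drop the item into the gap
    return a

print(insertion_sort([43, 22, 80, 17, 36, 16, 29]))  # [16, 17, 22, 29, 36, 43, 80]
```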
58. Decrease-by-a-Constant-Factor
- Josephus Problem
- Fake-Coin Problem
- Multiplication à la Russe
59. Decrease-by-a-Constant-Factor
- Josephus problem
- determine the survivor when every second person (standing in a circle) is eliminated, until only one survivor is left
- e.g., J(6) = 5 , J(7) = 7 , J(9) = 3
- uses decrease-by-half (factor 2)
60. Josephus Problem
- uses decrease-by-half (factor 2); consider J(n):
- if n is even (n = 2k): J(2k) = 2 J(k) - 1
- if n is odd (n = 2k+1): J(2k+1) = 2 J(k) + 1
- J(6) = 2 J(3) - 1 , J(3) = 2 J(1) + 1 , J(1) = 1
- so J(3) = 3 and J(6) = 5
61. Josephus Problem
- the answer can also be obtained by rotating the binary representation of n left by 1 bit:
- J(6) = J(110_2) -> 101_2 = 5
- J(7) = J(111_2) -> 111_2 = 7
- J(9) = J(1001_2) -> 0011_2 = 3
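The decrease-by-half recurrence translates directly into code:

```python
def josephus(n):
    """Survivor position when every 2nd person is eliminated:
    J(1) = 1, J(2k) = 2J(k) - 1, J(2k+1) = 2J(k) + 1."""
    if n == 1:
        return 1
    if n % 2 == 0:                    # n = 2k
        return 2 * josephus(n // 2) - 1
    return 2 * josephus(n // 2) + 1   # n = 2k+1

print([josephus(n) for n in (6, 7, 9)])  # [5, 7, 3]
```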
62. Josephus Problem
- the decrease-by-3 variant eliminates every 3rd person:
- J(6) = 1
- J(7) = 4
- J(9) = 1
63. Decrease and Conquer
- 3 major variations:
- decrease-by-a-constant
- decrease-by-a-constant-factor
- variable-size-decrease: the size-reduction pattern varies from one iteration to another
64. Variable-Size-Decrease
- Euclid's algorithm
- Computing a median and the selection problem
- Interpolation search
- Binary search tree
65. Decrease and Conquer
- Euclid's algorithm
- finds gcd(a,b) recursively:
- gcd(a,b) = a , if b = 0
- gcd(a,b) = b , if a = 0
- gcd(a,b) = gcd(b, a mod b) , otherwise
- e.g., gcd(124,40) = gcd(40,4) = gcd(4,0) = 4
66. Transform and Conquer
- transform a problem instance into a simpler instance, a changed representation, or an instance of another problem, then solve that and obtain the solution to the original
67. Transform and Conquer
- Horner's Rule
- Presorting
- Gaussian Elimination
- Balanced search tree
- Heap sort (tree)
- Problem Reduction
68. Transform and Conquer
- Horner's rule
- p(x) = a_n x^n + a_{n-1} x^(n-1) + ... + a_1 x + a_0
- uses the representation-change technique:
- p(x) = (...(a_n x + a_{n-1})x + ...)x + a_0
- e.g., p(x) = 2x^4 + x^3 + 3x^2 + x + 5
- = x(x(x(2x + 1) + 3) + 1) + 5
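Horner's rule needs just one multiplication and one addition per coefficient. A sketch, evaluated at x = 2 as an example point of my choosing:

```python
def horner(coeffs, x):
    """Evaluate a polynomial with Horner's rule.
    coeffs are listed from the highest degree down to a_0."""
    result = 0
    for c in coeffs:
        result = result * x + c   # (( a_n x + a_{n-1} ) x + ...) x + a_0
    return result

# p(x) = 2x^4 + x^3 + 3x^2 + x + 5 at x = 2
print(horner([2, 1, 3, 1, 5], 2))  # 59
```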
69. Transform and Conquer
- Problem Reduction
- reduce the problem to another problem for which a solving algorithm is known
- Problem 1 -> Problem 2 -> Solution
- examples: counting paths in a graph, linear programming, the least common multiple (lcm)
70. Transform and Conquer
- the least common multiple (lcm)
- 24 = 2 * 2 * 2 * 3
- 60 = 2 * 2 * 3 * 5
- lcm(24,60) = 2 * 2 * 3 * 2 * 5 = 120
- lcm(11,5) = 55
71. Transform and Conquer
- the least common multiple (lcm)
- reduce to gcd: lcm(m,n) = m * n / gcd(m,n)
- lcm(24,60) = 120 , since gcd(24,60) = 12
- lcm(11,5) = 55 , since gcd(11,5) = 1
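The reduction of lcm to Euclid's gcd fits in a few lines:

```python
def gcd(a, b):
    """Euclid's algorithm: gcd(a,b) = gcd(b, a mod b) until b = 0."""
    while b:
        a, b = b, a % b
    return a

def lcm(m, n):
    """Problem reduction: lcm(m,n) = m*n / gcd(m,n)."""
    return m * n // gcd(m, n)

print(gcd(124, 40))  # 4
print(lcm(24, 60))   # 120
print(lcm(11, 5))    # 55
```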
72. Dynamic Programming
- characterize subproblems by a small set of integer indices, so that an optimal solution to a subproblem can be defined by combining solutions to even smaller subproblems
73. Dynamic Programming
- Binomial Coefficient
- Warshall's and Floyd's algorithms (directed graph)
- Optimal Binary Search Tree (tree)
- Ordering Matrix Multiplication, or Matrix Chain-Product (ABCD -> A(BC)D , (AB)(CD))
- All-Pairs Shortest Path (directed graph)
74. Dynamic Programming
- solve each of the smaller subproblems only once, recording the results in a table from which a solution to the original problem can be obtained
- use a table instead of plain recursion:
- Factorial
- Fibonacci numbers
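A table-backed Fibonacci makes the contrast with the divide-and-conquer version concrete: each value is computed once and then looked up. Here the table is a memo cache:

```python
from functools import lru_cache

@lru_cache(maxsize=None)       # the cache plays the role of the DP table
def fib(n):
    """Fibonacci with memoization: each fib(n) is computed only once."""
    if n <= 2:
        return 1
    return fib(n - 1) + fib(n - 2)

print([fib(n) for n in range(1, 9)])  # [1, 1, 2, 3, 5, 8, 13, 21]
```

The plain recursive version recomputes the same subproblems exponentially many times; the table version is linear.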
75. Dynamic Programming
- 0-1 Knapsack problem
- 0-1: each item is either rejected or accepted whole
- given n items with weights w1, w2, ..., wn and values v1, v2, ..., vn, and a knapsack of capacity W
- find the subset of items with maximum total value whose total weight does not exceed W
76. 0-1 Knapsack Problem
- fill in a table by applying the formula:
- B[k,w] = B[k-1,w] , if wk > w
- B[k,w] = max( B[k-1,w] , B[k-1,w-wk] + bk ) , otherwise
- e.g., let W = 5 and
- data (item, weight, value):
- (1,2,12) (2,1,10) (3,3,20) (4,2,15)
77. 0-1 Knapsack Problem
- table rows are items, columns are capacities j = 1..5 (W = 5):
- w1=2, v1=12 :  0 12 12 12 12
- w2=1, v2=10 : 10 12 22 22 22
- w3=3, v3=20 : 10 12 22 30 32
- w4=2, v4=15 : 10 15 25 30 37
- select items 1, 2, 4 (total value 37)
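Filling the table row by row from the recurrence gives a direct implementation (function name mine):

```python
def knapsack(items, W):
    """0-1 knapsack by dynamic programming.
    items: list of (weight, value). B[k][w] is the best value using
    the first k items within capacity w."""
    n = len(items)
    B = [[0] * (W + 1) for _ in range(n + 1)]
    for k, (wk, bk) in enumerate(items, 1):
        for w in range(W + 1):
            if wk > w:                  # item k cannot fit at all
                B[k][w] = B[k - 1][w]
            else:                       # skip item k, or take it
                B[k][w] = max(B[k - 1][w], B[k - 1][w - wk] + bk)
    return B[n][W]

# (weight, value) pairs from the slide, capacity W = 5
print(knapsack([(2, 12), (1, 10), (3, 20), (2, 15)], 5))  # 37
```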
78. Dynamic Programming
- Ordering Matrix Multiplication, or Matrix Chain-Product
- find the parenthesization that gives the minimum number of multiplications
- e.g., ABCD , A(BC)D , (AB)(CD)
79. No ego - help each other, then everyone will succeed
80. Randomized Algorithms
- a random number is used to make a decision in some situations
- e.g., deciding whether to give a quiz by tossing a coin
- a good randomized algorithm has no bad inputs; its behavior does not depend on the particular input, e.g., in data testing
81. Randomized Algorithms
- Random Number Generator
- generating true randomness is impossible on a computer; the numbers, called pseudorandom numbers, depend on the algorithm
- e.g., linear congruential generator:
- x_{i+1} = A * x_i mod M , given a seed value x_0
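A minimal linear congruential generator; the slide leaves A and M open, so the classic Park-Miller constants (A = 16807, M = 2^31 - 1) are used here as an illustrative assumption:

```python
def lcg(seed, a=16807, m=2**31 - 1):
    """Linear congruential generator x_{i+1} = A * x_i mod M.
    A and M are the Park-Miller 'minimal standard' constants,
    chosen here only as an example."""
    x = seed
    while True:
        x = (a * x) % m
        yield x

gen = lcg(1)
print([next(gen) for _ in range(3)])  # [16807, 282475249, 1622650073]
```

The quality of the sequence depends entirely on the choice of A, M, and the seed x_0.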
82. Randomized Algorithms
- Skip Lists (for searching and insertion)
- e.g., a random number generator determines which nodes are linked at each level, so a search traverses nodes within the expected time
- Primality Testing
- e.g., determining whether a large N-digit number is prime by factoring would require N/2-digit primes; instead, a random number generator is used to test candidates
83. Backtracking Algorithms
- there are many possibilities to try; if one does not succeed, step back and try another way
- eliminating a large group of possibilities in one step is known as pruning
- e.g., when arranging furniture in a new house, never place the sofa, bed, or closet in the kitchen
84. Backtracking Algorithms
- n-Queens
- Hamiltonian Circuit
- Subset-Sum
- Goal Seeking
- Turnpike Reconstruction
85. Backtracking Algorithms
- Games
- chess, n-Queens
- checkers, Tic-Tac-Toe
- Minimax strategy
86. Practice Problems
- When confronted with a problem, it is worthwhile to see which algorithm design method applies, together with a data structure that will lead to an efficient solution.
87. Practice Problems
- Knapsack problem
- Bin packing
- Task scheduling
- Strassen's matrix multiplication
- Big integer multiplication
- Josephus problem
- a^n
- (techniques: greedy, divide and conquer, decrease and conquer)
88. Practice Problems
- Non-recursive: a^n , Fibonacci
- Recursive exponentiation: a^n , Horner's Rule
- LCM and GCD: Euclid's algorithm
- 0-1 Knapsack Problem
- (techniques: dynamic programming, transform and conquer)
90. Class Exercises
- Fractional Knapsack, W = 5
- item 1: b1 = 10 , w1 = 0.5
- item 2: b2 = 5 , w2 = 0.4
- item 3: b3 = 7 , w3 = 0.5
- item 4: b4 = 15 , w4 = 1.8
- Bin Packing
- 0.25 , 0.5 , 1.0 , 0.75 , 0.125 , 0.25 , 0.5
91. Class Exercises
- Task Scheduling: activities (start, finish)
- a1 = (1,4) , a2 = (2,5) , a3 = (1,5) , a4 = (5,8)
- a5 = (4,6) , a6 = (6,10) , a7 = (7,9)
- Big Integer Multiplication
- X = 4,123,450,732
- Y = 8,159,324,570
- a^n for n = 1321 (decrease-by-2)
- Josephus Problem: J(82) with decrease-by-3
92. Class Exercises
- Euclidean algorithm
- gcd(2039,113) , gcd(1548,204)
- Horner's Rule
- p(x) = 7x^9 + 4x^6 - 3x^5 - x^4 + 5x^2 + 9
- Least Common Multiple
- lcm(2039,113) , lcm(1548,204)
- 0-1 Knapsack Problem
- W = 6 , (1,2,12) (2,1,10) (3,4,20) (4,3,15) (5,2,14)