CS 3343: Analysis of Algorithms - PowerPoint PPT Presentation

About This Presentation
Title:

CS 3343: Analysis of Algorithms

Description:

CS 3343: Analysis of Algorithms Review for Exam 2 Exam 2 Closed book exam One cheat sheet allowed (limit to a single page of letter-size paper, double-sided) Tuesday ... – PowerPoint PPT presentation

Slides: 145
Provided by: Jian138
Learn more at: http://www.cs.utsa.edu

Transcript and Presenter's Notes

Title: CS 3343: Analysis of Algorithms


1
CS 3343 Analysis of Algorithms
  • Review for Exam 2

2
Exam 2
  • Closed book exam
  • One cheat sheet allowed (limit to a single page
    of letter-size paper, double-sided)
  • Tuesday, April 15, 10:00–11:25 pm
  • Basic calculator (no graphing) is allowed but not
    necessary

3
Materials covered
  • Quick Sort
  • Heap sort, priority queue
  • Linear time sorting algorithms
  • Order statistics
  • Dynamic programming
  • Greedy algorithm
  • Questions will be similar to homework / quizzes
  • Be familiar with the algorithm procedures
  • Some analysis of time/space complexity
  • One or two problems on algorithm design

4
Quick sort
  • Quicksort an n-element array
  • Divide: Partition the array into two subarrays
    around a pivot x such that elements in the lower
    subarray ≤ x ≤ elements in the upper subarray.
  • Conquer: Recursively sort the two subarrays.
  • Combine: Trivial.

Key: Linear-time partitioning subroutine.
5
Pseudocode for quicksort
QUICKSORT(A, p, r)
  if p < r
    then q ← PARTITION(A, p, r)
         QUICKSORT(A, p, q-1)
         QUICKSORT(A, q+1, r)
Initial call: QUICKSORT(A, 1, n)
6
Partition Code
  • Partition(A, p, r)
  •   x = A[p]  // pivot is the first element
  •   i = p
  •   j = r + 1
  •   while (TRUE)
  •     repeat
  •       i++
  •     until A[i] > x or i > j
  •     repeat
  •       j--
  •     until A[j] < x or j < i
  •     if (i < j)
  •       Swap(A[i], A[j])
  •     else
  •       break
  •   swap(A[p], A[j])
  •   return j

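The partition procedure above can be turned into a short runnable sketch (0-based Python indexing instead of the slides' 1-based arrays; first element as pivot):

```python
def partition(A, p, r):
    """Partition A[p..r] around the pivot A[p]; return the pivot's final index."""
    x = A[p]                          # pivot is the first element
    i, j = p, r + 1
    while True:
        i += 1
        while i <= r and A[i] < x:    # scan i right until A[i] >= x
            i += 1
        j -= 1
        while A[j] > x:               # scan j left until A[j] <= x
            j -= 1
        if i < j:
            A[i], A[j] = A[j], A[i]
        else:
            break
    A[p], A[j] = A[j], A[p]           # place the pivot between the subarrays
    return j

def quicksort(A, p=0, r=None):
    if r is None:
        r = len(A) - 1
    if p < r:
        q = partition(A, p, r)
        quicksort(A, p, q - 1)        # conquer: sort the lower subarray
        quicksort(A, q + 1, r)        # conquer: sort the upper subarray
```

On the slides' example, `partition([6, 10, 5, 8, 13, 3, 2, 11], 0, 7)` leaves the pivot 6 at index 3, exactly as in the partition example that follows.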
7
Partition example (pivot x = 6):

p                           r
[6, 10, 5, 8, 13, 3, 2, 11]    scan: i right, j left
[6, 2, 5, 8, 13, 3, 10, 11]    swap A[i], A[j]
[6, 2, 5, 8, 13, 3, 10, 11]    scan
[6, 2, 5, 3, 13, 8, 10, 11]    swap A[i], A[j]
[6, 2, 5, 3, 13, 8, 10, 11]    scan: i and j cross
[3, 2, 5, 6, 13, 8, 10, 11]    final swap A[p], A[j]; pivot at its final position q
8
Quick sort example
(figure: quicksort applied recursively to the array [6, 10, 5, 8, 11, 3, 2, 13])
9
Quicksort Runtimes
  • Best case runtime: Tbest(n) ∈ O(n log n)
  • Worst case runtime: Tworst(n) ∈ O(n²)
  • Average case runtime: Tavg(n) ∈ O(n log n)
  • Expected runtime of randomized quicksort is O(n
    log n)

10
Randomized Partition
  • Randomly choose an element as pivot
  • Every time we need to do a partition, throw a die to
    decide which element to use as the pivot
  • Each element has 1/n probability to be selected

Rand-Partition(A, p, r)
  d = random()                        // draw a random number between 0 and 1
  index = p + floor((r - p + 1) * d)  // p ≤ index ≤ r
  swap(A[p], A[index])
  Partition(A, p, r)                  // now use A[p] as pivot
11
Running time of randomized quicksort
         T(0) + T(n-1) + dn    if 0 : n-1 split,
T(n) =   T(1) + T(n-2) + dn    if 1 : n-2 split,
         ...
         T(n-1) + T(0) + dn    if n-1 : 0 split
  • The expected running time is an average of all
    cases

Expectation
12
  • Fact
  • Need to prove: T(n) ≤ c n log(n)
  • Assumption: T(k) ≤ c k log(k) for 0 < k ≤ n-1
  • Prove by induction

If c ≥ 4
13
Heaps
  • A heap can be seen as a complete binary tree

Perfect binary tree
(figure: the max-heap 16, 14, 10, 8, 7, 9, 3, 2, 4, 1 drawn as a complete binary tree)
14
Referencing Heap Elements
  • So:
  • Parent(i)
  •   return ⌊i/2⌋
  • Left(i)
  •   return 2i
  • Right(i)
  •   return 2i + 1

15
Heap Operations Heapify()
  • Heapify(A, i)
  •   // precondition: subtrees rooted at l and r are
      heaps
  •   l = Left(i); r = Right(i)
  •   if (l ≤ heap_size(A) and A[l] > A[i])
  •     largest = l
  •   else
  •     largest = i
  •   if (r ≤ heap_size(A) and A[r] > A[largest])
  •     largest = r
  •   if (largest != i)
  •     Swap(A, i, largest)
  •     Heapify(A, largest)
  •   // postcondition: subtree rooted at i is a heap

Among A[l], A[i], A[r], which one is the largest?
If there is a violation, fix it.
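A runnable sketch of Heapify (max-heapify; Python lists are 0-based, so the child indices become 2i+1 and 2i+2):

```python
def heapify(A, i, heap_size):
    """Sift A[i] down until the subtree rooted at i is a max-heap."""
    l, r = 2 * i + 1, 2 * i + 2           # children of i (0-based)
    largest = i
    if l < heap_size and A[l] > A[i]:
        largest = l
    if r < heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:                      # violation: fix it and recurse
        A[i], A[largest] = A[largest], A[i]
        heapify(A, largest, heap_size)
```

On the example that follows, `heapify([16, 4, 10, 14, 7, 9, 3, 2, 8, 1], 1, 10)` sifts the 4 down past 14 and 8.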
16
Heapify() Example
(figure: heap with A = 16, 4, 10, 14, 7, 9, 3, 2, 8, 1; node 4 violates the heap property)
17
Heapify() Example
(figure: compare 4 with its children 14 and 7)
18
Heapify() Example
(figure: swap 4 with 14)
19
Heapify() Example
(figure: A = 16, 14, 10, 4, 7, 9, 3, 2, 8, 1; compare 4 with its children 2 and 8)
20
Heapify() Example
(figure: swap 4 with 8)
21
Heapify() Example
(figure: A = 16, 14, 10, 8, 7, 9, 3, 2, 4, 1; node 4 is now a leaf)
22
Heapify() Example
(figure: final heap A = 16, 14, 10, 8, 7, 9, 3, 2, 4, 1)
23
BuildHeap()
  • // given an unsorted array A, make A a heap
  • BuildHeap(A)
  •   heap_size(A) = length(A)
  •   for (i = ⌊length(A)/2⌋ downto 1)
  •     Heapify(A, i)

24
BuildHeap() Example
  • Work through example: A = [4, 1, 3, 2, 16, 9, 10, 14, 8, 7]

(figures, slides 24-36: Heapify is called on the internal nodes i = 5, 4, 3, 2, 1
in turn, transforming A = 4, 1, 3, 2, 16, 9, 10, 14, 8, 7 step by step
into the max-heap A = 16, 14, 10, 8, 7, 9, 3, 2, 4, 1)
37
Analyzing BuildHeap() Tight
  • To Heapify() a subtree takes O(h) time where h is
    the height of the subtree
  • h = O(lg m), m = # nodes in the subtree
  • The height of most subtrees is small
  • Fact: an n-element heap has at most ⌈n/2^(h+1)⌉
    nodes of height h
  • CLR 6.3 uses this fact to prove that BuildHeap()
    takes O(n) time

38
Heapsort
  • Heapsort(A)
  •   BuildHeap(A)
  •   for (i = length(A) downto 2)
  •     Swap(A[1], A[i])
  •     heap_size(A) -= 1
  •     Heapify(A, 1)

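Putting BuildHeap and the extraction loop together gives a self-contained runnable sketch (0-based indexing; heapify is repeated here so the block stands alone):

```python
def heapify(A, i, heap_size):
    """Sift A[i] down within A[0..heap_size-1]."""
    l, r = 2 * i + 1, 2 * i + 2
    largest = i
    if l < heap_size and A[l] > A[i]:
        largest = l
    if r < heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        heapify(A, largest, heap_size)

def heapsort(A):
    n = len(A)
    for i in range(n // 2 - 1, -1, -1):   # BuildHeap: last internal node up to root
        heapify(A, i, n)
    for i in range(n - 1, 0, -1):
        A[0], A[i] = A[i], A[0]           # move the current max to its final slot
        heapify(A, 0, i)                  # restore the heap on A[0..i-1]
```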
39
Heapsort Example
  • Work through example: A = [4, 1, 3, 2, 16, 9, 10, 14, 8, 7]

(figure: the unsorted array A = 4, 1, 3, 2, 16, 9, 10, 14, 8, 7 as a tree and as an array)
40
Heapsort Example
  • First build a heap

(figure: the heap A = 16, 14, 10, 8, 7, 9, 3, 2, 4, 1)
41
Heapsort Example
  • Swap last and first

(figure: A = 1, 14, 10, 8, 7, 9, 3, 2, 4, 16 after swapping A[1] and A[10])
42
Heapsort Example
  • Last element sorted

(figure: 16 is now in its final, sorted position)
43
Heapsort Example
  • Restore heap on remaining unsorted elements

(figure: after Heapify, A = 14, 8, 10, 4, 7, 9, 3, 2, 1 with 16 sorted)
44
Heapsort Example
  • Repeat swap new last and first

(figure: A = 1, 8, 10, 4, 7, 9, 3, 2 with 14, 16 sorted)
45
Heapsort Example
  • Restore heap

(figure: after Heapify, A = 10, 8, 9, 4, 7, 1, 3, 2 with 14, 16 sorted)
46
Heapsort Example
  • Repeat

(figure: A = 9, 8, 3, 4, 7, 1, 2 with 10, 14, 16 sorted)
47
Heapsort Example
  • Repeat

(figure: A = 8, 7, 3, 4, 2, 1 with 9, 10, 14, 16 sorted)
48
Heapsort Example
  • Repeat

(figure: fully sorted array A = 1, 2, 3, 4, 7, 8, 9, 10, 14, 16)
49
Implementing Priority Queues
  • HeapMaximum(A)
  •   return A[1]

50
Implementing Priority Queues
  • HeapExtractMax(A)
  •   if (heap_size(A) < 1) error
  •   max = A[1]
  •   A[1] = A[heap_size(A)]
  •   heap_size(A) -= 1
  •   Heapify(A, 1)
  •   return max

51
HeapExtractMax Example
(figure: the heap A = 16, 14, 10, 8, 7, 9, 3, 2, 4, 1)
52
HeapExtractMax Example
  • Swap first and last, then remove last

(figure: A = 1, 14, 10, 8, 7, 9, 3, 2, 4 after moving the last element to the root; 16 removed)
53
HeapExtractMax Example
  • Heapify

(figure: after Heapify, A = 14, 8, 10, 4, 7, 9, 3, 2, 1)
54
Implementing Priority Queues
  • HeapChangeKey(A, i, key)
  •   if (key < A[i]) // decrease key
  •     A[i] = key
  •     Heapify(A, i)              // sift down
  •   else // increase key
  •     A[i] = key
  •     while (i > 1 and A[parent(i)] < A[i])
  •       swap(A[i], A[parent(i)]) // bubble up
  •       i = parent(i)
55
HeapChangeKey Example
  • Increase key

(figure: the heap A = 16, 14, 10, 8, 7, 9, 3, 2, 4, 1)
56
HeapChangeKey Example
  • Increase key

(figure: the key 8 is increased to 15; 15 is larger than its parent 14, so it bubbles up)
57
HeapChangeKey Example
  • Increase key

(figure: after swapping 15 with 14, A = 16, 15, 10, 14, 7, 9, 3, 2, 4, 1)
58
Implementing Priority Queues
  • HeapInsert(A, key)
  •   heap_size(A) += 1
  •   i = heap_size(A)
  •   A[i] = -∞
  •   HeapChangeKey(A, i, key)

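In Python these operations map onto the standard heapq module, which is a min-heap; storing negated keys turns it into the max-priority queue described above. The class name MaxPQ is just for illustration:

```python
import heapq

class MaxPQ:
    """Max-priority queue on top of heapq's min-heap (keys are negated)."""
    def __init__(self, items=()):
        self._h = [-x for x in items]
        heapq.heapify(self._h)            # BuildHeap, O(n)

    def maximum(self):                    # HeapMaximum
        return -self._h[0]

    def extract_max(self):                # HeapExtractMax
        return -heapq.heappop(self._h)

    def insert(self, key):                # HeapInsert
        heapq.heappush(self._h, -key)
```

heappush/heappop do the bubble-up and sift-down internally, so there is no explicit HeapChangeKey here.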
59
HeapInsert Example
  • HeapInsert(A, 17)

(figure: the heap A = 16, 14, 10, 8, 7, 9, 3, 2, 4, 1)
60
HeapInsert Example
  • HeapInsert(A, 17)

(figure: -∞ appended as the new last element; -∞ makes it a valid heap)
61
HeapInsert Example
  • HeapInsert(A, 17)

(figure: the new key is set to 17; now call HeapChangeKey, and 17 bubbles up past 7 and 14)
62
HeapInsert Example
  • HeapInsert(A, 17)

(figure: final heap A = 17, 16, 10, 8, 14, 9, 3, 2, 4, 1, 7)
63
Counting sort
1. Initialize:
   for i ← 1 to k
       do C[i] ← 0
2. Count:
   for j ← 1 to n
       do C[A[j]] ← C[A[j]] + 1        ▹ C[i] = #{key = i}
3. Compute running sum:
   for i ← 2 to k
       do C[i] ← C[i] + C[i-1]         ▹ C[i] = #{key ≤ i}
4. Re-arrange:
   for j ← n downto 1
       do B[C[A[j]]] ← A[j]
          C[A[j]] ← C[A[j]] - 1
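The four loops translate directly to Python (0-based output array; keys assumed to be integers in 1..k):

```python
def counting_sort(A, k):
    """Stable counting sort of A, whose keys are integers in 1..k."""
    C = [0] * (k + 1)            # 1. initialize
    for x in A:                  # 2. count: C[i] = #{key = i}
        C[x] += 1
    for i in range(2, k + 1):    # 3. running sum: C[i] = #{key <= i}
        C[i] += C[i - 1]
    B = [0] * len(A)
    for x in reversed(A):        # 4. re-arrange, back to front for stability
        C[x] -= 1
        B[C[x]] = x
    return B
```

Scanning A from the back in step 4 is what makes the sort stable: equal keys keep their input order.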
64
Counting sort
index:  1  2  3  4  5
A:      4  1  3  4  3
C:      1  0  2  2        (counts of keys 1..4)
C':     1  1  3  5        (after step 3, the running sums)

3. for i ← 2 to k
       do C[i] ← C[i] + C[i-1]
65
Loop 4 re-arrange
index:  1  2  3  4  5
A:      4  1  3  4  3
C:      1  1  3  5
j = 5: A[5] = 3 and C[3] = 3, so B[3] ← 3 and then C[3] ← 2
B:      _  _  3  _  _

4. for j ← n downto 1
       do B[C[A[j]]] ← A[j]
          C[A[j]] ← C[A[j]] - 1
66
Analysis
1. for i ← 1 to k do C[i] ← 0                                       Θ(k)
2. for j ← 1 to n do C[A[j]] ← C[A[j]] + 1                          Θ(n)
3. for i ← 2 to k do C[i] ← C[i] + C[i-1]                           Θ(k)
4. for j ← n downto 1 do B[C[A[j]]] ← A[j]; C[A[j]] ← C[A[j]] - 1   Θ(n)
Total: Θ(n + k)
67
Stable sorting
Counting sort is a stable sort: it preserves the
input order among equal elements.
Why is this important? What other algorithms have
this property?
68
Radix sort
  • Similar to sorting the address books
  • Treat each digit as a key
  • Start from the least significant bit

Least significant
Most significant
(figure: a column of multi-digit numbers sorted pass by pass, least significant digit first)
69
Time complexity
  • Sort each of the d digits by counting sort
  • Total cost: Θ(d (n + k))
  • k = 10
  • Total cost: Θ(dn)
  • Partition the d digits into groups of 3
  • Total cost: Θ((n + 10³) d/3)
  • We work with binaries rather than decimals
  • Partition a binary number into groups of r bits
  • Total cost: Θ((n + 2^r) d/r)
  • Choose r = log n
  • Total cost: Θ(dn / log n)
  • Compare with Θ(n log n)
  • Catch: faster than quicksort only when n is very
    large

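A sketch of LSD radix sort using a stable counting sort on each digit (base 10, non-negative integers with at most d digits):

```python
def radix_sort(A, d, base=10):
    """Sort non-negative integers of at most d digits, least significant digit first."""
    for e in range(d):
        # stable counting sort on digit e
        C = [0] * base
        for x in A:
            C[(x // base ** e) % base] += 1
        for i in range(1, base):
            C[i] += C[i - 1]
        B = [0] * len(A)
        for x in reversed(A):                # back to front keeps the pass stable
            dig = (x // base ** e) % base
            C[dig] -= 1
            B[C[dig]] = x
        A = B
    return A
```

Stability of each pass is essential: it preserves the ordering established by the earlier, less significant digits.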
70
Randomized selection algorithm
RAND-SELECT(A, p, q, i)       ▹ i-th smallest of A[p..q]
  if p = q and i > 1 then error!
  r ← RAND-PARTITION(A, p, q)
  k ← r - p + 1               ▹ k = rank(A[r])
  if i = k then return A[r]
  if i < k
    then return RAND-SELECT(A, p, r - 1, i)
    else return RAND-SELECT(A, r + 1, q, i - k)
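A runnable sketch of RAND-SELECT (0-based indices; i ≥ 1 is the rank; a Lomuto-style partition is used here for brevity instead of the slides' first-element partition):

```python
import random

def rand_select(A, p, q, i):
    """Return the i-th smallest element of A[p..q]; expected O(n) time."""
    if p == q:
        return A[p]
    r = random.randrange(p, q + 1)       # random pivot
    A[r], A[q] = A[q], A[r]
    x, s = A[q], p                       # Lomuto partition around x
    for j in range(p, q):
        if A[j] <= x:
            A[s], A[j] = A[j], A[s]
            s += 1
    A[s], A[q] = A[q], A[s]              # pivot lands at index s
    k = s - p + 1                        # k = rank of the pivot
    if i == k:
        return A[s]
    if i < k:
        return rand_select(A, p, s - 1, i)
    return rand_select(A, s + 1, q, i - k)
```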
71
Complete example select the 6th smallest element.
(figure: selection steps on the array [7, 10, 5, 8, 11, 3, 2, 13] with i = 6)
Note: here we always used the first element as the pivot
to do the partition (instead of rand-partition).
72
Running time of randomized selection
         T(max(0, n-1)) + n    if 0 : n-1 split,
T(n) =   T(max(1, n-2)) + n    if 1 : n-2 split,
         ...
         T(max(n-1, 0)) + n    if n-1 : 0 split
  • For an upper bound, assume the i-th element always falls
    in the larger side of the partition
  • The expected running time is an average of all
    cases
Expectation
73
Substitution method
Want to show T(n) = O(n). So we need to prove T(n) ≤
cn for n > n0
Assume T(k) ≤ ck for all k < n
The bound holds if c ≥ 4
Therefore, T(n) = O(n)
74
Worst-case linear-time selection
Same as RAND-SELECT
75
Developing the recurrence
T(n) = T(n/5) + T(7n/10 + 3) + Θ(n)
76
Solving the recurrence
Assumption: T(k) ≤ ck for all k < n
(valid if n ≥ 60)
The bound holds if c ≥ 20 and n ≥ 60
77
Elements of dynamic programming
  • Optimal sub-structures
  • Optimal solutions to the original problem
    contain optimal solutions to sub-problems
  • Overlapping sub-problems
  • Some sub-problems appear in many solutions

78
Two steps to dynamic programming
  • Formulate the solution as a recurrence relation
    of solutions to subproblems.
  • Specify an order to solve the subproblems so you
    always have what you need.

79
Optimal subpaths
  • Claim: if a path start→goal is optimal, any
    sub-path, start→x, or x→goal, or x→y, where x, y
    are on the optimal path, is also the shortest.
  • Proof by contradiction:
  • If the subpath between x and y is not the
    shortest, we can replace it with the shorter one,
    which will reduce the total length of the new
    path ⇒ the optimal path from start to goal is
    not the shortest ⇒ contradiction!
  • Hence, the subpath x→y must be the shortest among
    all paths from x to y

80
Dynamic programming illustration
(figure: a weighted grid from S to G; each cell stores F(i, j), the shortest
distance from S, filled from the top-left corner toward G)

F(i, j) = min { F(i-1, j) + dist(i-1, j, i, j),
                F(i, j-1) + dist(i, j-1, i, j) }
81
Trace back
(figure: the same grid; the optimal path is recovered by tracing back from G to S)
82
Longest Common Subsequence
  • Given two sequences x[1..m] and y[1..n],
    find a longest subsequence common to them both.

83
Optimal substructure
  • Notice that the LCS problem has optimal
    substructure: parts of the final solution are
    solutions of subproblems.
  • If z = LCS(x, y), then any prefix of z is an LCS
    of a prefix of x and a prefix of y.
  • Subproblems: find the LCS of pairs of prefixes of x
    and y

(figure: prefixes x[1..i] and y[1..j] aligned with a prefix of z)
84
Finding length of LCS
  • Let c[i, j] be the length of LCS(x[1..i], y[1..j])
  • ⇒ c[m, n] is the length of LCS(x, y)
  • If x[m] = y[n]:
  •   c[m, n] = c[m-1, n-1] + 1
  • If x[m] ≠ y[n]:
  •   c[m, n] = max { c[m-1, n], c[m, n-1] }
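The recurrence gives a direct O(mn) implementation; the trace-back (covered later in the slides) is included here so the sketch also recovers one actual LCS:

```python
def lcs(x, y):
    """Length of LCS(x, y) plus one actual LCS, recovered by trace-back."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    out, i, j = [], m, n                  # trace back from c[m][n]
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:          # a match: this symbol is in the LCS
            out.append(x[i - 1])
            i -= 1
            j -= 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return c[m][n], "".join(reversed(out))
```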
85
DP Algorithm
  • Key: find out the correct order to solve the
    sub-problems
  • Total number of sub-problems: m × n

(figure: the (m+1) × (n+1) table C(i, j), with row 0 and column 0 as the boundary,
filled for i = 1..m, j = 1..n)
86
LCS Example
X = ABCB; m = |X| = 4
Y = BDCAB; n = |Y| = 5
Allocate array c[5,6]
Initialize the boundary:
for i = 1 to m: c[i,0] = 0
for j = 1 to n: c[0,j] = 0
Then fill the table row by row:
if ( X[i] == Y[j] ) c[i,j] = c[i-1,j-1] + 1
else c[i,j] = max( c[i-1,j], c[i,j-1] )

(figures, slides 86-100: the table is filled cell by cell; final result below)

      j:  0  1  2  3  4  5
   Y[j]:     B  D  C  A  B
i  X[i]
0         0  0  0  0  0  0
1  A      0  0  0  0  1  1
2  B      0  1  1  1  1  2
3  C      0  1  1  2  2  2
4  B      0  1  1  2  2  3
101
LCS Algorithm Running Time
  • The LCS algorithm calculates the value of each entry
    of the array c[m,n]
  • So what is the running time?

O(mn), since each c[i,j] is calculated in
constant time, and there are m×n elements in the
array
102
How to find actual LCS
  • The algorithm just found the length of LCS, but
    not LCS itself.
  • How to find the actual LCS?
  • For each c[i,j] we know how it was acquired
  • A match happens only when the first equation is
    taken
  • So we can start from c[m,n] and go backwards,
    remembering x[i] whenever c[i,j] = c[i-1,j-1] + 1.

(example: a cell where c[i,j] = c[i-1,j-1] + 1, i.e., 3 = 2 + 1, indicates a match)
103
Finding LCS
(figure: the completed c table with the trace-back path from c[4,5] highlighted)
Time for trace back: O(m+n).
104
Finding LCS (2)
(figure: the trace-back path through the c table)
LCS (reversed order): B C B
LCS (straight order): B C B (this string turned out to be a palindrome)
105
LCS as a longest path problem
(figure: the LCS problem drawn as a grid graph over ABCB and BDCAB;
diagonal match edges have weight 1, all other edges weight 0)
106
LCS as a longest path problem
(figure: the grid with longest-path values filled in; the value at the sink is 3)
107
A more general problem
  • Aligning two strings, such that
  •   Match: +m
  •   Mismatch: -s
  •   Insertion/deletion: -d
  • Aligning ABBC with CABC
  •   LCS = 3 (ABC)
  •   Best alignment:

108
Alignment as a longest path problem
(figure: alignment of ABBC and CABC as a longest path problem on a grid;
diagonal edges score m or -s, horizontal and vertical edges score -d)
109
Recurrence
  • Let F(i, j) be the best alignment score between
    X[1..i] and Y[1..j].
  • F(m, n) is the best alignment score between X and
    Y
  • Recurrence:
  • F(i, j) = max { F(i-1, j-1) + δ(i, j)    (match/mismatch)
                   F(i-1, j) - d            (insertion on Y)
                   F(i, j-1) - d }          (insertion on X)

δ(i, j) = m if X[i] = Y[j], and -s otherwise.
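A sketch of this recurrence as code (global alignment score with +m for a match, -s for a mismatch, -d per gap; the boundary row and column pay the gap penalty):

```python
def align_score(x, y, m=1, s=1, d=1):
    """Best global alignment score between strings x and y."""
    F = [[0] * (len(y) + 1) for _ in range(len(x) + 1)]
    for i in range(1, len(x) + 1):
        F[i][0] = -d * i                  # x[1..i] aligned against gaps
    for j in range(1, len(y) + 1):
        F[0][j] = -d * j
    for i in range(1, len(x) + 1):
        for j in range(1, len(y) + 1):
            delta = m if x[i - 1] == y[j - 1] else -s
            F[i][j] = max(F[i - 1][j - 1] + delta,   # match/mismatch
                          F[i - 1][j] - d,           # insertion on Y
                          F[i][j - 1] - d)           # insertion on X
    return F[len(x)][len(y)]
```

With m = 1, s = d = 0 this degenerates into the LCS recurrence.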
110
Restaurant location problem
  • You work in the fast food business
  • Your company plans to open up new restaurants in
    Texas along I-35
  • Towns along the highway are called t1, t2, ..., tn
  • A restaurant at ti has estimated annual profit pi
  • No two restaurants can be located within 10 miles
    of each other due to some regulation
  • Your boss wants to maximize the total profit
  • You want a big bonus

10 mile
111
A DP algorithm
  • Suppose you've already found the optimal solution
  • It will either include tn or not include tn
  • Case 1: tn not included in the optimal solution
  •   Best solution is the same as the best solution for t1, ...,
      tn-1
  • Case 2: tn included in the optimal solution
  •   Best solution is pn + the best solution for t1, ...,
      tj, where j < n is the largest index so that
      dist(tj, tn) ≥ 10

112
Recurrence formulation
  • Let S(i) be the total profit of the optimal
    solution when the first i towns are considered
    (not necessarily selected)
  • S(n) is the optimal solution to the complete
    problem

Number of sub-problems: n. Boundary condition:
S(0) = 0.
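A sketch of the recurrence, with town positions given as mile markers (pos sorted ascending; the function and argument names are assumptions for illustration):

```python
def best_profit(pos, profit, min_dist=10):
    """S[i] = max(S[i-1], p_i + S[j]), where j is the largest index
    with pos[i] - pos[j] >= min_dist; returns S[n]."""
    n = len(pos)
    S = [0] * (n + 1)                    # boundary: S[0] = 0
    for i in range(1, n + 1):
        j = i - 1
        while j > 0 and pos[i - 1] - pos[j - 1] < min_dist:
            j -= 1                       # back up to the last compatible town
        S[i] = max(S[i - 1],             # town i not selected
                   profit[i - 1] + S[j]) # town i selected
    return S[n]
```

The inner while-loop is what makes the running time Θ(nk); with preprocessing the compatible index j can be looked up in O(1).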
113
Example
(figure: ten towns along the highway with distances (mi) between neighbors
and profits (×$100k) per town; the computed values S(i) grow to S(n) = 26)

Optimal: 26
           S(i-1)
S(i) = max
           S(j) + pi,  where j < i is the largest index with dist(tj, ti) ≥ 10
  • Natural greedy 1: 6 + 3 + 4 + 12 = 25
  • Natural greedy 2: 12 + 9 + 3 = 24

114
Complexity
  • Time: Θ(nk), where k is the maximum number of
    towns that are within 10 miles to the left of any
    town
  • In the worst case, Θ(n²)
  • Can be improved to Θ(n) with some preprocessing
    tricks
  • Memory: Θ(n)

115
Knapsack problem
  • Each item has a value and a weight
  • Objective: maximize value
  • Constraint: knapsack has a weight limitation

Three versions: the 0-1 knapsack problem (take each
item or leave it), the fractional knapsack problem
(items are divisible), and the unbounded knapsack problem
(unlimited supplies of each item). Which one is
easiest to solve?
We study the 0-1 problem today.
116
Formal definition (0-1 problem)
  • Knapsack has weight limit W
  • Items labeled 1, 2, ..., n (arbitrarily)
  • Items have weights w1, w2, ..., wn
  •   Assume all weights are integers
  •   For practical reasons, only consider wi < W
  • Items have values v1, v2, ..., vn
  • Objective: find a subset of items, S, such that
    Σi∈S wi ≤ W and Σi∈S vi is maximal among all such
    (feasible) subsets

117
A DP algorithm
  • Suppose you've found the optimal solution S
  • Case 1: item n is included
  •   Find an optimal solution using items 1, 2, ..., n-1
      with weight limit W - wn
  • Case 2: item n is not included
  •   Find an optimal solution using items 1, 2, ..., n-1
      with weight limit W
118
Recursive formulation
  • Let V[i, w] be the optimal total value when items
    1, 2, ..., i are considered for a knapsack with
    weight limit w
  • ⇒ V[n, W] is the optimal solution

Boundary conditions: V[i, 0] = 0, V[0, w] = 0.
Number of sub-problems: ?
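The recurrence fills an (n+1) × (W+1) table; a direct sketch:

```python
def knapsack(weights, values, W):
    """0-1 knapsack: V[i][w] = best value using items 1..i with limit w."""
    n = len(weights)
    V = [[0] * (W + 1) for _ in range(n + 1)]   # boundary rows/columns are 0
    for i in range(1, n + 1):
        for w in range(W + 1):
            V[i][w] = V[i - 1][w]               # item i not taken
            if weights[i - 1] <= w:             # item i fits: maybe take it
                V[i][w] = max(V[i][w],
                              V[i - 1][w - weights[i - 1]] + values[i - 1])
    return V[n][W]
```

On the example on the next slides (6 items, W = 10) this returns the optimal value 15.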
119
Example
  • n = 6 (# of items)
  • W = 10 (weight limit)
  • Items (weight, value):
    (2, 2), (4, 3), (3, 3), (5, 6), (2, 4), (6, 9)

120
V[i, w]:
i  wi  vi   w = 0  1  2  3  4  5  6  7  8  9  10
0               0  0  0  0  0  0  0  0  0  0  0
1  2   2        0  0  2  2  2  2  2  2  2  2  2
2  4   3        0  0  2  2  3  3  5  5  5  5  5
3  3   3        0  0  2  3  3  5  5  6  6  8  8
4  5   6        0  0  2  3  3  6  6  8  9  9  11
5  2   4        0  0  4  4  6  7  7  10 10 12 13
6  6   9        0  0  4  4  6  7  9  10 13 13 15

              V[i-1, w - wi] + vi    (item i is taken)
V[i, w] = max
              V[i-1, w]              (item i not taken)
V[i, w] = V[i-1, w]  if wi > w  (item i cannot be taken)

Optimal value: 15
Items 6, 5, 1: Weight = 6 + 2 + 2 = 10; Value = 9 + 4 + 2 = 15
123
Time complexity
  • Θ(nW)
  • Polynomial?
  •   Pseudo-polynomial
  •   Works well if W is small
  • Consider the following items (weight, value):
  •   (10, 5), (15, 6), (20, 5), (18, 6)
  • Weight limit 35
  •   Optimal solution: items 2, 4 (value 12).
      Iterate: 2⁴ = 16 subsets
  •   Dynamic programming: fill up a 4 × 35 = 140 entry
      table
  • What's the problem?
  •   Many entries are unused: no such weight
      combination
  •   Top-down may be better

124
Use DP algorithm to solve new problems
  • Directly map a new problem to a known problem
  • Modify an algorithm for a similar task
  • Design your own
  • Think about the problem recursively
  • Optimal solution to a larger problem can be
    computed from the optimal solution of one or more
    subproblems
  • These sub-problems can be solved in certain
    manageable order
  • Works nicely for naturally ordered data such as
    strings, trees, some special graphs
  • Trickier for general graphs
  • The textbook has some very good exercises.

125
Greedy algorithm for restaurant location problem
  • select t1
  • d = 0
  • for (i = 2 to n)
  •   d = d + dist(ti, ti-1)
  •   if (d ≥ min_dist)
  •     select ti
  •     d = 0
  •   end
  • end

(figure: greedy selection on the example towns; the running distance d
is reset to 0 after each selected town)
126
Complexity
  • Time: Θ(n)
  • Memory:
  •   Θ(n) to store the input
  •   Θ(1) for greedy selection

127
Knapsack problem
  • Each item has a value and a weight
  • Objective: maximize value
  • Constraint: knapsack has a weight limitation

Three versions: the 0-1 knapsack problem (take each
item or leave it), the fractional knapsack problem
(items are divisible), and the unbounded knapsack problem
(unlimited supplies of each item). Which one is
easiest to solve?
We can solve the fractional knapsack problem
using a greedy algorithm
128
Greedy algorithm for fractional knapsack problem
  • Compute the value/weight ratio for each item
  • Sort items by their value/weight ratio into
    decreasing order
  • Call the remaining item with the highest ratio
    the most valuable item (MVI)
  • Iteratively:
  •   If the weight limit cannot be reached by adding
      the MVI, select the MVI
  •   Otherwise select the MVI partially, up to the weight limit

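The greedy steps above can be sketched as follows (items given as (weight, value) pairs; the function name is an assumption for illustration):

```python
def fractional_knapsack(items, W):
    """Greedy fractional knapsack: take items by decreasing value/weight ratio."""
    total = 0.0
    for w, v in sorted(items, key=lambda it: it[1] / it[0], reverse=True):
        take = min(w, W)            # take the MVI fully, or partially at the end
        total += v * take / w
        W -= take
        if W == 0:
            break
    return total
```

On the example on the next slides (weight limit 10), the greedy choice takes items 5 and 6 whole and 2 LB of item 4, for a total value of $15.4.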
129
Example
  • Weight limit 10

130
Example
  • Weight limit 10
  • Take item 5
  •   (2 LB, $4)
  • Take item 6
  •   (8 LB, $13)
  • Take 2 LB of item 4
  •   (10 LB, $15.4)

item  Weight (LB)  Value ($)  $/LB
5     2            4          2
6     6            9          1.5
4     5            6          1.2
1     2            2          1
3     3            3          1
2     4            3          0.75
131
Why is greedy algorithm for fractional knapsack
problem valid?
  • Claim: the optimal solution must contain the MVI
    as much as possible (either up to the weight
    limit or until the MVI is exhausted)
  • Proof by contradiction: suppose that the optimal
    solution does not use all available MVI (i.e.,
    there are still w (w < W) units of MVI left while
    we choose other items)
  • We can replace w pounds of less valuable items by
    MVI
  • The total weight is the same, but with value
    higher than the optimal
  • Contradiction

132
Representing Graphs
  • Assume V = {1, 2, ..., n}
  • An adjacency matrix represents the graph as an n ×
    n matrix A:
  •   A[i, j] = 1 if edge (i, j) ∈ E; 0 if edge
      (i, j) ∉ E
  • For a weighted graph:
  •   A[i, j] = wij if edge (i, j) ∈ E;
      0 if edge (i, j) ∉ E
  • For an undirected graph:
  •   the matrix is symmetric: A[i, j] = A[j, i]

133
Graphs Adjacency Matrix
  • Example

(figure: directed graph on vertices 1-4 with edges (1,2), (1,3), (2,3), (4,3),
next to an empty 4 × 4 matrix A to be filled in)
134
Graphs Adjacency Matrix
  • Example

A  1  2  3  4
1  0  1  1  0
2  0  0  1  0
3  0  0  0  0
4  0  0  1  0

(figure: the directed graph on vertices 1-4)
How much storage does the adjacency matrix
require? A: O(V²)
135
Graphs Adjacency Matrix
  • Example

Undirected graph
(figure: undirected graph on vertices 1-4 and its symmetric adjacency matrix)
136
Graphs Adjacency Matrix
  • Example

Weighted graph
(figure: undirected weighted graph with edge weights 5 on (1,2), 6 on (1,3),
9 on (2,3), 4 on (3,4), and its adjacency matrix)
137
Graphs Adjacency Matrix
  • Time to answer if there is an edge between vertex
    u and v: Θ(1)
  • Memory required: Θ(n²) regardless of |E|
  •   Usually too much storage for large graphs
  •   But can be very efficient for small graphs
  • Most large interesting graphs are sparse
  •   E.g., road networks (due to the limit on junctions)
  • For this reason the adjacency list is often a
    more appropriate representation

138
Graphs Adjacency List
  • Adjacency list: for each vertex v ∈ V, store a
    list of vertices adjacent to v
  • Example:
  •   Adj[1] = {2, 3}
  •   Adj[2] = {3}
  •   Adj[3] = {}
  •   Adj[4] = {3}
  • Variation: can also keep a list of edges coming
    into a vertex

(figure: the directed graph on vertices 1-4)
139
Graph representations
  • Adjacency list

Adj[1] = 2 → 3
Adj[2] = 3
Adj[3] = (empty)
Adj[4] = 3

How much storage does the adjacency list
require? A: O(V+E)
140
Graph representations
  • Undirected graph

Adj[1] = 2 → 3
Adj[2] = 1 → 3
Adj[3] = 1 → 2 → 4
Adj[4] = 3
141
Graph representations
  • Weighted graph

Adj[1] = (2,5) → (3,6)
Adj[2] = (1,5) → (3,9)
Adj[3] = (1,6) → (2,9) → (4,4)
Adj[4] = (3,4)
142
Graphs Adjacency List
  • How much storage is required?
  • For directed graphs:
  •   |adj[v]| = out-degree(v)
  •   Total # of items in adjacency lists is Σ
      out-degree(v) = |E|
  • For undirected graphs:
  •   |adj[v]| = degree(v)
  •   # of items in adjacency lists is Σ degree(v) = 2|E|
  • So adjacency lists take Θ(V+E) storage
  • Time needed to test if edge (u, v) ∈ E is O(n)

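Both representations of the slides' example digraph (vertices 1-4, edges (1,2), (1,3), (2,3), (4,3)) can be sketched as:

```python
n = 4
edges = [(1, 2), (1, 3), (2, 3), (4, 3)]

# adjacency matrix: Theta(V^2) memory, O(1) edge test
M = [[0] * (n + 1) for _ in range(n + 1)]   # row/column 0 unused (1-based vertices)
for u, v in edges:
    M[u][v] = 1

# adjacency list: Theta(V+E) memory, O(out-degree) edge test
adj = {u: [] for u in range(1, n + 1)}
for u, v in edges:
    adj[u].append(v)
```

Testing `M[u][v]` is constant time; testing `v in adj[u]` scans the list, which is why the edge-test column for adjacency lists reads O(n) in the tradeoff table below.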
143
Tradeoffs between the two representations
|V| = n, |E| = m

                     Adj Matrix   Adj List
test (u, v) ∈ E      Θ(1)         O(n)
Degree(u)            Θ(n)         O(n)
Memory               Θ(n²)        Θ(n+m)
Edge insertion       Θ(1)         Θ(1)
Edge deletion        Θ(1)         O(n)
Graph traversal      Θ(n²)        Θ(n+m)
Both representations are very useful and have
different properties, although adjacency lists
are probably better for more problems
144
  • Good luck with your exam!