2IL65 Algorithms - PowerPoint PPT Presentation

Transcript and Presenter's Notes
1
2IL65 Algorithms
  • Fall 2009, Lecture 2: Analysis of Algorithms

2
Analysis of algorithms
the formal way
3
Analysis of algorithms
  • Can we say something about the running time of an
    algorithm without implementing and testing it?
  • InsertionSort(A)
  • ▹ initialize: sort A[1]
  • for j ← 2 to length[A]
  •   do key ← A[j]
  •      i ← j − 1
  •      while i > 0 and A[i] > key
  •        do A[i+1] ← A[i]
  •           i ← i − 1
  •      A[i+1] ← key
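The pseudocode above translates directly into runnable Python; this 0-indexed version is an illustrative sketch, not part of the slides:

```python
def insertion_sort(a):
    """Sort list a in place; 0-indexed version of the pseudocode above."""
    for j in range(1, len(a)):          # for j <- 2 to length[A]
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:    # while i > 0 and A[i] > key
            a[i + 1] = a[i]             # shift the larger element right
            i -= 1
        a[i + 1] = key
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```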

4
Analysis of algorithms
  • Analyze the running time as a function of n (the
    number of input elements)
  • best case
  • average case
  • worst case
  • elementary operations: add, subtract, multiply,
    divide, load, store, copy, conditional and
    unconditional branch, return

An algorithm has worst case running time T(n) if
for any input of size n the maximal number of
elementary operations executed is T(n).
5
Analysis of algorithms example
InsertionSort: 15n² + 7n + 2
MergeSort: 300 n lg n + 50n
The rate of growth of the running time as a
function of the input size is essential!
n = 1,000,000: InsertionSort ≈ 1.5 × 10¹³,
MergeSort ≈ 6 × 10⁹ ⇒ 2500× faster!
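Plugging n = 1,000,000 into the two operation counts reproduces the speed-up claimed above (a quick check; lg is taken as log₂):

```python
import math

n = 1_000_000
insertion_ops = 15 * n**2 + 7 * n + 2          # about 1.5e13 operations
merge_ops = 300 * n * math.log2(n) + 50 * n    # about 6.0e9 operations
print(insertion_ops / merge_ops)               # roughly 2500
```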
6
O-notation
  • Let g(n): N → N be a function. Then we have
  • O(g(n)) = { f(n) : there exist positive constants
    c and n0 such that
    0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
  • O(g(n)) is the set of functions that grow at
    most as fast as g(n)

7
O-notation
  • Let g(n): N → N be a function. Then we have
  • O(g(n)) = { f(n) : there exist positive constants
    c and n0 such that
    0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

Notation: f(n) = O(g(n))
8
Ω-notation
  • Let g(n): N → N be a function. Then we have
  • Ω(g(n)) = { f(n) : there exist positive constants
    c and n0 such that
    0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
  • Ω(g(n)) is the set of functions that grow at
    least as fast as g(n)
9
Ω-notation
  • Let g(n): N → N be a function. Then we have
  • Ω(g(n)) = { f(n) : there exist positive constants
    c and n0 such that
    0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }

Notation: f(n) = Ω(g(n))
10
Θ-notation
  • Intuition: concentrate on the leading term,
    ignore constants
  • 19n³ + 17n² − 3n becomes Θ(n³)
  • 2 n lg n + 5 n^1.1 − 5 becomes Θ(n^1.1)
11
Θ-notation
  • Let g(n): N → N be a function. Then we have
  • Θ(g(n)) = { f(n) : there exist positive constants
    c1, c2, and n0 such that
    0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }
  • Θ(g(n)) is the set of functions that grow as
    fast as g(n)

12
Θ-notation
  • Let g(n): N → N be a function. Then we have
  • Θ(g(n)) = { f(n) : there exist positive constants
    c1, c2, and n0 such that
    0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

Notation: f(n) = Θ(g(n))
13
Θ-notation
  • Let g(n): N → N be a function. Then we have
  • Θ(g(n)) = { f(n) : there exist positive constants
    c1, c2, and n0 such that
    0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }
  • Claim: 19n³ + 17n² − 3n = Θ(n³)
  • Proof: Choose c1 = 19, c2 = 36 and n0 = 1.
  • Then we have for all n ≥ n0:
  • 0 ≤ c1·n³ (trivial)
  • c1·n³ ≤ 19n³ + 17n² − 3n
    (since 17n² > 3n for n ≥ 1)
  • 19n³ + 17n² − 3n ≤ c2·n³
    (since 17n² ≤ 17n³ for n ≥ 1)
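The chosen constants can also be spot-checked numerically; this is a sketch over a finite, arbitrarily chosen range of n, not a substitute for the proof:

```python
def f(n):
    """The function from the claim: 19n^3 + 17n^2 - 3n."""
    return 19 * n**3 + 17 * n**2 - 3 * n

c1, c2, n0 = 19, 36, 1
# Check 0 <= c1*n^3 <= f(n) <= c2*n^3 for n0 <= n <= 1000
ok = all(0 <= c1 * n**3 <= f(n) <= c2 * n**3 for n in range(n0, 1001))
print(ok)  # True
```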

14
Θ-notation
  • Let g(n): N → N be a function. Then we have
  • Θ(g(n)) = { f(n) : there exist positive constants
    c1, c2, and n0 such that 0 ≤ c1·g(n)
    ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }
  • Claim: 19n³ + 17n² − 3n ≠ Θ(n²)
  • Proof: Assume that there are positive constants
    c1, c2, and n0 such that for all n ≥ n0:
  • 0 ≤ c1·n² ≤ 19n³ + 17n² − 3n ≤ c2·n²
  • Since 17n² − 3n ≥ 0 we would have for all n ≥ n0:
    19n³ ≤ c2·n²
  • and hence 19n ≤ c2, contradicting that c2 is a
    constant.

15
Asymptotic notation
  • Θ(..) is an asymptotically tight bound
    ("asymptotically equal")
  • O(..) is an asymptotic upper bound
    ("asymptotically smaller or equal")
  • Ω(..) is an asymptotic lower bound
    ("asymptotically greater or equal")
  • other asymptotic notation: o(..) "grows
    strictly slower than", ω(..) "grows strictly
    faster than"
16
More notation
  • f(n) = n³ + Θ(n²) means:
  • there is a function g(n) such that
    f(n) = n³ + g(n) and g(n) = Θ(n²)
  • f(n) = Σ_{i=1}^{n} O(i) means:
  • there is one function g(i) such that
    f(n) = Σ_{i=1}^{n} g(i) and g(i) = O(i)
  • O(1) or Θ(1) means: a constant
  • 2n² + O(n) = Θ(n²) means:
  • for each function g(n) with g(n) = O(n)
    we have 2n² + g(n) = Θ(n²)

17
Quiz
  • O(1) + O(1) = O(1) ?
  • O(1) + O(1) + … + O(1) = O(1) ?
  • O(n²) ⊆ O(n³) ?
  • O(n³) ⊆ O(n²) ?
  • Θ(n²) ⊆ O(n³) ?
  • An algorithm with worst case running time O(n log
    n) is always slower than an algorithm with worst
    case running time O(n) if n is sufficiently large.
  • true
  • false
  • true
  • true
  • false
  • true
  • false

18
Quiz
  • n log² n = Θ(n log n) ?
  • n log² n = Ω(n log n) ?
  • n log² n = O(n^{4/3}) ?
  • O(2ⁿ) ⊆ O(3ⁿ) ?
  • O(2ⁿ) = Θ(3ⁿ) ?
  • false
  • true
  • true
  • true
  • false

19
Analysis of algorithms
20
Analysis of InsertionSort
  • InsertionSort(A)
  • ▹ initialize: sort A[1]
  • for j ← 2 to length[A]
  •   do key ← A[j]
  •      i ← j − 1
  •      while i > 0 and A[i] > key
  •        do A[i+1] ← A[i]
  •           i ← i − 1
  •      A[i+1] ← key
  • Get as tight a bound as possible on the worst
    case running time.
  • ⇒ lower and upper bound for the worst case
    running time
  • Upper bound: analyze the worst case number of
    elementary operations
  • Lower bound: give a bad input example

21
Analysis of InsertionSort
  • InsertionSort(A)
  • ▹ initialize: sort A[1]                O(1)
  • for j ← 2 to length[A]                 O(1)
  •   do key ← A[j]                        O(1)
  •      i ← j − 1                         O(1)
  •      while i > 0 and A[i] > key        worst case: (j−1)·O(1)
  •        do A[i+1] ← A[i]
  •           i ← i − 1
  •      A[i+1] ← key                      O(1)
  • Upper bound: Let T(n) be the worst case running
    time of InsertionSort on an array of
    length n. We have
  • T(n) = O(1) + Σ_{j=2}^{n} ( O(1) + (j−1)·O(1) + O(1) )
         = Σ_{j=2}^{n} O(j) = O(n²)
  • Lower bound: an array sorted in decreasing order
    forces the while loop to run j−1 times for every
    j ⇒ Ω(n²)
The worst case running time of InsertionSort is
Θ(n²).
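The Ω(n²) lower bound can be illustrated by counting the inner-loop shifts on a decreasing array: every A[j] moves past all j−1 earlier elements, giving 1 + 2 + … + (n−1) = n(n−1)/2 shifts. A small counting sketch:

```python
def insertion_sort_shifts(a):
    """Run insertion sort on a and count inner-loop element shifts."""
    shifts = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
            shifts += 1
        a[i + 1] = key
    return shifts

n = 100
# Decreasing input: n*(n-1)//2 = 4950 shifts, i.e. quadratic growth
print(insertion_sort_shifts(list(range(n, 0, -1))))  # 4950
```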
22
Analysis of MergeSort
  • MergeSort(A)
  • ▹ divide-and-conquer algorithm that sorts array
    A[1..n]
  • if length[A] = 1                                 O(1)
  •   then skip                                      O(1)
  •   else
  •     n ← length[A]; n1 ← ⌊n/2⌋; n2 ← ⌈n/2⌉        O(1)
  •     copy A[1..n1] to auxiliary array
        A1[1..n1]                                    O(n)
  •     copy A[n1+1..n] to auxiliary array
        A2[1..n2]                                    O(n)
  •     MergeSort(A1); MergeSort(A2)                 T(⌊n/2⌋) + T(⌈n/2⌉)
  •     Merge(A, A1, A2)                             O(n)

MergeSort is a recursive algorithm ⇒ running
time analysis leads to a recurrence
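The same divide-and-conquer structure in runnable Python; this sketch returns a sorted copy instead of sorting in place:

```python
def merge_sort(a):
    """Return a sorted copy of a, mirroring the MergeSort pseudocode."""
    if len(a) <= 1:
        return a                      # 'skip': already sorted
    mid = len(a) // 2
    left = merge_sort(a[:mid])        # MergeSort(A1)
    right = merge_sort(a[mid:])       # MergeSort(A2)
    # Merge(A, A1, A2): combine two sorted halves in O(n) time
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```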
23
Analysis of MergeSort
  • Let T(n) be the worst case running time of
    MergeSort on an array of length n. We have
  • T(n) = O(1)                          if n = 1
  • T(n) = T(⌊n/2⌋) + T(⌈n/2⌉) + Θ(n)    if n > 1
  • The n = 1 case is frequently omitted since it
    (nearly) always holds; the two recursive terms
    are often written together as 2T(n/2).
24
Solving recurrences
25
Solving recurrences
  • Easiest: the master theorem. Caveat: it is not
    always applicable.
  • Alternatively: guess the solution and use the
    substitution method to prove that your guess
    is correct.
  • How to guess:
  • expand the recursion
  • draw a recursion tree

26
The master theorem
  • Let a ≥ 1 and b > 1 be constants, let f(n) be a
    function, and let T(n)
  • be defined on the nonnegative integers by the
    recurrence
  • T(n) = a·T(n/b) + f(n)
    (n/b can be rounded up or down)
  • Then we have:
  • If f(n) = O(n^{log_b a − ε}) for some constant
    ε > 0, then T(n) = Θ(n^{log_b a}).
  • If f(n) = Θ(n^{log_b a}), then
    T(n) = Θ(n^{log_b a} log n).
  • If f(n) = Ω(n^{log_b a + ε}) for some constant
    ε > 0, and if a·f(n/b) ≤ c·f(n) for some
    constant c < 1 and all sufficiently large n,
    then T(n) = Θ(f(n)).
27
The master theorem Example
  • T(n) = 4T(n/2) + Θ(n³)
  • Master theorem with a = 4, b = 2, and f(n) = n³
  • log_b a = log₂ 4 = 2
  • ⇒ n³ = f(n) = Ω(n^{log_b a + ε}) = Ω(n^{2 + ε})
    with, for example, ε = 1
  • Case 3 of the master theorem gives T(n) = Θ(n³),
    if the regularity condition holds:
  • choose c = ½ and n0 = 1
  • ⇒ a·f(n/b) = 4·(n/2)³ = n³/2 ≤ c·f(n) for n ≥ n0
  • ⇒ T(n) = Θ(n³)
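Evaluating the recurrence numerically is consistent with the Θ(n³) answer: T(n)/n³ settles to a constant. A sketch, with the base case T(1) = 1 chosen arbitrarily:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 4*T(n/2) + n^3 with assumed base case T(1) = 1."""
    return 1 if n <= 1 else 4 * T(n // 2) + n**3

# For powers of two the ratio T(n)/n^3 approaches the constant 2
for k in (4, 6, 8, 10):
    n = 2**k
    print(n, T(n) / n**3)
```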
28
The substitution method
  • The master theorem does not always apply
  • In those cases, use the substitution method:
  • Guess the form of the solution.
  • Use induction to find the constants and show
    that the solution works.
  • Use expansion or a recursion tree to guess a
    good solution.

29
Recursion-trees
  • T(n) = 2T(n/2) + n

The recursion tree:
n
n/2   n/2
n/4   n/4   n/4   n/4
…
n/2ⁱ  n/2ⁱ  …  n/2ⁱ
…
T(1)  T(1)  …  T(1)
30
Recursion-trees
  • T(n) = 2T(n/2) + n

Cost per level of the recursion tree:
n                 = n
2 · (n/2)         = n
4 · (n/4)         = n
…
2ⁱ · (n/2ⁱ)       = n      (log n levels in total)
…
n · T(1)          = Θ(n)   (leaves)
Total: Θ(n log n)
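A numeric check matches the tree: for powers of two, T(n)/(n log n) tends to 1. A sketch, with the base case T(1) = 1 assumed:

```python
import math

def T(n):
    """T(n) = 2*T(n/2) + n with assumed base case T(1) = 1."""
    return 1 if n <= 1 else 2 * T(n // 2) + n

# The ratio T(n)/(n log n) approaches 1 as n grows
for k in (4, 8, 16):
    n = 2**k
    print(n, T(n) / (n * math.log2(n)))
```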
31
Recursion-trees
  • T(n) = 2T(n/2) + n²

The recursion tree:
n²
(n/2)²   (n/2)²
(n/4)²   (n/4)²   (n/4)²   (n/4)²
…
(n/2ⁱ)²  (n/2ⁱ)²  …  (n/2ⁱ)²
…
T(1)  T(1)  …  T(1)
32
Recursion-trees
  • T(n) = 2T(n/2) + n²

Cost per level of the recursion tree:
n²
2 · (n/2)²        = n²/2
4 · (n/4)²        = n²/4
…
2ⁱ · (n/2ⁱ)²      = n²/2ⁱ
…
n · T(1)          = Θ(n)   (leaves)
Total: Θ(n²)
33
Recursion-trees
  • T(n) = 4T(n/2) + n

The recursion tree (every node has four children):
n
n/2   n/2   n/2   n/2
n/4   n/4   …   n/4    (16 nodes)
…
T(1)  T(1)  …  T(1)    (n² leaves)
34
Recursion-trees
  • T(n) = 4T(n/2) + n

Cost per level of the recursion tree:
n                 = n
4 · (n/2)         = 2n
16 · (n/4)        = 4n
…
n² · T(1)         = Θ(n²)   (leaves)
Total: Θ(n²)
35
The substitution method
T(n) = 2                  if n = 1
T(n) = 2T(⌊n/2⌋) + n      if n > 1
  • Claim: T(n) = O(n log n)
  • Proof: by induction on n
  • to show: there are constants c and n0 such that
  • T(n) ≤ c·n·log n for all n ≥ n0
  • T(1) = 2 ⇒ choose c = 2 (for now) and n0 = 2
  • Why n0 = 2? log 1 = 0, so the bound cannot be
    proven for n = 1.
  • How many base cases? Two: n = 2 and n = 3.
  • Base cases: n = 2: T(2) = 2T(1) + 2 = 2·2 + 2 = 6
    ≤ c·2·log 2 for c ≥ 3
  • n = 3: T(3) = 2T(1) + 3 = 2·2 + 3 = 7
    ≤ c·3·log 3
36
The substitution method
T(n) = 2                  if n = 1
T(n) = 2T(⌊n/2⌋) + n      if n > 1
  • Claim: T(n) = O(n log n)
  • Proof: by induction on n
  • to show: there are constants c and n0 such that
  • T(n) ≤ c·n·log n for all n ≥ n0
  • choose c = 3 and n0 = 2
  • Inductive step: n > 3
  • T(n) = 2T(⌊n/2⌋) + n
  •      ≤ 2·c·⌊n/2⌋·log ⌊n/2⌋ + n      (ind. hyp.)
  •      ≤ c·n·((log n) − 1) + n
  •      ≤ c·n·log n
37
The substitution method
T(n) = T(1)               if n = 1
T(n) = 2T(⌊n/2⌋) + n      if n > 1
  • Claim: T(n) = O(n)  (wrong!)
  • "Proof" by induction on n:
  • Base case: n = n0:
  • T(2) = 2T(1) + 2 = 2c + 2 = O(2)
    (writing T(1) = c)
  • Inductive step: n > n0:
  • T(n) = 2T(⌊n/2⌋) + n
  •      = 2·O(⌊n/2⌋) + n      (ind. hyp.)
  •      = O(n)

Never use O, Θ, or Ω in a proof by induction!
38
Analysis of algorithms
one more example
39
Example
  • Example(A)
  • ▹ A is an array of length n
  • n ← length[A]
  • if n = 1
  •   then return A[1]
  •   else begin
  •     copy A[1 .. ⌈n/2⌉] to auxiliary
        array B[1 .. ⌈n/2⌉]
  •     copy A[1 .. ⌈n/2⌉] to auxiliary
        array C[1 .. ⌈n/2⌉]
  •     b ← Example(B); c ← Example(C)
  •     for i ← 1 to n
  •       do for j ← 1 to i
  •         do A[i] ← A[j]
  •     return 43
  •   end
40
Example
  • Let T(n) be the worst case running time of
    Example on an array of length n.
  • Lines 1, 2, 3, 4, 11, and 12 take Θ(1) time.
  • Lines 5 and 6 take Θ(n) time.
  • Line 7 takes Θ(1) + 2T(⌈n/2⌉) time.
  • Lines 8 until 10 take
    Σ_{i=1}^{n} Σ_{j=1}^{i} Θ(1) = Θ(n²) time.
  • If n = 1, lines 1, 2, 3 are executed,
  • else lines 1, 2, and 4 until 12 are executed.
  • ⇒ T(n) = Θ(1)                  if n = 1
  •   T(n) = 2T(⌈n/2⌉) + Θ(n²)     if n > 1
  • ⇒ use the master theorem (case 3, as on slide
    27): T(n) = Θ(n²)
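The resulting recurrence can be checked numerically as well: T(n)/n² settles to a constant, consistent with T(n) = Θ(n²). A sketch, with the base case T(1) = 1 assumed:

```python
def T(n):
    """T(n) = 2*T(ceil(n/2)) + n^2 with assumed base case T(1) = 1."""
    return 1 if n <= 1 else 2 * T(-(-n // 2)) + n**2   # -(-n // 2) == ceil(n/2)

# For powers of two the ratio T(n)/n^2 approaches the constant 2
for k in (4, 6, 8, 10):
    n = 2**k
    print(n, T(n) / n**2)
```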

41
Tips
  • Analysis of recursive algorithms: find the
    recurrence and solve it with the master theorem
    if possible
  • Analysis of loops: summations
  • Some standard recurrences and sums:
  • T(n) = 2T(n/2) + Θ(n)  ⇒  T(n) = Θ(n log n)
  • Σ_{i=1}^{n} i = ½·n·(n+1) = Θ(n²)
  • Σ_{i=1}^{n} i² = Θ(n³)
42
Tutorials this week
  • Small tutorial on Thursday, hours 7+8.
  • Big tutorial on Monday, hours 5+6.