Chapter 2 Algorithm Analysis - PowerPoint PPT Presentation

1
Chapter 2 Algorithm Analysis
All sections
2
Complexity Analysis
  • Measures efficiency (time and memory) of
    algorithms and programs
  • Can be used for the following
    • Compare different algorithms
    • See how time varies with the size of the input
  • Operation count
    • Count the number of operations that we expect to
      take the most time
  • Asymptotic analysis
    • See how fast time increases as the input size
      approaches infinity

3
Operation Count Examples
  • Example 1:
    for(i = 0; i < n; i++) cout << A[i] << endl;
    Number of output operations: n
  • Example 2:
    template <class T>
    bool IsSorted(T *A, int n)
    {
      bool sorted = true;
      for(int i = 0; i < n - 1; i++)
        if(A[i] > A[i+1])
          sorted = false;
      return sorted;
    }
    Number of comparisons: n - 1
  • Example 3: Triangular matrix-vector multiplication pseudo-code:
    c[i] ← 0, i = 1 to n
    for i = 1 to n
      for j = 1 to i
        c[i] += a[i][j] * b[j]
    Number of multiplications: Σ_{i=1..n} i = n(n+1)/2
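A minimal runnable C++ version of Example 3 with an explicit multiplication counter; the matrix contents, the size n = 5, and the name mult_count are illustrative choices, not from the slide:

  #include <iostream>
  #include <vector>
  using namespace std;

  int main() {
      int n = 5;                                        // illustrative size
      // 1-based lower-triangular a, vectors b and c, all filled with 1.0
      vector<vector<double>> a(n + 1, vector<double>(n + 1, 1.0));
      vector<double> b(n + 1, 1.0), c(n + 1, 0.0);
      long mult_count = 0;                              // operation counter
      for (int i = 1; i <= n; i++)
          for (int j = 1; j <= i; j++) {                // only j = 1..i, the triangular part
              c[i] += a[i][j] * b[j];
              mult_count++;
          }
      // both outputs are 15 for n = 5, matching n(n+1)/2
      cout << mult_count << " multiplications, n(n+1)/2 = " << n * (n + 1) / 2 << endl;
  }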
4
Scaling Analysis
  • How much will the time increase in Example 1 if n is
    doubled?
    • t(2n)/t(n) = 2n/n = 2
    • Time will double
  • If time t(n) = 2n² for some algorithm, then how
    much will the time increase if the input size is
    doubled?
    • t(2n)/t(n) = 2(2n)² / (2n²) = 4n² / n² = 4
    • (A quick numeric check of both ratios is sketched below.)
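A quick numeric check of both ratios, as a sketch; the lambda names t1 and t2 are only for this illustration:

  #include <iostream>
  using namespace std;

  int main() {
      auto t1 = [](double n) { return n; };          // Example 1: time proportional to n
      auto t2 = [](double n) { return 2 * n * n; };  // t(n) = 2n^2
      double n = 1000;
      cout << t1(2 * n) / t1(n) << endl;             // prints 2
      cout << t2(2 * n) / t2(n) << endl;             // prints 4
  }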

5
Comparing Algorithms
  • Assume that algorithm 1 takes time t1(n) =
    100n + n² and algorithm 2 takes time t2(n) = 10n²
    • If an application typically has n < 10, then
      which algorithm is faster?
    • If an application typically has n > 100, then
      which algorithm is faster?
  • Assume algorithms with the following times
    • Algorithm 1: insert - n, delete - log n, lookup - 1
    • Algorithm 2: insert - log n, delete - n, lookup - log n
    • Which algorithm is faster if an application has
      many inserts but few deletes and lookups?
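One way to approach the first pair of questions is simply to evaluate both cost functions at a few representative sizes; a minimal sketch (the sample sizes are arbitrary):

  #include <iostream>
  using namespace std;

  int main() {
      auto t1 = [](double n) { return 100 * n + n * n; };  // algorithm 1
      auto t2 = [](double n) { return 10 * n * n; };       // algorithm 2
      for (double n : {5.0, 10.0, 100.0, 1000.0})
          cout << "n=" << n << "  t1=" << t1(n) << "  t2=" << t2(n) << endl;
  }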

6
Motivation for Asymptotic Analysis - 1
  • Compare x² (red line) and x (blue line, almost
    on the x-axis)
  • x² is much larger than x for large x

7
Motivation for Asymptotic Analysis - 2
  • Compare 0.0001x² (red line) and x (blue line,
    almost on the x-axis)
  • 0.0001x² is much larger than x for large x
  • The form (x² versus x) is most important for
    large x

8
Motivation for Asymptotic Analysis - 3
  • Red: 0.0001x², blue: x, green: 100 log x,
    magenta: the sum of these
  • 0.0001x² primarily contributes to the sum for
    large x

9
Asymptotic Complexity Analysis
  • Compares the growth of two functions
  • T = f(n)
    • Variables: non-negative integers
      • for example, the size of the input data
    • Values: non-negative real numbers
      • for example, the running time of an algorithm
  • Dependent on
    • eventual (asymptotic) behavior
  • Independent of
    • constant multipliers
    • and lower-order effects
  • Metrics
    • Big O notation: O()
    • Big Omega notation: Ω()
    • Big Theta notation: Θ()

10
Big O Notation
  • f(n) = O(g(n))
  • if and only if
  • there exist two positive constants c > 0 and n0 > 0,
  • such that f(n) ≤ c·g(n) for all n ≥ n0
  • iff ∃ c, n0 > 0 : 0 ≤ f(n) ≤ c·g(n) ∀ n ≥ n0

[Figure: for n ≥ n0, the curve f(n) stays below c·g(n) — f(n) is asymptotically upper bounded by g(n)]
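A worked check of the definition (this example is not on the slide): to show 3n + 5 = O(n), choose c = 4 and n0 = 5; then 4n - (3n + 5) = n - 5 ≥ 0, so 3n + 5 ≤ 4n for all n ≥ 5.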
11
Big Omega Notation
  • f(n) = Ω(g(n))
  • iff ∃ c, n0 > 0 : 0 ≤ c·g(n) ≤ f(n) ∀ n ≥ n0

[Figure: for n ≥ n0, the curve f(n) stays above c·g(n) — f(n) is asymptotically lower bounded by g(n)]
12
Big Theta Notation
  • f(n) = Θ(g(n))
  • iff ∃ c1, c2, n0 > 0 : 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)
    ∀ n ≥ n0

[Figure: for n ≥ n0, f(n) lies between c1·g(n) and c2·g(n) — f(n) has the same long-term rate of growth as g(n)]
13
Examples
  • f(n) = 3n² + 17
    • Ω(1), Ω(n), Ω(n²) ← lower bounds
    • O(n²), O(n³), ... ← upper bounds
    • Θ(n²) ← exact bound
  • f(n) = 1000n² + 17 + 0.001n³
    • Ω(?) ← lower bounds
    • O(?) ← upper bounds
    • Θ(?) ← exact bound
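For the second function, one way to see which term dominates for large n is to print the cubic term's share of the total; a small sketch (the lambda name f is only for this illustration):

  #include <iostream>
  using namespace std;

  int main() {
      auto f = [](double n) { return 1000 * n * n + 17 + 0.001 * n * n * n; };
      for (double n : {1e3, 1e6, 1e9})
          cout << "n=" << n << "  0.001n^3 / f(n) = " << 0.001 * n * n * n / f(n) << endl;
      // the cubic term's share climbs from about 0.001 toward 1 as n grows
  }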

14
Analogous to Real Numbers
  • f(n) = O(g(n))    (a ≤ b)
  • f(n) = Ω(g(n))    (a ≥ b)
  • f(n) = Θ(g(n))    (a = b)
  • The above analogy is not quite accurate, but it's
    convenient to think of function complexity in
    these terms.

15
Transitivity
  • f(n) = O(g(n))    (a ≤ b)
  • f(n) = Ω(g(n))    (a ≥ b)
  • f(n) = Θ(g(n))    (a = b)
  • If f(n) = O(g(n)) and g(n) = O(h(n))
  • then f(n) = O(h(n))
  • If f(n) = Ω(g(n)) and g(n) = Ω(h(n))
  • then f(n) = Ω(h(n))
  • If f(n) = Θ(g(n)) and g(n) = Θ(h(n))
  • then f(n) = Θ(h(n))
  • And many other properties
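A one-line justification of the first transitivity property (a standard argument, not spelled out on the slide): if f(n) ≤ c1·g(n) for n ≥ n1 and g(n) ≤ c2·h(n) for n ≥ n2, then f(n) ≤ (c1·c2)·h(n) for all n ≥ max(n1, n2), so f(n) = O(h(n)).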

16
Some Rules of Thumb
  • If f(x) is a polynomial of degree k
  • then f(x) = Θ(x^k)
  • log^k N = O(N) for any constant k
  • Logarithms grow very slowly compared to even
    linear growth

17
Typical Growth Rates
18
Exercise
  • f(N) = N log N and g(N) = N^1.5
  • Which one grows faster?
  • Note that g(N) = N^1.5 = N · N^0.5
  • Hence, between f(N) and g(N), we only need to
    compare the growth rates of log N and N^0.5
  • Equivalently, we can compare the growth rate of log²N
    with N
  • Now, refer to the result on the last slide to
    figure out whether f(N) or g(N) grows faster!
    (A numeric check is sketched below.)
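A quick numeric check of the same comparison, as a sketch (log base 2 is used here; the choice of base only changes a constant factor):

  #include <cmath>
  #include <iostream>
  using namespace std;

  int main() {
      // compare f(N) = N log N with g(N) = N^1.5 by printing their ratio
      for (double N : {1e2, 1e4, 1e6, 1e8})
          cout << "N=" << N << "  (N log N) / N^1.5 = " << N * log2(N) / pow(N, 1.5) << endl;
      // the ratio shrinks toward 0, consistent with g(N) growing faster
  }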

19
How Complexity Affects Running Times
20
Running Time Calculations - Loops
  • for (j = 0; j < n; j++) {
      // 3 atomics
    }
  • Number of atomic operations
    • Each iteration has 3 atomic operations, so 3n
  • Cost of the iteration itself
    • One initialization assignment
    • n increments (of j)
    • n comparisons (between j and n)
  • Complexity: Θ(3n) = Θ(n)

21
Loops with Break
  • for (j = 0; j < n; j++) {
      // 3 atomics
      if (condition) break;
    }
  • Upper bound: O(4n) = O(n)
  • Lower bound: O(4) = O(1)
  • Complexity: O(n)
  • Why don't we have a Θ() notation here?

22
Sequential Search
  • Given an unsorted vector a, find the location
    of element X.
  • for (i = 0; i < n; i++)
      if (a[i] == X) return true;
    return false;
  • Input size: n = a.size()
  • Complexity: O(n)

23
If-then-else Statement
  if (condition)
    i = 0;
  else
    for (j = 0; j < n; j++)
      a[j] = j;
  • Complexity = ?
    • O(1) + max( O(1), O(N) )
    • O(1) + O(N)
    • O(N)

24
Consecutive Statements
  • Add the complexities of consecutive statements
  • Complexity: Θ(3n + 5n) = Θ(n)

  for (j = 0; j < n; j++) { /* 3 atomics */ }
  for (j = 0; j < n; j++) { /* 5 atomics */ }
25
Nested Loop Statements
  • Analyze such statements inside out
  • for (j = 0; j < n; j++) {
      // 2 atomics
      for (k = 0; k < n; k++)
        // 3 atomics
    }
  • Complexity: Θ((2 + 3n)n) = Θ(n²)

26
Recursion
  • long factorial( int n )
    {
      if( n <= 1 )
        return 1;
      else
        return n * factorial(n - 1);
    }

In terms of big-Oh:
  t(1) = 1
  t(n) = 1 + t(n-1) = 1 + 1 + t(n-2) = ... = k + t(n-k)
  Choose k = n-1: t(n) = n - 1 + t(1) = n - 1 + 1 = O(n)

Consider the following time complexity:
  t(0) = 1
  t(n) = 1 + 2t(n-1) = 1 + 2(1 + 2t(n-2)) = 1 + 2 + 4t(n-2)
       = 1 + 2 + 4(1 + 2t(n-3)) = 1 + 2 + 4 + 8t(n-3)
       = 1 + 2 + ... + 2^(k-1) + 2^k t(n-k)
  Choose k = n: t(n) = 1 + 2 + ... + 2^(n-1) + 2^n = 2^(n+1) - 1
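As an illustration of the second recurrence (this example is not on the slide): the classic Towers of Hanoi routine does constant work plus two recursive calls on n - 1 disks, so its operation count satisfies roughly t(n) = 1 + 2t(n-1) and grows like 2^n. A minimal sketch:

  #include <iostream>
  using namespace std;

  // move n disks from peg 'from' to peg 'to' using peg 'via';
  // constant work plus two recursive calls per invocation
  void hanoi(int n, char from, char to, char via) {
      if (n == 0) return;
      hanoi(n - 1, from, via, to);
      cout << "move disk " << n << " from " << from << " to " << to << endl;
      hanoi(n - 1, via, to, from);
  }

  int main() {
      hanoi(3, 'A', 'C', 'B');   // prints 2^3 - 1 = 7 moves
  }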
27
Binary Search
  • Given a sorted vector a, find the location of
    element X
  • unsigned int binary_search(vector<int> a, int X)
    {
      unsigned int low = 0, high = a.size() - 1;
      while (low <= high)
      {
        int mid = (low + high) / 2;
        if (a[mid] < X)
          low = mid + 1;
        else if (a[mid] > X)
          high = mid - 1;
        else
          return mid;
      }
      return NOT_FOUND;
    }
  • Input size: n = a.size()
  • Complexity: O( k iterations x (1 comparison + 1
    assignment) per iteration )
  • O(log(n))
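Why logarithmic (a brief justification, not spelled out on the slide): each iteration discards at least half of the remaining range, so after k iterations roughly n/2^k elements are left; the loop must stop once n/2^k drops below 1, i.e. after about k = log2(n) iterations.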