Algorithm Efficiency - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: Algorithm Efficiency


1
Algorithm Efficiency
  • There are often many approaches (algorithms) to
    solve a problem. How do we choose between them?
  • At the heart of a computer program design are two
    (sometimes conflicting) goals
  • 1. To design an algorithm that is easy to
    understand, code, and debug.
  • 2. To design an algorithm that makes efficient
    use of the computer's resources.
  • Goal 1 is the concern of Software Engineering.
  • Goal 2 is the concern of data structures and
    algorithm analysis.
  • When goal 2 is important, how do we measure an
    algorithm's cost?

2
How to Measure Efficiency?
  • Empirical comparison (run the programs).
  • Only valid for that machine.
  • Only valid for that compiler.
  • Only valid for that coding of the algorithm.
  • Asymptotic Algorithm Analysis
  • Must identify critical resources
  • time - where we will concentrate
  • space
  • Identify factors affecting that resource
  • For most algorithms, running time depends on
    size of the input.
  • Running time is expressed as T(n) for some
    function T on input size n.

3
Examples of Growth Rate
  • Example 1
    int largest(int array[], int n) { // does not work for all cases!!!!!
      int currlarge = 0;
      for (int p = 0; p < n; p++)
        if (array[p] > currlarge)
          currlarge = array[p];
      return currlarge;
    }
  • Example 2
    sum = 0;
    for (p = 1; p <= n; p++)
      for (j = 1; j <= n; j++)
        sum++;

4
Growth Rate Graphs
5
Expanded View
6
Typical Growth Rates
  • c          constant
  • log N      logarithmic
  • log^2 N    log-squared
  • N          linear
  • N log N
  • N^2        quadratic
  • N^3        cubic
  • 2^N        exponential

7
Best, Worst and Average Cases
  • Not all inputs of a given size take the same
    time.
  • Sequential search for K in an array of n
    integers
  • Begin at the first element in array and look at
    each element in turn until K is found.
  • Best Case: K is the first element (1 element examined).
  • Worst Case: K is the last element or absent (all n examined).
  • Average Case: about n/2 elements examined.
  • While average time seems to be the fairest
    measure, it may be difficult to determine.
  • When is the worst case time important?
  • Time critical events (real time processing).

8
Asymptotic Analysis Big-oh
  • Definition T(n) is in the set O(f(n)) if there
    exist two positive constants c and n0 such that
    T(n) <= c·f(n) for all n > n0.
  • Usage the algorithm is in O(n^2) in best,
    average, worst case.
  • Meaning for all data sets big enough (i.e.,
    n > n0), the algorithm always executes in less than
    c·f(n) steps in best, average, worst case.
  • Upper Bound
  • Example if T(n) = 3n^2 then T(n) is in O(n^2).
  • Tightest upper bound
  • T(n) = 3n^2 is also in O(n^3), but we prefer O(n^2).

9
Big-oh Example
  • Example 1. Finding the value X in an array.
  • T(n) = cs·n/2, where cs is the cost of examining
    one element.
  • For all values of n > 1, cs·n/2 <= cs·n.
    Therefore, by the definition, T(n) is in O(n) for
    n0 = 1 and c = cs.
  • Example 2. T(n) = c1·n^2 + c2·n in the average case.
  • c1·n^2 + c2·n <= c1·n^2 + c2·n^2 <= (c1+c2)·n^2
    for all n > 1.
  • Therefore, T(n) is in O(n^2).
  • Example 3. T(n) = c. This is in O(1).

10
Big-Omega
  • Definition T(n) is in the set Ω(g(n)) if there
    exist two positive constants c and n0 such that
    T(n) >= c·g(n) for all n > n0.
  • Meaning For all data sets big enough (i.e., n > n0),
    the algorithm always executes in more than
    c·g(n) steps.
  • It is a LOWER bound.
  • Example T(n) = c1·n^2 + c2·n.
  • c1·n^2 + c2·n >= c1·n^2 for all n > 1.
  • T(n) >= c·n^2 for c = c1 and n0 = 1.
  • Therefore, T(n) is in Ω(n^2) by the definition.
  • We want the greatest lower bound.

11
Theta Notation
  • When big-Oh and Ω meet, we indicate this by using
    Θ (big-Theta) notation.
  • Definition an algorithm is said to be Θ(h(n)) if
    it is in O(h(n)) and it is in Ω(h(n)).
  • Simplifying rules
  • if f(n) is in O(g(n)) and g(n) is in O(h(n)) then
    f(n) is in O(h(n)).
  • if f(n) is in O(k·g(n)) for any constant k > 0, then
    f(n) is in O(g(n)).
  • if f1(n) is in O(g1(n)) and f2(n) is in O(g2(n)),
    then (f1+f2)(n) is in O(max(g1(n), g2(n))).
  • if f1(n) is in O(g1(n)) and f2(n) is in O(g2(n)),
    then f1(n)·f2(n) is in O(g1(n)·g2(n)).

12
Big O rules
  • If T(n) is a polynomial of degree k, then T(n) =
    Θ(n^k).
  • log^k n = O(n) for any constant k. Logarithms grow
    very slowly.

13
General Algorithm Analysis Rules
  • The running time of a for loop is at most the
    running time of the statements inside the for
    loop times the number of iterations.
  • Analyze nested loops inside out. Then apply the
    previous rule.
  • Consecutive statements just add (so apply the max
    rule).
  • The running time of an if/else statement is never
    more than the running time of the test plus the
    larger of the running times of the true and false
    cases.

14
Running Time of a Program
  • Example 1: a = b;
  • this assignment statement takes constant time, so
    it is Θ(1).
  • Example 2:
    sum = 0;
    for (i = 1; i <= n; i++)
      sum += n;
  • Example 3:
    sum = 0;
    for (j = 1; j <= n; j++)
      for (i = 1; i <= j; i++)
        sum++;
    for (k = 1; k <= n; k++)
      a[k] = k - 1;

15
More Examples
  • Example 4:
    sum1 = 0;
    for (i = 1; i <= n; i++)
      for (j = 1; j <= n; j++)
        sum1++;
    sum2 = 0;
    for (i = 1; i <= n; i++)
      for (j = 1; j <= i; j++)
        sum2++;
  • Example 5:
    sum1 = 0;
    for (k = 1; k <= n; k *= 2)
      for (j = 1; j <= n; j++)
        sum1++;
    sum2 = 0;
    for (k = 1; k <= n; k *= 2)
      for (j = 1; j <= k; j++)
        sum2++;

16
Binary Search
    int binary(int value, int array[], int size) {
      int left = -1;
      int right = size;
      while (left + 1 != right) {
        int mid = (left + right) / 2;
        if (value < array[mid]) right = mid;
        else if (value > array[mid]) left = mid;
        else return mid;
      }
      return -1;
    }

17
Binary Search Example
  • Position Key
  • 0 11
  • 1 13
  • 2 21
  • 3 26
  • 4 29
  • 5 36
  • 6 40
  • 7 41
  • 8 45
  • 9 51
  • 10 54
  • 11 56
  • 12 65
  • 13 72
  • 14 77
  • 15 83

Now let's search for the value 45.
(0+15)/2 -> 7
(8+15)/2 -> 11
(8+10)/2 -> 9
(8+8)/2 -> 8
18
Unsuccessful Search
  • Position Key
  • 0 11
  • 1 13
  • 2 21
  • 3 26
  • 4 29
  • 5 36
  • 6 40
  • 7 41
  • 8 45
  • 9 51
  • 10 54
  • 11 56
  • 12 65
  • 13 72
  • 14 77
  • 15 83
  • How many elements are examined in the worst case?

Now let's search for the value 24.
(0+15)/2 -> 7
(0+6)/2 -> 3
(0+2)/2 -> 1
(2+2)/2 -> 2
(3+2)/2 -> ?? left + 1 == right, so stop
19
Case Study Maximum Subsequence
  • Given a sequence of integers a1, a2, ..., an, find
    the contiguous subsequence that gives you the
    largest sum.
  • Since there is no size limit on the subsequence,
    if the sequence is all positive or all negative
    then the solution is trivial.

20
Simple Solution
  • Look at all possible combinations of start and
    stop positions of the subsequence.
    for (i = 0; i < n; i++)
      for (j = i; j < n; j++) {
        thissum = 0;
        for (k = i; k <= j; k++)
          thissum = thissum + a[k];
        if (thissum > maxsum) maxsum = thissum;
      }

21
Analysis of Simple Solution
  • The inner loop is executed j-i+1 times.
  • The middle loop changes j from i to n-1.
  • Summing j-i+1 as j goes from i to n-1,
    we have 1 + 2 + ... + (n-i).
  • So this is done (n-i+1)(n-i)/2 times.
  • (n^2 - 2ni + i^2 + n - i)/2

22
More Analysis
  • The outer loop changes i from 0 to n-1.
  • n^2 summed n times is n^3.
  • The sum of 2ni when i changes from 0 to n-1 is
    2n·(n-1)n/2 = n^3 - n^2.
  • The sum of i^2 when i changes from 0 to n-1 is
    (n-1)n(2n-1)/6 = (2n^3 - 3n^2 + n)/6.
  • The sum of n when i changes is n^2.
  • The sum of i when i changes from 0 to n-1 is
    (n-1)n/2 = (n^2 - n)/2.
  • Total is (n^3 - (n^3 - n^2) + (2n^3 - 3n^2 + n)/6
    + n^2 - (n^2 - n)/2)/2.
  • This is O(n^3).

23
An improved Algorithm
  • Start at position i and find the sum of all
    subsequences that start at position i. Then
    repeat for all starting positions.
    for (i = 0; i < n; i++) {
      thissum = 0;
      for (j = i; j < n; j++) {
        thissum = thissum + a[j];
        if (thissum > maxsum) maxsum = thissum;
      }
    }

24
Analysis of Improved Algorithm
  • The inner loop goes from i to n-1.
  • When i is 0, this is n times.
  • When i is 1, it is n-1 times.
  • Until i is n-1, then 1 time.
  • Summing this up backwards, we get 1 + 2 + ... + n
    = n(n+1)/2 = (n^2 + n)/2 = O(n^2).

25
Final great algorithm
    thissum = 0;
    maxsum = 0;
    for (j = 0; j < n; j++) {
      thissum = thissum + a[j];
      if (thissum > maxsum) maxsum = thissum;
      else if (thissum < 0) thissum = 0;
    }
  • O(n)

26
Analyzing Problems
  • Upper bound The upper bound of best known
    algorithm to solve the problem.
  • Lower bound The lower bound for every possible
    algorithm to solve that problem, even unknown
    algorithms.
  • Example Sorting
  • Cost of I/O: Ω(n).
  • Bubble or insertion sort: O(n^2).
  • A better sort (Quicksort, Mergesort, Heapsort):
    O(n log n).
  • We prove in Chapter 8 that sorting is Ω(n log n).

27
Multiple Parameters
  • Compute the rank ordering for all C pixel values
    in a picture of P pixels.
  • Monitors have a fixed number of colors (256, 16M,
    64M).
  • Need to count the number of each color and
    determine the most used and least used colors.
    for (i = 0; i < C; i++)
      count[i] = 0;
    for (i = 0; i < P; i++)
      count[value[i]]++;
    sort(count);
  • If we use P as the measure, then time is O(P log P).
  • Which is bigger, C or P? 600 x 400 = 240,000;
    1024 x 1024 = 1M.
  • More accurate is O(P + C log C).

28
Space Bounds
  • Space bounds can also be analyzed with asymptotic
    complexity analysis.
  • Time Algorithm
  • Space Data Structure
  • Space/Time Tradeoff Principle
  • One can often achieve a reduction in time if one
    is willing to sacrifice space, or vice versa.
  • Encoding or packing information
  • A Boolean flag takes one bit, but a byte is the
    smallest addressable unit of storage, so pack 8
    Booleans into 1 byte. Takes more time, less space.
  • Table Lookup
  • Factorials - Compute once, use many times
  • Disk based Space/Time Tradeoff Principle
  • The smaller you can make your disk storage
    requirements, the faster your program will run.
    Disk is slow.