Introduction in Computer Science 2: Asymptotic Complexity



Introduction in Computer Science 2: Asymptotic Complexity
DEEDS Group, TU Darmstadt
Prof. Neeraj Suri, Constantin Sarbu, Brahim Ayari, Dan Dobre, Abdelmajid Khelil
Remember: Sequential Search
Given: an array A of integers and a constant c.
Question: Is c in A?
  • Memory complexity (in Java):
  • int: 4 bytes, boolean: 1 byte
  • Memory used:
  • size(A) + size(c) + size(n) + size(i) + size(found)
  • = 4n + 4 + 4 + 4 + 1 = 4n + 13 bytes

boolean contains(int[] A, int c) {
    int n = A.length;
    boolean found = false;
    for (int i = 0; i < n; i++) {
        if (A[i] == c) found = true;
    }
    return found;
}
Time Complexity: Counting Operations

  Input A              c   n   Assignments    Comparisons   Array accesses   Increments
  1,4,2,7              6   4   1+1+1+0 = 3    4+4 = 8       4                4
  2,7,6,1              2   4   1+1+1+1 = 4    4+4 = 8       4                4
  2,1,8,4,19,7,16,3    5   8   1+1+1+0 = 3    8+8 = 16      8                8
  4,4,4,4,4,4          4   6   1+1+1+6 = 9    6+6 = 12      6                6
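The counts in the table can be reproduced by instrumenting the search with counters. The class and counter names below are our own, a minimal sketch rather than part of the original slides; the final, failing loop-condition check is not counted, matching the table's convention.

```java
// Instrumented sequential search: reproduces the operation counts above.
public class OpCount {
    static int assignments, comparisons, arrayAccesses, increments;

    static boolean contains(int[] A, int c) {
        int n = A.length;
        boolean found = false;
        assignments += 3;                 // n, found, and the loop variable i
        for (int i = 0; i < n; i++) {
            comparisons += 2;             // i < n and A[i] == c
            arrayAccesses++;              // A[i]
            if (A[i] == c) {
                found = true;
                assignments++;            // found = true
            }
            increments++;                 // i++
        }
        return found;
    }

    public static void main(String[] args) {
        contains(new int[]{4, 4, 4, 4, 4, 4}, 4);  // last row of the table
        System.out.println(assignments + " " + comparisons + " "
                + arrayAccesses + " " + increments);  // 9 12 6 6
    }
}
```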
Why Asymptotic Complexity?
  • Time complexity
  • gives a simple characterization of an algorithm's efficiency
  • allows comparing it to alternative algorithms
  • In the last lecture we determined exact running times, but the extra precision is usually not worth the effort of computing it
  • For large input sizes, constants and lower-order terms are ruled out
  • This means we are studying the asymptotic complexity of algorithms
  • ⇒ we are interested in how the running time increases with the size of the input, in the limit
  • Usually, an algorithm that is asymptotically more efficient is not the best choice for very small inputs :-)

Today: Efficiency Metrics - Complexity
  • Upper bounds: O (big O) notation
  • Properties, proving f ∈ O(g), sum and product rules
  • Loops, conditional statements, conditions
  • Examples: sequential search, selection sort
  • Lower bounds: Ω (Omega) notation
  • Exact bounds: Θ (Theta) notation

Asymptotic Time Complexity: Upper Bound
∃ c > 0, ∃ n0 ∈ N such that ∀ n > n0: c·g(n) > f(n)
O-Notation (pronounced "big Oh")
  • Given f: N → R, g: N → R
  • Definition:
  • O(g) = { f | ∃ n0 ∈ N, ∃ c ∈ R, c > 0: ∀ n ≥ n0: f(n) ≤ c·g(n) }
  • Intuitively:
  • O(g) is the set of all functions f that grow at most as fast as g
  • One says:
  • If f ∈ O(g), then g is an asymptotic upper bound for f

  • O(n^4) ⊇ { …, n, n^2, n·log n, n^3, n^4, 3n^4, c·n^3, … }
  • n^3 ∈ O(n^4)
  • n·log n ∈ O(n^4)
  • n^4 ∈ O(n^4)
  • Generally: slower growth ∈ O(faster growth)

  • Often one writes f = O(g) instead of f ∈ O(g)
  • But f = O(g) is not an equality in the usual sense; it can only be read from left to right!
  • Normally, for the analysis of algorithms:
  • f: N → N and g: N → N,
  • since the argument is the size of the input data and the value is the number of elementary operations
  • For average-case analysis the codomain R is also used:
  • f: N → R and g: N → R

Example: O-Notation
  • T1(n) = n + 3 ∈ O(n), because n + 3 ≤ 2n for all n ≥ 3
  • T2(n) = 3n + 7 ∈ O(n)
  • T3(n) = 1000n ∈ O(n)
  • T4(n) = 695n^2 + 397n + 6148 ∈ O(n^2)
  • The functions considered are mostly monotonically increasing and ≥ 0.
  • Criterion for finding f ∈ O(g):
  • If f(n) / g(n) ≤ c for all n ≥ some n0, then f ∈ O(g)
  • Example: this holds in particular when the limit exists:

lim_{n→∞} f(n) / g(n) ≤ c
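As a worked instance of the limit criterion (our own example, consistent with T1 above):

```latex
\lim_{n\to\infty} \frac{n+3}{n}
  = \lim_{n\to\infty}\Bigl(1 + \frac{3}{n}\Bigr) = 1 < \infty
  \;\Rightarrow\; n+3 \in O(n),
\qquad
\lim_{n\to\infty} \frac{n\log n}{n^2}
  = \lim_{n\to\infty} \frac{\log n}{n} = 0
  \;\Rightarrow\; n\log n \in O(n^2).
```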
Proving that f ∈ O(g)
  • The proof has two parts:
  • Finding the closed form
  • Solving the inequality f(n) ≤ c·g(n) from the definition
  • Illustration using an example:
  • A is an algorithm which sorts a set of numbers in increasing order
  • Assumption: A performs according to f(n) = 3 + 6 + 9 + ... + 3n
  • Proposition: A has the complexity O(n^2)
  • Closed form for f(n) = 3 + 6 + 9 + ... + 3n:
  • f(n) = 3(1 + 2 + 3 + ... + n) = 3n(n+1)/2

Proving that f ∈ O(g)
  • Task: Find a value c for which 3n(n+1)/2 ≤ c·n^2 (for all n greater than some n0)
  • Try c = 3: 3n(n+1)/2 ≤ 3n^2 ⇔ n^2 + n ≤ 2n^2 ⇔ n ≤ n^2 ⇔ 1 ≤ n, which holds for all n ≥ 1. Q.E.D.
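The inequality from the proof can also be sanity-checked numerically; this sketch (our own, not from the slides) verifies it for the first million values of n.

```java
// Numerical sanity check of the proof step: 3n(n+1)/2 <= 3n^2 for all n >= 1.
public class BoundCheck {
    public static void main(String[] args) {
        for (long n = 1; n <= 1_000_000; n++) {
            long f = 3 * n * (n + 1) / 2;   // closed form of 3 + 6 + ... + 3n
            if (f > 3 * n * n) {
                throw new AssertionError("bound fails at n = " + n);
            }
        }
        System.out.println("3n(n+1)/2 <= 3n^2 holds for n = 1..1000000");
    }
}
```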

Consequences of the O-Notation
  • O-Notation is a simplification:
  • It eliminates constants: O(n) = O(n/2) = O(17n)
  • It forms an upper bound, i.e.
  • from f(n) ∈ O(n log2 n) it follows that f(n) ∈ O(n^2)
  • For O-Notation the base of the logarithm is irrelevant, since log_a n = log_b n / log_b a, i.e. logarithms to different bases differ only by a constant factor

Properties of O-Notation
  • Inclusion relations of the O-Notation: O(1) ⊂ O(log n) ⊂ O(n) ⊂ O(n log n) ⊂ O(n^2) ⊂ O(n^3) ⊂ O(2^n) ⊂ O(10^n)
  • ⇒ We try to make the bounds as tight as possible
  • Rule:

Calculating the Time Complexity
  • The time complexity of a program follows from the complexity of its parts
  • The complexity of an elementary operation is O(1) (elementary operations are, e.g., assignment, comparison, arithmetic operations, array access, …)
  • A fixed sequence of elementary operations (independent of the input size n) also has complexity O(1)

Sum and Product Rules
  • Given two program parts with time complexities T1 ∈ O(g1) and T2 ∈ O(g2):
  • Summation rule: for the execution of T1 followed by T2, T(n) = T1(n) + T2(n) ∈ O(max(g1(n), g2(n)))
  • Product rule: for the nested execution of T1 and T2, T(n) = T1(n) · T2(n) ∈ O(g1(n) · g2(n))

Loops in Series
  • Loops in series (n and m are the problem sizes):

    for (int i = 0; i < n; i++) { operation; }
    for (int j = 0; j < m; j++) { operation; }

  • Complexity: O(n + m) = O(max(n, m)) (sum rule)

Nested Loops
  • Nested loops (n is the problem size)
  • When the inner loop does not depend on the problem size, e.g.

    for (int i = 0; i < n; i++)
        for (int j = 0; j < 17; j++) { operation; }

    Complexity: O(17n) = O(n) (product rule)
  • Otherwise:

    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++) { operation; }

    Complexity: O(n^2) (product rule)
  • Example: reading the data of an n x n matrix is very expensive (O(n^2))!
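The matrix example can be sketched directly; the class and method names below are our own.

```java
// Reading all entries of an n x n matrix: two nested loops over n → n^2 reads.
public class MatrixRead {
    // Counts the elementary reads needed to visit every entry of the matrix.
    static long countReads(int[][] M) {
        long operations = 0;
        long sum = 0;
        for (int i = 0; i < M.length; i++) {
            for (int j = 0; j < M[i].length; j++) {
                sum += M[i][j];          // one elementary array access per entry
                operations++;
            }
        }
        return operations;
    }

    public static void main(String[] args) {
        int n = 100;
        System.out.println(countReads(new int[n][n]));  // n^2 = 10000 → O(n^2)
    }
}
```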

Conditional Statement
  • Conditional statement: if B then T1 else T2
  • The cost of evaluating the if itself is constant, therefore negligible
  • T(n) = T1(n) or T(n) = T2(n)
  • Good (if decidable): the longer branch is chosen, i.e., the dominant branch should be used
  • An upper-bound estimate is also possible:
  • T(n) ≤ T1(n) + T2(n) ∈ O(g1(n) + g2(n))

Condition: Example
  • Loop with condition (n is the problem size):

    for (int i = 0; i < n; i++) {
        if (i == 0) block1;
        else block2;
    }

  • block1 is executed only once ⇒ not relevant
  • (as long as T(block1) is not much larger than n·T(block2))
  • block2 is dominant
  • Complexity: O(n·T(block2))
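A quick count (our own sketch; the two counters stand in for the blocks) confirms that the first branch runs only once while the second dominates:

```java
// Branch counts for the loop-with-condition example above.
public class BranchCount {
    // Returns how often each branch of the conditional runs for problem size n.
    static int[] count(int n) {
        int block1 = 0, block2 = 0;      // stand-ins for the two blocks
        for (int i = 0; i < n; i++) {
            if (i == 0) block1++;        // "block1": runs exactly once
            else        block2++;        // "block2": runs n - 1 times → dominant
        }
        return new int[]{block1, block2};
    }

    public static void main(String[] args) {
        int[] c = count(1000);
        System.out.println(c[0] + " " + c[1]);  // 1 999 → O(n · T(block2))
    }
}
```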

Procedure Calls
  • Procedures are analyzed separately, and their execution times are inserted for each call
  • For recursive procedure calls a recurrence relation for T(n) must be found
  • Once again: find a closed form for the recurrence relation (an example follows shortly)

Analysis of Simple Algorithms
  • Iterative algorithms (today)
  • Composed of smaller parts ⇒ sum rule
  • Contain loops ⇒ product rule
  • Recursive algorithms (next lecture)
  • Time factors:
  • Breaking a problem into several smaller ones
  • Solving the sub-problems
  • Recursive calls of the method for solving the sub-problems
  • Combining the solutions of the sub-problems

Example 1: Sequential Search

boolean contains(int[] A, int c) {
    int n = A.length;
    boolean found = false;
    for (int i = 0; i < n; i++) {
        if (A[i] == c) found = true;
    }
    return found;
}

  • The cost consists of a constant part a and a part b that is repeated n times
  • T(n) = a + b·n
  • T(n) ∈ O(n)

Example 2: Selection Sort

void selectionSort(int[] A) {
    int maxPosition, temp;
    for (int i = A.length - 1; i > 0; i--) {
        maxPosition = i;                 // position of the largest element in A[0..i]
        for (int j = 0; j < i; j++) {
            if (A[j] > A[maxPosition]) maxPosition = j;
        }
        temp = A[i];                     // move it to the end of the unsorted part
        A[i] = A[maxPosition];
        A[maxPosition] = temp;
    }
}

  • The inner loop is executed i times, i < n ⇒ upper bound c·n
  • The outer loop is executed n - 1 times, with constant cost b per iteration
  • Costs: n·(b + c·n) = b·n + c·n^2 ⇒ O(n^2)
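A self-contained version of this sort, instrumented with a comparison counter (the counter and class name are our addition), makes the n(n-1)/2 inner-loop cost visible:

```java
// Selection sort moving the maximum of A[0..i] to position i each round.
public class SelectionSortDemo {
    static long comparisons = 0;

    static void selectionSort(int[] A) {
        for (int i = A.length - 1; i > 0; i--) {
            int maxPosition = i;
            for (int j = 0; j < i; j++) {
                comparisons++;                       // one comparison per inner step
                if (A[j] > A[maxPosition]) maxPosition = j;
            }
            int temp = A[i];                         // swap max into place
            A[i] = A[maxPosition];
            A[maxPosition] = temp;
        }
    }

    public static void main(String[] args) {
        int[] A = {5, 2, 9, 1, 7, 3};
        selectionSort(A);
        System.out.println(java.util.Arrays.toString(A));  // [1, 2, 3, 5, 7, 9]
        // Inner loop runs i = n-1, n-2, ..., 1 times: n(n-1)/2 comparisons total.
        System.out.println(comparisons);                   // 6*5/2 = 15
    }
}
```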

Ω (Omega) - Notation
  • Analogous to O(g) we have:
  • Ω(g) = { h | ∃ c > 0, ∃ n0 > 0: ∀ n > n0: h(n) ≥ c·g(n) }
  • Intuitively:
  • Ω(g) is the set of all functions that grow at least as fast as g
  • One says:
  • if f ∈ Ω(g), then g is a lower bound for f.
  • Note: f ∈ O(g) ⇔ g ∈ Ω(f)

Example: Ω-Notation
∃ c2, n0 > 0 such that f(n) ≥ c2·g(n) for all n ≥ n0
g(n) is a lower bound for f(n)
Θ (Theta) - Notation
  • With the sets O(g) and Ω(g) we can define:
  • Θ(g) = O(g) ∩ Ω(g)
  • Intuitively:
  • Θ(g) is the set of functions that grow exactly as fast as g
  • Meaning: if f ∈ O(g) and f ∈ Ω(g), then f ∈ Θ(g)
  • In this case one speaks of an exact bound
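For the sorting cost f(n) = 3n(n+1)/2 from the earlier proof, both bounds hold at once (the constants below are our own worked choice):

```latex
\frac{3}{2}\,n^2 \;\le\; \frac{3n(n+1)}{2} \;\le\; 3n^2
\quad \text{for all } n \ge 1,
```

so with c1 = 3/2, c2 = 3 and n0 = 1 we get 3n(n+1)/2 ∈ Θ(n^2).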

Example: Θ-Notation
∃ c1, c2, n0 > 0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0
g(n) is an exact bound for f(n)
Non-Asymptotic Execution Time
  • Algorithms with a higher asymptotic complexity can be more efficient for smaller problem sizes
  • The asymptotic execution time only holds beyond a certain n
  • The constants do make a difference for smaller input sets
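The cost models below are hypothetical numbers of our own, not from the slides; they illustrate how an O(n^2) algorithm with small constants beats an O(n) algorithm with large constants below the crossover point:

```java
// Hypothetical cost models: O(n) at 100n per input vs. O(n^2) at 2n^2.
public class Crossover {
    static long linearCost(long n)    { return 100 * n; }
    static long quadraticCost(long n) { return 2 * n * n; }

    public static void main(String[] args) {
        for (int n : new int[]{10, 50, 100}) {
            System.out.println("n=" + n
                    + ": 100n=" + linearCost(n)
                    + ", 2n^2=" + quadraticCost(n));
        }
        // For n < 50 the asymptotically worse algorithm is cheaper;
        // the crossover is at n = 50, beyond which the asymptotics win.
    }
}
```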

Complexity and Recursion
  • Up to now, we've seen only iterative algorithms
  • What about recursive algorithms?
  • Next week: refreshing recursion
  • Then: complexity analysis with recurrence relations