Chapter 3 contd. - PowerPoint PPT Presentation

Provided by: georgejgr · Slides: 41

Transcript and Presenter's Notes
1
Chapter 3 contd.
• Adjacency, Histograms, Thresholding

2
(No Transcript)
3
RAGs (Region Adjacency Graphs)
• Steps
• label image
• scan and enter adjacencies in graph
• (includes containment)

4
(No Transcript)
5
Define degree of a node. What is special about
nodes with degree 1?
6
But how do we obtain binary images?
7
Histograms & Thresholding
8
Gray to binary
• Thresholding
• G → B
• const int t = 200;
• if (G[r][c] > t) B[r][c] = 1;
• else B[r][c] = 0;
• How do we choose t?
• Interactively
• Automatically

9
Gray to binary
• Interactively. How?
• Automatically.
• Many, many, many, …, many methods.
• Experimentally (using a priori information).
• Supervised / training methods.
• Unsupervised
• Otsu's method (among many, many, many, many
other methods).

10
Histogram
• Probability of a given gray value in an image.
• h(g) = count of pixels w/ gray value equal to g.
• p(g) = h(g) / (w×h)
• w×h = # of pixels in entire image
• What is the range of possible values for p(g)?
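As a sketch of these definitions (assuming an 8-bit image flattened into a vector of gray values; the function names are illustrative):

```cpp
#include <array>
#include <vector>

// h(g): count of pixels whose gray value equals g (8-bit image assumed).
std::array<int, 256> histogram(const std::vector<int>& pixels) {
    std::array<int, 256> h{};
    for (int g : pixels) ++h[g];
    return h;
}

// p(g) = h(g) / (w*h); each p(g) lies in [0, 1] and the values sum to 1.
std::array<double, 256> probabilities(const std::array<int, 256>& h,
                                      int numPixels) {
    std::array<double, 256> p{};
    for (int g = 0; g < 256; ++g)
        p[g] = static_cast<double>(h[g]) / numPixels;
    return p;
}
```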

11
Histogram
• Note Sometimes we need to group gray values
together in our histogram into bins or
buckets.
• E.g., we have 10 bins in our histogram and 100
possible different gray values. So we put 0..9
into bin 0, 10..19 into bin 1,
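The binning rule above can be sketched as (assuming the bin count divides the number of gray values evenly; `binIndex` is an illustrative name):

```cpp
// Map a gray value to its bin: with 100 gray values and 10 bins, the bin
// width is 10, so 0..9 -> bin 0, 10..19 -> bin 1, and so on.
int binIndex(int g, int numGrayValues, int numBins) {
    int binWidth = numGrayValues / numBins;  // assumes an even division
    return g / binWidth;
}
```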

12
Histogram
13
Something is missing here!
14
Example of histogram
15
Example of histogram
We can even analyze the histogram just as we
analyze images. One common measure is entropy
16
Example of histogram
We can even analyze the histogram just as we
analyze images. One common measure is entropy
17
Calculating entropy
• Notes
• p(k) is in [0,1]
• If p(k) = 0 then don't calculate log(p(k)). Why?
• My calculator only has log base 10. How do I
calculate log base 2?
• Why is there a "−" to the left of the summation?
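The notes above might look like this in code (assuming the histogram has already been normalized to probabilities; `entropy` is an illustrative name):

```cpp
#include <cmath>
#include <vector>

// Entropy (in bits) of a normalized histogram p.
// Terms with p[k] == 0 are skipped: log2(0) is undefined, and the limit of
// p*log2(p) as p -> 0 is 0. With only a base-10 log available,
// log2(x) = log10(x) / log10(2). The leading minus makes H >= 0, because
// log2(p[k]) <= 0 whenever p[k] is in (0, 1].
double entropy(const std::vector<double>& p) {
    double H = 0.0;
    for (double pk : p)
        if (pk > 0.0)
            H -= pk * std::log2(pk);
    return H;
}
```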

18
Example histograms
Same subject but different images and histograms
(because of difference in contrast).
19
Example of different thresholds
20
So how can we determine the threshold
automatically?
21
Otsu's method
• Automatic thresholding method
• automatically picks t given an image histogram
• Assume 2 groups are present in the image
• Those that are < t
• Those that are > t

22
Otsu's method
Best choices for t.
23
Otsu's method
• For every possible t
• Pick a t.
• Calculate within group variances
• probability of being in group 1
• probability of being in group 2
• determine mean of group 1
• determine mean of group 2
• calculate variance for group 1
• calculate variance for group 2
• calculate weighted sum of group variances and
remember which t gave rise to minimum.
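The loop above could be sketched as follows (assuming a normalized histogram indexed by gray value; `otsuThreshold` is an illustrative name, and this is a direct minimization rather than an optimized incremental formulation):

```cpp
#include <limits>
#include <vector>

// Otsu's method over a normalized histogram p (p[g] = fraction of pixels
// with gray value g). For each candidate t, gray values <= t form group 1
// and values > t form group 2; return the t whose weighted sum of
// within-group variances, q1*var1 + q2*var2, is smallest.
int otsuThreshold(const std::vector<double>& p) {
    const int L = static_cast<int>(p.size());
    int bestT = 0;
    double bestWithin = std::numeric_limits<double>::max();
    for (int t = 0; t < L - 1; ++t) {
        double q1 = 0, q2 = 0, mu1 = 0, mu2 = 0;
        for (int g = 0; g <= t; ++g)    { q1 += p[g]; mu1 += g * p[g]; }
        for (int g = t + 1; g < L; ++g) { q2 += p[g]; mu2 += g * p[g]; }
        if (q1 == 0.0 || q2 == 0.0) continue;  // one group empty: skip this t
        mu1 /= q1;  mu2 /= q2;                 // group means
        double v1 = 0, v2 = 0;                 // group variances
        for (int g = 0; g <= t; ++g)    v1 += (g - mu1) * (g - mu1) * p[g] / q1;
        for (int g = t + 1; g < L; ++g) v2 += (g - mu2) * (g - mu2) * p[g] / q2;
        double within = q1 * v1 + q2 * v2;     // weighted sum to minimize
        if (within < bestWithin) { bestWithin = within; bestT = t; }
    }
    return bestT;
}
```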

24
Otsu's method: probability of being in each group
25
Otsu's method: mean of individual groups
26
Otsu's method: variance of individual groups
27
Otsu's method: weighted sum of group variances
• Calculate for all t's and minimize.
• Demo: Otsu.

28
(No Transcript)
29
Generalized thresholding
• Single range of gray values
• const int t1 = 200;
• const int t2 = 500;
• if (G[r][c] > t1 && G[r][c] < t2) B[r][c] = 1;
• else B[r][c] = 0;

30
Even more general thresholding
• Union of ranges of gray values.
• const int t1 = 200, t2 = 500;
• const int t3 = 1200, t4 = 1500;
• if (G[r][c] > t1 && G[r][c] < t2) B[r][c] = 1;
• else if (G[r][c] > t3 && G[r][c] < t4) B[r][c] = 1;
• else B[r][c] = 0;

31
Something is missing here!
32
K-Means Clustering
• Clustering = the process of partitioning a set of
pattern vectors into subsets called clusters.
• K = number of clusters (known in advance).
• Not an exhaustive search so it may not find the
globally optimal solution.
• (see section 10.1.1)

33
Iterative K-Means Clustering Algorithm
• Form K-means clusters from a set of n-D feature
vectors.
• Set ic = 1 (iteration count).
• Choose randomly a set of K means m1(1), m2(1),
…, mK(1).
• For each vector xi compute D(xi, mj(ic)) for each
j = 1, …, K.
• Assign xi to the cluster Cj with the nearest
mean.
• ic = ic + 1; update the means to get a new set
m1(ic), m2(ic), …, mK(ic).
• Repeat steps 3..5 until Cj(ic+1) = Cj(ic) for all j.

34
K-Means for Optimal Thresholding
• What are the features?

35
K-Means for Optimal Thresholding
• What are the features?
• Individual pixel gray values

36
K-Means for Optimal Thresholding
• What value for K should be used?

37
K-Means for Optimal Thresholding
• What value for K should be used?
• K = 2 to be like Otsu's method.

38
Iterative K-Means Clustering Algorithm
• Form 2 clusters from a set of pixel gray values.
• Set ic = 1 (iteration count).
• Choose 2 random gray values as our initial K
means, m1(1) and m2(1).
• For each pixel gray value xi compute
fabs(xi − mj(ic)) for each j = 1, 2.
• Assign xi to the cluster Cj with the nearest
mean.
• ic = ic + 1; update the means to get a new set
m1(ic) and m2(ic).
• Repeat steps 3..5 until Cj(ic+1) = Cj(ic) for all j.
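The K = 2 steps above might be sketched as follows (names are illustrative; doubles are used so the updated means need not be integers, and an iteration cap is added as a safeguard):

```cpp
#include <cmath>
#include <cstddef>
#include <utility>
#include <vector>

// Two-means clustering of pixel gray values; m1 and m2 are the initial
// means. Returns the converged pair of means.
std::pair<double, double> twoMeans(const std::vector<int>& pixels,
                                   double m1, double m2) {
    std::vector<int> label(pixels.size(), -1);
    for (int ic = 1; ic <= 100; ++ic) {       // iteration cap as a safeguard
        bool changed = false;
        // Assign each gray value to the cluster with the nearest mean.
        for (std::size_t i = 0; i < pixels.size(); ++i) {
            int j = (std::fabs(pixels[i] - m1) <= std::fabs(pixels[i] - m2))
                        ? 0 : 1;
            if (j != label[i]) { label[i] = j; changed = true; }
        }
        if (!changed) break;                  // Cj(ic+1) == Cj(ic): converged
        // Update each mean from its current cluster.
        double s1 = 0, s2 = 0;
        int n1 = 0, n2 = 0;
        for (std::size_t i = 0; i < pixels.size(); ++i)
            if (label[i] == 0) { s1 += pixels[i]; ++n1; }
            else               { s2 += pixels[i]; ++n2; }
        if (n1 > 0) m1 = s1 / n1;
        if (n2 > 0) m2 = s2 / n2;
    }
    return {m1, m2};
}
```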

39
Iterative K-Means Clustering Algorithm
• Example.
• m1(1) = 260.83, m2(1) = 539.00
• m1(2) = 39.37, m2(2) = 1045.65
• m1(3) = 52.29, m2(3) = 1098.63
• m1(4) = 54.71, m2(4) = 1106.28
• m1(5) = 55.04, m2(5) = 1107.24
• m1(6) = 55.10, m2(6) = 1107.44
• m1(7) = 55.10, m2(7) = 1107.44
• …
• Demo: K-means.

40
Otsu vs. K-Means
• Otsu's method as presented determines the single
best threshold.
• How many objects can it discriminate?
• Suggest a modification to discriminate more.
• How is Otsu's method similar to K-Means?