Transcript and Presenter's Notes

Title: Pattern Recognition: Statistical and Neural


1
Nanjing University of Science & Technology
Pattern Recognition: Statistical and Neural
Lonnie C. Ludeman, Lecture 18, Oct 21, 2005
2
Lecture 18 Topics
1. Example: Generalized Linear Discriminant Function
2. Weight Space
3. Potential Function Approach - 2-class case
4. Potential Function Example - 2-class case
5. Potential Function Algorithm - M-class case
3
[Figure: scatter of samples from C1 and C2 in the (x1, x2) pattern space; the classes are not linearly separable]
Q. How can we find decision boundaries?
Answers
(1) Use Generalized Linear Discriminant functions
(2) Use Nonlinear Discriminant Functions
4
Example: Generalized Linear Discriminant Functions
[Figure: samples from C1 and C2 plotted in the (x1, x2) pattern space]
Given Samples from 2 Classes
5
Find a generalized linear discriminant function
that separates the classes
Solution
d(x) = w1 f1(x) + w2 f2(x) + w3 f3(x) + w4 f4(x) + w5 f5(x) + w6 f6(x)
     = wT f(x)
in the f space (linear)
6
where each component function fi(x) is defined in the original pattern space (nonlinear)
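As an illustration of how d(x) = wT f(x) is linear in the f space but nonlinear in the pattern space, here is a minimal sketch; the particular quadratic feature map and the weight values are assumptions for illustration, not taken from the slides:

```python
import numpy as np

def phi(x):
    """Quadratic feature map f(x) for a 2-D pattern x = (x1, x2).
    Six components to match the six weights w1..w6 on the slide;
    this particular choice of f is an assumption."""
    x1, x2 = x
    return np.array([x1**2, x2**2, x1 * x2, x1, x2, 1.0])

def d(w, x):
    """Generalized linear discriminant d(x) = w^T f(x):
    linear in the f space, quadratic in the pattern space."""
    return w @ phi(x)

# With these (made-up) weights the boundary d(x) = 0 is a conic in (x1, x2).
w = np.array([1.0, 4.0, 0.0, -2.0, -8.0, 1.0])
print(d(w, (1.0, 1.0)))  # prints -4.0
```

The classifier itself stays linear: only the fixed map phi carries the nonlinearity, so linear training methods such as the perceptron still apply in the f space.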
7
Use the Perceptron Algorithm in the f space
(the iterations follow)
Iteration
[Table of iterations, with columns: Iteration | Samples | Action | Weights]
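A minimal sketch of the fixed-increment perceptron rule run in the f space (the learning rate, sample ordering, and stopping rule here are assumptions): samples from C2 are negated so every training vector must satisfy wT z > 0, and each misclassified vector is added to the weight vector until one full pass produces no errors.

```python
import numpy as np

def perceptron_f_space(F, labels, c=1.0, max_passes=100):
    """Fixed-increment perceptron in the f space.
    F: rows are feature vectors f(x_k); labels: +1 for C1, -1 for C2.
    Negating the C2 samples reduces both class conditions to w^T z > 0."""
    Z = F * labels[:, None]            # z_k = label_k * f(x_k)
    w = np.zeros(F.shape[1])
    for _ in range(max_passes):
        errors = 0
        for z in Z:                    # one pass through the data
            if w @ z <= 0:             # misclassified: adjust weights
                w = w + c * z
                errors += 1
        if errors == 0:                # one clean pass: converged
            return w
    return w
```

On a linearly separable set (in the f space) the loop terminates after a finite number of passes; on non-separable data, max_passes bounds the work.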
8
d(x)
Iterations Continue
Iterations Stop
9
The discriminant function is as follows:
Decision boundary: set d(x) = 0
Putting it in standard form, we get the decision boundary as the following ellipse:
10
Decision Boundary in original pattern space
[Figure: samples from C1 and C2 in the (x1, x2) pattern space, with the elliptical decision boundary d(x) = 0]
11
Weight Space
To separate two pattern classes C1 and C2 by a hyperplane, we must satisfy the following conditions:

wT x > 0 for all x from C1
wT x < 0 for all x from C2

where wT x = 0 specifies the boundary between the classes.
12
But we know that wT x = xT w.
Thus we can now write the inequalities in the w space, with coefficients representing the samples, as follows:
Each inequality gives a hyperplane boundary in
the weight space such that weights on the
positive side would satisfy the inequality
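The weight-space view can be checked numerically: each training sample contributes one inequality (a half-space) in w space, and a separating weight vector must lie in the intersection of all the positive sides. A small sketch with made-up sample values:

```python
import numpy as np

def feasible(w, X, labels):
    """Check whether w lies on the positive side of every sample's
    hyperplane in weight space (x^T w > 0, with C2 samples negated)."""
    Z = X * labels[:, None]            # negate the C2 rows
    return bool(np.all(Z @ w > 0))

# Illustrative samples (not from the slides): two from C1, one from C2.
X = np.array([[1.0, 2.0], [2.0, 1.0], [-1.0, -1.0]])
labels = np.array([1.0, 1.0, -1.0])
print(feasible(np.array([1.0, 1.0]), X, labels))  # prints True
```

Any w for which feasible() returns True is a solution weight vector; the perceptron iteration can be viewed as a walk through w space toward this feasible region.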
13
In the Weight Space
14
View of the Perceptron algorithm in the weight space
15
Potential Function Approach: motivated by electromagnetic theory

[Figure: sample space with + samples from C1 and - samples from C2]
16
Given samples x from two classes C1 and C2, let S1 be the set of samples from C1 and S2 the set of samples from C2.

Define the Total Potential Function

K(x) = Σ_{xk ∈ S1} K(x, xk) - Σ_{xk ∈ S2} K(x, xk)

where K(x, xk) is the potential function.

Decision boundary: K(x) = 0
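A minimal sketch of the total potential function, using the exponential potential K(x, xk) = exp(-alpha ||x - xk||^2) as one standard choice (the value of alpha here is an assumption):

```python
import numpy as np

def potential(x, xk, alpha=1.0):
    """Exponential potential K(x, x_k) = exp(-alpha * ||x - x_k||^2),
    one common choice of potential function."""
    diff = np.asarray(x) - np.asarray(xk)
    return np.exp(-alpha * (diff @ diff))

def total_potential(x, S1, S2, alpha=1.0):
    """Total potential K(x): + contributions from the C1 samples (S1),
    - contributions from the C2 samples (S2). Boundary: K(x) = 0."""
    return (sum(potential(x, xk, alpha) for xk in S1)
            - sum(potential(x, xk, alpha) for xk in S2))
```

Points near C1 samples get a positive total potential and points near C2 samples a negative one, so the sign of K(x) classifies x.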
17
Choices for Potential functions K(x, xk)
18
Graphs of Potential functions
19
(No Transcript)
20
(No Transcript)
21
(No Transcript)
22
Example Using Potential functions
Given the following Patterns from two classes
Find a nonlinear discriminant function, using potential functions, that separates the classes
23
Plot of Samples from the two classes
24
Trace of Iterations
25
The algorithm converged in 1.75 passes through the data, giving the final discriminant function:
26
KFINAL(x)
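The 2-class potential-function training procedure can be sketched as follows: cycle through the samples, add a +K(., xk) term when a C1 sample has K(x) <= 0 and a -K(., xk) term when a C2 sample has K(x) >= 0, and stop after an error-free pass. The kernel choice and toy data below are assumptions for illustration:

```python
import numpy as np

def train_potential_classifier(X, labels, kernel, max_passes=50):
    """Two-class potential-function algorithm (sketch).
    Keeps the cumulative discriminant as a list of (sign, sample) terms:
    K(x) = sum of sign_k * kernel(x, x_k) over the stored terms."""
    terms = []                                   # accumulated potential terms

    def K(x):
        return sum(s * kernel(x, xk) for s, xk in terms)

    for _ in range(max_passes):
        errors = 0
        for x, y in zip(X, labels):              # y = +1 for C1, -1 for C2
            if y * K(x) <= 0:                    # sample misclassified
                terms.append((y, x))             # add a +/- kernel term
                errors += 1
        if errors == 0:                          # one clean pass: converged
            return K
    return K
```

The returned K plays the role of the final discriminant: new points are assigned to C1 when K(x) > 0 and to C2 when K(x) < 0.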
27
[Figure: final decision boundary KFINAL(x) = 0 in the (x1, x2) pattern space]
28
Potential Function Algorithm for M Classes
Reference: (3) Tou and Gonzalez
29
(No Transcript)
30
(No Transcript)
31
Flow Chart for Potential Function Method M-Class
32
Flow Chart Continued
33
Flow Chart Continued
34
Summary
1. Example: Generalized Linear Discriminant Function
2. Weight Space
3. Potential Function Approach - 2-class case
4. Potential Function Example - 2-class case
5. Potential Function Algorithm - M-class case
35
End of Lecture 18