Pattern Recognition MM7 - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: Pattern Recognition MM7

1
Pattern Recognition MM7
  • Feature evaluation
  • What to consider when choosing features
  • Is a feature robust?
  • How many samples do we need to represent a
    feature (mean and covariance)?
  • Is the feature normally distributed?
  • Break
  • Dependency between features
  • Covariance
  • Correlation
  • How are they related
  • Mini-project

2
Number of samples
  • How many samples do we need to describe a feature
    for a class?
  • 1) Scientific Table
  • 2) Variance analysis

3
Number of samples: Scientific Table
  • Distribution-free tolerance limits
  • Which number of samples, N, is required in order
    to ensure that a proportion β_P of the population
    lies within the min. and max. values, with a
    confidence of β_t?
  • K. Diem and C. Leitner. Scientific Tables.
    Ciba-Geigy Ltd. 1975
  • Typically 95% for both → 93 samples (verified in
    the sketch below)
  • Often samples are normally distributed → fewer
    samples are required
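As a check on the figure quoted above, here is a minimal sketch assuming the usual two-sided, distribution-free tolerance limits taken at the sample minimum and maximum, for which the confidence that [min, max] covers a proportion p of the population is 1 − N·p^(N−1)·(1−p) − p^N; the function name is illustrative, not from the slides.

```python
# Minimal sketch: smallest N such that the interval [sample min, sample max]
# covers a proportion p of the population with the given confidence, using
#   confidence = 1 - N*p**(N-1)*(1-p) - p**N
# (distribution-free, two-sided tolerance limits; assumption, not from the slides).
def tolerance_sample_size(p=0.95, confidence=0.95):
    N = 2
    while 1 - N * p ** (N - 1) * (1 - p) - p ** N < confidence:
        N += 1
    return N

print(tolerance_sample_size())   # -> 93, matching the figure quoted above
```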

4
Number of samples: Variance analysis
  • Plot the variance as a function of N
  • Choose N so that the variance is stable
  • Do this for each feature in each class (a sketch
    follows below)
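A minimal sketch of this procedure in Python, with synthetic placeholder data standing in for the measured feature values of one class: compute the running variance estimate as N grows and look for the point where the curve flattens.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
samples = rng.normal(loc=5.0, scale=2.0, size=500)   # placeholder feature data

# Variance estimate computed from the first n samples, for n = 2..N
Ns = np.arange(2, len(samples) + 1)
running_var = [np.var(samples[:n], ddof=1) for n in Ns]

plt.plot(Ns, running_var)
plt.xlabel("N (number of samples)")
plt.ylabel("variance estimate")
plt.title("Variance as a function of N")
plt.show()
```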

5
Is a feature normally distributed?
  • If the features defining a class are normally
    distributed, the Bayes classifier reduces to a
    Mahalanobis-distance classifier (sketched after
    this list)
  • In practice it is often assumed that all features
    are normally distributed, but how do we test
    this?
  • 1) Histogram inspection
  • 2) Skewness and kurtosis
  • 3) Goodness of fit
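A minimal sketch of the claim in the first bullet, under the assumptions of equal class priors and (roughly) equal class covariances; the class data and the test sample are synthetic placeholders.

```python
import numpy as np

def mahalanobis_sq(x, mu, Sigma):
    # Squared Mahalanobis distance (x - mu)^T Sigma^{-1} (x - mu)
    diff = x - mu
    return diff @ np.linalg.inv(Sigma) @ diff

rng = np.random.default_rng(5)
class_A = rng.multivariate_normal([0.0, 0.0], np.eye(2), size=200)  # placeholder class A
class_B = rng.multivariate_normal([3.0, 3.0], np.eye(2), size=200)  # placeholder class B

x = np.array([2.5, 2.0])                                            # sample to classify
dA = mahalanobis_sq(x, class_A.mean(axis=0), np.cov(class_A.T))
dB = mahalanobis_sq(x, class_B.mean(axis=0), np.cov(class_B.T))
print("assign to", "A" if dA < dB else "B")   # smallest Mahalanobis distance wins
```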

6
Is a feature normally distributed?
  • 1) Histogram inspection
  • Matlab normplot
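normplot is Matlab; as an illustrative alternative, scipy's probability plot gives the same qualitative check (points falling close to a straight line suggest approximate normality). The `samples` array is a placeholder for one feature of one class.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(0)
samples = rng.normal(loc=5.0, scale=2.0, size=300)   # placeholder feature data

# Normal probability plot: roughly a straight line if the data are normal
stats.probplot(samples, dist="norm", plot=plt)
plt.title("Normal probability plot (cf. Matlab normplot)")
plt.show()
```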

7
Is a feature normally distributed?
  • 2) Skewness and Kurtosis
  • One feature and one class
  • A distribution's i-th moment (m_i) can be
    expressed in terms of the samples (the standard
    definitions are sketched below)
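The formula itself is not in the transcript; the sketch below uses the standard central-moment definitions, with skewness m_3 / m_2^{3/2} and kurtosis m_4 / m_2^2 (approximately 0 and 3, respectively, for normal data). The data array is a placeholder.

```python
import numpy as np
from scipy import stats

def central_moment(x, i):
    # i-th central moment: mean of (x - mean(x))**i
    x = np.asarray(x, dtype=float)
    return np.mean((x - x.mean()) ** i)

def skewness(x):
    return central_moment(x, 3) / central_moment(x, 2) ** 1.5

def kurtosis(x):
    return central_moment(x, 4) / central_moment(x, 2) ** 2  # approx. 3 for normal data

rng = np.random.default_rng(1)
x = rng.normal(size=1000)                               # placeholder feature samples
print(skewness(x), kurtosis(x))
print(stats.skew(x), stats.kurtosis(x, fisher=False))   # library cross-check
```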

8
Is a feature normally distributed?
  • 2) Skewness and Kurtosis
  • These methods are not used much any more, but
    they can still be seen in older reports/papers
  • BUT they do describe general aspects of a
    distribution AND can be used as features!

9
AAU signs billion-kroner deal with GE Healthcare (12
Oct 2005): Aalborg University (AAU) has entered into a
licensing and production agreement with GE Healthcare
that is expected to generate revenue of between 0.5
and 1 billion DKK. The licence concerns a new
invention that makes it easier to detect the heart
condition Long QT syndrome, which affects millions of
people worldwide every year. The measuring device was
developed by a group of students from the Institut for
Sundhedsteknologi, and the institute will receive one
third of the money from the agreement. AAU receives
another third, while the three students and a number
of teachers share the last third of the amount.
Million-kroner deal for Aalborg University (21 Oct
2005): Three newly graduated engineers from Aalborg
University's health technology programme have patented
a method for diagnosing a dangerous heart disease. One
of the world's largest suppliers of hospital
equipment, General Healthcare, has signed a
million-kroner agreement to make use of the
technology. Minister of Science Helge Sander calls the
agreement the largest ever between a university and a
private company, writes Ingeniøren.
10
Is a feature normally distributed?
  • 3) Goodness of fit (χ²-test)
  • Idea: Compare the data with a perfect normal
    distribution
  • Algorithm
  • a) Divide the data into k intervals (k as small
    as possible)
  • - Choose k so that f_i > 1 for all i and f_i > 5
    for 80% of the k intervals
  • - Choose k so that each interval has approx. the
    same probability
  • b) Compare the measured data with the expected
    data
  • - Error measure T
  • - T is χ²-distributed
  • c) If T < T_H,α ⇒ normally distributed with
    significance level α (see a statistical table;
    a sketch of the test follows below)
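A minimal sketch of such a test, assuming equiprobable bins under a normal distribution fitted to the data and k − 3 degrees of freedom (k − 1, minus the two estimated parameters mean and standard deviation); the function name and the synthetic data are illustrative, not from the slides.

```python
import numpy as np
from scipy import stats

def chi2_normality_test(x, k=10, alpha=0.05):
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=1)

    # a) k intervals with (approximately) equal probability under N(mu, sigma^2)
    edges = stats.norm.ppf(np.linspace(0.0, 1.0, k + 1), loc=mu, scale=sigma)
    observed = np.array([np.sum((x >= lo) & (x < hi))
                         for lo, hi in zip(edges[:-1], edges[1:])])
    expected = np.full(k, len(x) / k)

    # b) error measure T, approximately chi^2-distributed under normality
    T = np.sum((observed - expected) ** 2 / expected)

    # c) compare with the critical value; mu and sigma were estimated -> k - 3 dof
    T_crit = stats.chi2.ppf(1 - alpha, df=k - 3)
    return T, T_crit, T < T_crit   # True -> do not reject normality at level alpha

rng = np.random.default_rng(2)
print(chi2_normality_test(rng.normal(size=500)))
```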

11
What to remember?
  • Feature evaluation
  • Robustness (invariant wrt. the application)
  • Number of samples
  • Scientific Table
  • Variance analysis
  • Normally distributed (Bayes rule)
  • Histogram inspection (qualitative analysis)
  • Skewness and kurtosis (rule of thumb)
  • Goodness of fit (statistical analysis)

12
Break
13
Dependency between features
  • Feature evaluation
  • What to consider when choosing features
  • Is a feature robust?
  • How many samples do we need to represent a
    feature (mean and covariance)?
  • Is the feature normally distributed?
  • Break
  • Dependency between features
  • Covariance
  • Correlation
  • How are they related

14
The covariance
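The covariance formula on this slide is not in the transcript; below is a minimal sketch of the standard sample covariance, cov(x, y) = 1/(N−1) · Σ_j (x_j − x̄)(y_j − ȳ), computed directly and via numpy, with synthetic placeholder features.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=100)               # placeholder feature 1
y = 2.0 * x + rng.normal(size=100)     # placeholder feature 2, dependent on x

# Sample covariance, computed directly and via numpy's covariance matrix
cov_manual = np.sum((x - x.mean()) * (y - y.mean())) / (len(x) - 1)
print(cov_manual, np.cov(x, y)[0, 1])  # the two values agree
```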
15
(No Transcript)
16
Relationship between covariance and correlation
(ignore mean)
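The relationship itself is not written out in the transcript; a minimal sketch of the standard one: the correlation coefficient is the covariance normalised by the two standard deviations, r_xy = cov(x, y) / (σ_x · σ_y), which makes it scale-free and bounded to [−1, 1].

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=200)                        # placeholder feature 1
y = 0.7 * x + rng.normal(scale=0.5, size=200)   # placeholder feature 2

# Correlation = covariance divided by the product of the standard deviations
r_xy = np.cov(x, y)[0, 1] / (np.std(x, ddof=1) * np.std(y, ddof=1))
print(r_xy, np.corrcoef(x, y)[0, 1])            # matches numpy's correlation
```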
17
What to remember?
  • Dependency between features
  • Some features might express the same information
  • How to evaluate that
  • Covariance
  • Correlation
  • How are they related

18
Mini Project
  • The idea
  • The idea behind the mini project is that each
    group will study a particular method and present
    it to the other groups at lecture 10
  • Topics
  • Principal Component Analysis
  • SEPCOR
  • Isomap
  • Local linear embedding
  • Plan
  • MM7 Feature evaluation / Feature dependence
  • MM8 Dimensionality reduction of the feature
    space I (NO LECTURE)
  • MM9 Dimensionality reduction of the feature
    space II (Start 9.15)
  • MM10 String matching / Test of pattern
    recognition systems   (Start 9.15)