Graph Embedding: A General Framework for Dimensionality Reduction

1
Graph Embedding: A General Framework for Dimensionality Reduction
  • Dong XU
  • School of Computer Engineering
  • Nanyang Technological University
  • http://www.ntu.edu.sg/home/dongxu
  • dongxu@ntu.edu.sg

2
What is Dimensionality Reduction?
PCA
LDA
Example: 2D space to 1D space
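As a concrete illustration of the 2D-to-1D example, a minimal PCA sketch (the library choice and the synthetic data are assumptions added here, not part of the slides):

```python
# Project a correlated 2-D point cloud onto its top principal direction (2-D -> 1-D).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])  # correlated 2-D samples
y = PCA(n_components=1).fit_transform(X)                            # 1-D coordinates along the main axis
print(X.shape, "->", y.shape)                                       # (100, 2) -> (100, 1)
```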
3
What is Dimensionality Reduction?
Example: 3D space to 2D space
ISOMAP: Geodesic Distance Preserving (J. Tenenbaum et al., 2000)
4
Why Conduct Dimensionality Reduction?
  • Visualization
  • Feature Extraction
  • Computation Efficiency
  • Broad Applications
    • Face Recognition
    • Human Gait Recognition
    • CBIR
  • Uncover intrinsic structure

(Figure: face images with pose variation and expression variation, LPP, He et al., 2003)
5
Representative Previous Work
  • PCA
  • LDA
  • ISOMAP: Geodesic Distance Preserving (J. Tenenbaum et al., 2000)
  • LLE: Local Neighborhood Relationship Preserving (S. Roweis and L. Saul, 2000)
  • LE/LPP: Local Similarity Preserving (M. Belkin, P. Niyogi et al., 2001, 2003)
6
Dimensionality Reduction Algorithms
  • Statistics-based: PCA/KPCA, LDA/KDA
  • Geometry-based: ISOMAP, LLE, LE/LPP
  • Data representation: Matrix or Tensor
  • Any common perspective to understand and explain these dimensionality reduction algorithms? Or any unified formulation that is shared by them?
  • Any general tool to guide developing new algorithms for dimensionality reduction?
7
Our Answers
Google citations: 174 (as of 15-Sep-2009)

Type                     Example
Direct Graph Embedding   Original PCA, LDA; ISOMAP; LLE; Laplacian Eigenmap
Linearization            PCA, LDA, LPP
Kernelization            KPCA, KDA
Tensorization            CSA, DATER

S. Yan, D. Xu, H. Zhang et al., CVPR 2005 and T-PAMI 2007
8
Direct Graph Embedding
Intrinsic Graph and Penalty Graph
  • S, S^p: similarity matrices (graph edge weights), encoding similarity in the high-dimensional space
  • L, B: Laplacian matrices derived from S and S^p
  • Data live in the high-dimensional space and the low-dimensional space (assumed 1-D here)
9
Direct Graph Embedding -- Continued
  • S, S^p: similarity matrices (graph edge weights), encoding similarity in the high-dimensional space (intrinsic graph and penalty graph)
  • L, B: Laplacian matrices derived from S and S^p
  • Data live in the high-dimensional space and the low-dimensional space (assumed 1-D here)
Criterion: preserve the graph similarity
Special case: B is the identity matrix (scale normalization)
Problem: it cannot handle new test data.
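The graph-preserving criterion referred to above appears as an equation image on the slide; a reconstruction in the slide's notation, following the graph embedding formulation of Yan et al. (T-PAMI 2007), is:

```latex
y^{*} \;=\; \arg\min_{y^{\top} B y = d}\; \sum_{i \neq j} \lVert y_i - y_j \rVert^{2} S_{ij}
      \;=\; \arg\min_{y^{\top} B y = d}\; y^{\top} L y,
\qquad L = D - S,\quad D_{ii} = \textstyle\sum_{j \neq i} S_{ij},
```

where B is either the Laplacian of the penalty graph S^p or the identity matrix (the scale-normalization special case noted above).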
10
Linearization
Intrinsic Graph
Linear mapping function
Penalty Graph
Objective function in Linearization
Problem: a linear mapping function may not be enough to preserve the real nonlinear structure.
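The linearization objective referenced above can be reconstructed from the same formulation: substituting the linear map y = X^T w into the graph-preserving criterion gives

```latex
w^{*} \;=\; \arg\min_{w^{\top} X B X^{\top} w = d}\; \sum_{i \neq j} \lVert w^{\top} x_i - w^{\top} x_j \rVert^{2} S_{ij}
      \;=\; \arg\min_{w^{\top} X B X^{\top} w = d}\; w^{\top} X L X^{\top} w .
```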
11
Kernelization
Intrinsic Graph
Nonlinear mapping from the original input space to a higher-dimensional Hilbert space.
Penalty Graph
Constraint
Kernel matrix
Objective function in Kernelization
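A reconstruction of the kernelized objective: with φ mapping inputs into the Hilbert space, kernel matrix K_{ij} = φ(x_i)·φ(x_j), and the projection expanded as w = Σ_i α_i φ(x_i) (so y = Kα), the criterion becomes

```latex
\alpha^{*} \;=\; \arg\min_{\alpha^{\top} K B K \alpha = d}\; \alpha^{\top} K L K \alpha .
```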
12
Tensorization
The low-dimensional representation is obtained as
Intrinsic Graph
Penalty Graph
Objective function in Tensorization
where
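A hedged reconstruction of the tensor case, following the general form in the T-PAMI 2007 paper: each sample is a tensor X_i, its low-dimensional representation is y_i = X_i ×_1 w^1 ×_2 w^2 ⋯ ×_n w^n, and the projection vectors are obtained (in practice one mode at a time, with the others fixed) from

```latex
(w^{1}, \ldots, w^{n})^{*} \;=\; \arg\min_{f(w^{1}, \ldots, w^{n}) = d}\;
\sum_{i \neq j} \bigl\lVert X_i \times_1 w^{1} \cdots \times_n w^{n} \;-\; X_j \times_1 w^{1} \cdots \times_n w^{n} \bigr\rVert^{2} S_{ij},
```

where the constraint f(·) = d is imposed through either the penalty graph or scale normalization.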
13
Common Formulation
  • S, S^p: similarity matrices (intrinsic graph and penalty graph)
  • L, B: Laplacian matrices derived from S and S^p
Direct Graph Embedding
Linearization
Kernelization
Tensorization
where
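In every variant the common formulation reduces to a (generalized) eigenvalue problem. A minimal sketch of the direct case, assuming dense numpy arrays S and Sp for the intrinsic and penalty graphs (the function name, the ridge term, and the trivial-eigenvector handling are illustrative assumptions, not from the slides):

```python
# Direct graph embedding as the generalized eigenproblem L y = lambda B y.
import numpy as np
from scipy.linalg import eigh

def direct_graph_embedding(S, Sp, dim, eps=1e-6):
    """S, Sp: n x n intrinsic / penalty similarity matrices.
    Returns an n x dim embedding minimizing tr(Y^T L Y) subject to Y^T B Y = I."""
    L = np.diag(S.sum(axis=1)) - S       # intrinsic-graph Laplacian, L = D - S
    B = np.diag(Sp.sum(axis=1)) - Sp     # penalty-graph Laplacian (identity would give scale normalization)
    B += eps * np.eye(len(S))            # small ridge so the constraint matrix is positive definite
    vals, vecs = eigh(L, B)              # eigenvalues ascending; the smallest ones minimize the objective
    return vecs[:, 1:dim + 1]            # drop the (near-)trivial constant eigenvector
```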
14
A General Framework for Dimensionality Reduction
D: Direct Graph Embedding; L: Linearization; K: Kernelization; T: Tensorization
15
New Dimensionality Reduction Algorithm: Marginal Fisher Analysis
Important information for face recognition: 1) label information; 2) local manifold structure (neighborhood or margin)
Intrinsic graph: S_ij = 1 if x_i is among the k1 nearest neighbors of x_j in the same class, 0 otherwise
Penalty graph: S^p_ij = 1 if the pair (i, j) is among the k2 shortest between-class pairs in the data set, 0 otherwise
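A hedged sketch of constructing the two MFA graphs from the verbal definitions above; the per-sample treatment of the "k2 shortest pairs" rule is a simplification, and all names are illustrative:

```python
# Intrinsic graph: k1 same-class nearest neighbours of each sample.
# Penalty graph: k2 nearest different-class neighbours of each sample
# (a per-sample simplification of the "k2 shortest pairs" rule).
import numpy as np

def mfa_graphs(X, labels, k1, k2):
    X = np.asarray(X, dtype=float)
    labels = np.asarray(labels)
    n = len(X)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # pairwise Euclidean distances
    S = np.zeros((n, n))    # intrinsic graph
    Sp = np.zeros((n, n))   # penalty graph
    for i in range(n):
        same = np.flatnonzero((labels == labels[i]) & (np.arange(n) != i))
        diff = np.flatnonzero(labels != labels[i])
        for j in same[np.argsort(dist[i, same])][:k1]:
            S[i, j] = S[j, i] = 1
        for j in diff[np.argsort(dist[i, diff])][:k2]:
            Sp[i, j] = Sp[j, i] = 1
    return S, Sp
```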
16
Marginal Fisher Analysis: Advantage
  • No Gaussian distribution assumption

17
Experiments: Face Recognition
18
Summary
  • Optimization framework that unifies previous
    dimensionality reduction algorithms as special
    cases.
  • A new dimensionality reduction algorithm: Marginal Fisher Analysis.