Title: Graph Embedding: A General Framework for Dimensionality Reduction
1. Graph Embedding: A General Framework for Dimensionality Reduction
- Dong Xu
- School of Computer Engineering
- Nanyang Technological University
- http://www.ntu.edu.sg/home/dongxu
- dongxu_at_ntu.edu.sg
2. What is Dimensionality Reduction?
PCA
LDA
Examples: 2D space to 1D space
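As a concrete instance of the 2D-to-1D example, here is a minimal PCA sketch (illustrative code, not from the talk): project correlated 2-D points onto the leading eigenvector of their sample covariance.

```python
import numpy as np

# Minimal PCA sketch (illustrative, not the talk's code): reduce 2-D data
# to 1-D by projecting onto the leading eigenvector of the covariance.
rng = np.random.default_rng(0)
# Correlated 2-D points, spread mainly along the direction (1, 1).
X = rng.normal(size=(200, 1)) @ np.array([[1.0, 1.0]]) \
    + 0.1 * rng.normal(size=(200, 2))

Xc = X - X.mean(axis=0)               # center the data
C = Xc.T @ Xc / (len(Xc) - 1)         # 2x2 sample covariance
eigvals, eigvecs = np.linalg.eigh(C)  # eigenvalues in ascending order
w = eigvecs[:, -1]                    # leading principal direction
y = Xc @ w                            # the 1-D embedding
```

The recovered direction `w` should be close to (1, 1)/sqrt(2), the axis along which the synthetic data varies most.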
3. What is Dimensionality Reduction?
Example: 3D space to 2D space
ISOMAP: Geodesic Distance Preserving (J. Tenenbaum et al., 2000)
4. Why Conduct Dimensionality Reduction?
[Figure: LPP (He et al., 2003) embedding of face images with expression and pose variation]
- Visualization
- Feature Extraction
- Computational Efficiency
- Broad Applications: Face Recognition, Human Gait Recognition, CBIR
- Uncover intrinsic structure
5. Representative Previous Work
PCA
LDA
ISOMAP: Geodesic Distance Preserving (J. Tenenbaum et al., 2000)
LLE: Local Neighborhood Relationship Preserving (S. Roweis and L. Saul, 2000)
LE/LPP: Local Similarity Preserving (M. Belkin, P. Niyogi et al., 2001, 2003)
6. Dimensionality Reduction Algorithms
Statistics-based: PCA/KPCA, LDA/KDA
Geometry-based: ISOMAP, LLE, LE/LPP
Data representation: Matrix or Tensor
- Is there a common perspective from which to understand and explain these dimensionality reduction algorithms, or a unified formulation shared by them?
- Is there a general tool to guide the development of new dimensionality reduction algorithms?
7. Our Answers
S. Yan, D. Xu, H. Zhang et al., CVPR 2005 and T-PAMI 2007 (Google citations: 174, as of 15-Sep-2009)
Type -> Example algorithms:
- Direct Graph Embedding (original formulation): PCA, LDA, ISOMAP, LLE, Laplacian Eigenmap
- Linearization: PCA, LDA, LPP
- Kernelization: KPCA, KDA
- Tensorization: CSA, DATER
8. Direct Graph Embedding
Intrinsic Graph
S, S^p: similarity matrices (graph edge weights); similarity is measured in the high-dimensional space
L, B: Laplacian matrices derived from S, S^p
Data live in the high-dimensional space and the low-dimensional space (assumed to be a 1D space here)
Penalty Graph
9. Direct Graph Embedding (Continued)
Intrinsic Graph
S, S^p: similarity matrices (graph edge weights); similarity is measured in the high-dimensional space
L, B: Laplacian matrices derived from S, S^p
Data live in the high-dimensional space and the low-dimensional space (assumed to be a 1D space here)
Criterion: preserve the graph similarity
Penalty Graph
Special case: B is the identity matrix (scale normalization)
Problem: it cannot handle new test data.
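The criterion itself was an equation image on the original slide and did not survive extraction; the standard graph-preserving criterion of the cited Yan et al. framework (for a 1D embedding y) reads:

```latex
\mathbf{y}^{*}
= \arg\min_{\mathbf{y}^{\top} B \mathbf{y} = d}\; \sum_{i \neq j} (y_i - y_j)^2\, S_{ij}
= \arg\min_{\mathbf{y}^{\top} B \mathbf{y} = d}\; \mathbf{y}^{\top} L\, \mathbf{y},
\qquad L = D - S,\quad D_{ii} = \sum_{j \neq i} S_{ij},
```

where B is either the Laplacian of the penalty graph S^p or the identity matrix (the scale-normalization special case above).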
10. Linearization
Intrinsic Graph
Linear mapping function
Penalty Graph
Objective function in Linearization
Problem: a linear mapping function may not be enough to preserve the real nonlinear structure.
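The linear mapping and objective were equation images on the slide; in the cited framework they take the form:

```latex
y_i = \mathbf{w}^{\top} x_i
\quad\Longrightarrow\quad
\mathbf{w}^{*} = \arg\min_{\mathbf{w}^{\top} X B X^{\top} \mathbf{w} = d}\;
\mathbf{w}^{\top} X L X^{\top} \mathbf{w},
\qquad X = [x_1, \ldots, x_N],
```

i.e. the same graph-preserving criterion, expressed over the projection direction w instead of the embedding y.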
11. Kernelization
Intrinsic Graph
Nonlinear mapping from the original input space to another, higher-dimensional Hilbert space
Penalty Graph
Constraint
Kernel matrix
Objective function in Kernelization
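The kernelized objective (another lost equation image) follows from expanding the projection over the mapped samples, in the standard form:

```latex
y_i = \sum_{j} \alpha_j\, k(x_j, x_i),
\qquad
\boldsymbol{\alpha}^{*} = \arg\min_{\boldsymbol{\alpha}^{\top} K B K \boldsymbol{\alpha} = d}\;
\boldsymbol{\alpha}^{\top} K L K \boldsymbol{\alpha},
\qquad K_{ij} = \langle \phi(x_i), \phi(x_j) \rangle,
```

where phi is the nonlinear map into the Hilbert space and K is the kernel (Gram) matrix.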
12. Tensorization
The low-dimensional representation is obtained via multilinear (tensor) projections.
Intrinsic Graph
Penalty Graph
Objective function in Tensorization
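The tensor projection and objective were equation images on the slide; in the T-PAMI 2007 formulation they can be written as:

```latex
y_i = X_i \times_1 \mathbf{w}^{1} \times_2 \mathbf{w}^{2} \cdots \times_n \mathbf{w}^{n},
\qquad
\bigl(\mathbf{w}^{k}\big|_{k=1}^{n}\bigr)^{*}
= \arg\min \sum_{i \neq j} \lVert y_i - y_j \rVert^{2}\, S_{ij},
```

subject to either the scale-normalization constraint (sum_i ||y_i||^2 B_ii = d) or the penalty-graph constraint (sum_{i != j} ||y_i - y_j||^2 S^p_ij = d), which is what the slide's dangling "where" clause referred to.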
13. Common Formulation
Intrinsic graph: S (similarity matrix)
Penalty graph: S^p (similarity matrix)
L, B: Laplacian matrices from S, S^p
Direct Graph Embedding
Linearization
Kernelization
Tensorization
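In this common formulation, each linearized variant reduces to a generalized eigenvalue problem. The sketch below (my own illustrative code, not the authors'; the function names are assumptions) solves it by symmetric whitening and recovers LDA as a special case via the graph choices described in the framework:

```python
import numpy as np

def laplacian(S):
    """Graph Laplacian L = D - S, with D_ii = sum_j S_ij."""
    return np.diag(S.sum(axis=1)) - S

def linear_graph_embedding(X, L, B, dim=1):
    """Linearized graph embedding (sketch): minimize w^T X L X^T w
    subject to w^T X B X^T w = const, as a generalized eigenproblem.
    X is (features x samples); assumes X B X^T is positive definite."""
    A = X @ L @ X.T
    M = X @ B @ X.T
    vals, vecs = np.linalg.eigh(M)
    M_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T   # M^(-1/2)
    lam, U = np.linalg.eigh(M_inv_sqrt @ A @ M_inv_sqrt)
    return M_inv_sqrt @ U[:, :dim]   # directions of the smallest eigenvalues

# Demo: LDA as a special case. Intrinsic graph: S_ij = 1/n_c for same-class
# pairs (yields the within-class scatter); penalty graph: full graph with
# weight 1/N (yields the total scatter).
rng = np.random.default_rng(1)
n = 50
X0 = rng.normal(loc=(0.0, 0.0), scale=0.3, size=(n, 2))
X1 = rng.normal(loc=(2.0, 2.0), scale=0.3, size=(n, 2))
X = np.vstack([X0, X1]).T            # shape (2, 100)
labels = np.array([0] * n + [1] * n)

S = (labels[:, None] == labels[None, :]).astype(float) / n
Sp = np.full((2 * n, 2 * n), 1.0 / (2 * n))
W = linear_graph_embedding(X, laplacian(S), laplacian(Sp), dim=1)
y = (W.T @ X).ravel()                # 1-D embedding separating the classes
```

The projection direction found this way separates the two classes in 1D, since minimizing within-class scatter relative to total scatter is equivalent to the Fisher criterion.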
14. A General Framework for Dimensionality Reduction
D: Direct Graph Embedding; L: Linearization; K: Kernelization; T: Tensorization
15. New Dimensionality Reduction Algorithm: Marginal Fisher Analysis
Important information for face recognition: 1) label information; 2) local manifold structure (neighborhood or margin)
Intrinsic graph: S_ij = 1 if x_i is among the k1 nearest neighbors of x_j in the same class, 0 otherwise
Penalty graph: S^p_ij = 1 if the pair (i, j) is among the k2 shortest pairs between different classes in the data set, 0 otherwise
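The two indicator graphs above can be built directly from pairwise distances. Below is a sketch (my own illustrative implementation; the function name `mfa_graphs` is an assumption, not the authors' code):

```python
import numpy as np

def mfa_graphs(X, labels, k1, k2):
    """Sketch of Marginal Fisher Analysis graph construction.
    X: (n, d) array of samples.
    Intrinsic graph S: edge (i, j) if one point is among the other's
    k1 nearest same-class neighbors.
    Penalty graph Sp: the k2 shortest pairs between different classes
    over the whole data set."""
    n = len(X)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    S = np.zeros((n, n))
    for i in range(n):
        same = np.where((labels == labels[i]) & (np.arange(n) != i))[0]
        nn = same[np.argsort(D[i, same])][:k1]
        S[i, nn] = 1.0
    S = np.maximum(S, S.T)           # symmetrize the k1-NN relation

    between = labels[:, None] != labels[None, :]
    pair_d = np.where(between, D, np.inf)
    # Each unordered pair appears twice in the symmetric distance matrix,
    # so take the 2*k2 smallest entries (assumes distinct pair distances).
    idx = np.argsort(pair_d, axis=None)[: 2 * k2]
    Sp = np.zeros((n, n))
    Sp[np.unravel_index(idx, (n, n))] = 1.0
    return S, np.maximum(Sp, Sp.T)

# Tiny demo: two 1-D classes, {0, 1} and {10, 11}.
X = np.array([[0.0], [1.0], [10.0], [11.0]])
labels = np.array([0, 0, 1, 1])
S, Sp = mfa_graphs(X, labels, k1=1, k2=1)
```

In the demo, the intrinsic graph connects each point to its same-class neighbor (0-1 and 2-3), and the penalty graph picks out the single marginal between-class pair (1-2).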
16. Marginal Fisher Analysis: Advantages
- No Gaussian distribution assumption
17. Experiments: Face Recognition
18. Summary
- An optimization framework that unifies previous dimensionality reduction algorithms as special cases.
- A new dimensionality reduction algorithm: Marginal Fisher Analysis.