1
Organizing a spectral image database by using Self-Organizing Maps
Research Seminar, 7.10.2005, Oili Kohonen
2
Motivation?
  • Image retrieval from conventional databases has been studied since the 1990s, and many efficient techniques have been developed.
  • However, efficient techniques for querying images from a spectral image database do not exist.
  • Because of the large amount of data in spectral images, efficient techniques will be needed.

3
Spectral imaging?

Metameric imaging: a cheap and practical way to achieve a color match.
Spectral imaging: needed to achieve a color match for all observers across changes in the illumination.
4
Principle of SOM
  • The Self-Organizing Map (SOM) algorithm
    • Is an unsupervised learning algorithm.
    • Defines a mapping from high-dimensional data into a lower-dimensional space.
  • SOM
    • Consists of arranged units (or neurons), which are represented by weight vectors.
    • Units are connected to each other by a neighborhood relation.

5
Principle of SOM
  • SOM Algorithm (see the sketch after this list)
  • begin
  •   Initialize the SOM
  •   for i = 1 to number of epochs
  •     take an input vector x randomly from the training data
  •     find the BMU for x
  •     update the weight vectors of the map
  •     decrease the learning rate and the neighborhood function
  •   end
  • end
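A minimal Python/NumPy sketch of this training loop for a one-dimensional SOM (not the presenter's code; the Gaussian neighborhood and the linear decay schedules are assumptions):

    import numpy as np

    def train_som(data, n_units, n_epochs, lr0=0.5, sigma0=None):
        # data: (n_samples, dim) array of training spectra
        rng = np.random.default_rng(0)
        weights = rng.random((n_units, data.shape[1]))        # initialize the SOM
        grid = np.arange(n_units)                             # unit positions on a 1D lattice
        sigma0 = sigma0 if sigma0 is not None else n_units / 2.0
        for t in range(n_epochs):
            x = data[rng.integers(len(data))]                 # input vector taken randomly from the training data
            bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))  # find the BMU for x
            frac = 1.0 - t / n_epochs
            lr = lr0 * frac                                   # decrease the learning rate ...
            sigma = sigma0 * frac + 1e-3                      # ... and the neighborhood width
            h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))  # neighborhood function
            weights += lr * h[:, None] * (x - weights)        # update the weight vectors of the map
        return weights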

6
Principle of SOM: finding the BMU
The Euclidean distance is typically used as the distance measure.
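The slide's own formula is not in the transcript, but in the usual SOM notation the best-matching unit (BMU) c for an input vector x is the unit whose weight vector w_i is nearest:

    c = argmin_i || x - w_i ||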
7
Principle of SOM: updating the weight vectors
Learning rate: the product of the learning-rate parameter and the neighborhood function.
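The slide's equation itself is not in the transcript; the standard SOM update rule it refers to, with learning-rate parameter alpha(t) and neighborhood function h_ci(t), is:

    w_i(t+1) = w_i(t) + alpha(t) * h_ci(t) * ( x(t) - w_i(t) )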
8
Principle of SOM: neighborhood function

  • The neighborhood function h(t) has to fulfill the following two requirements:
  • It has to be symmetric about the maximum point (the BMU).
  • Its amplitude has to decrease monotonically with increasing distance from the BMU.
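A Gaussian kernel is one common choice that satisfies both requirements (the presentation does not say which kernel was used); with r_c and r_i the lattice coordinates of the BMU and of unit i, and sigma(t) a width that decreases during training:

    h_ci(t) = exp( -|| r_c - r_i ||^2 / (2 * sigma(t)^2) )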

9
Principle of SOM: lattice structure
Lattice structures: hexagonal and rectangular
10
Searching Technique: constructing the histogram database
  • Train the SOM.
  • Find the BMU for each pixel in an image.
  • Generate the BMU histogram and normalize it by the number of pixels in the image (see the sketch after this list).
  • Repeat steps 2 and 3 for all images in the spectral image database.
  • Save the histogram database together with the SOM map information.
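A Python/NumPy sketch of steps 2-4, assuming a trained SOM weight matrix weights of shape (n_units, n_bands) and each spectral image flattened to an (n_pixels, n_bands) array (function and variable names are mine, not from the presentation):

    import numpy as np

    def bmu_histogram(image_spectra, weights):
        # find the BMU for each pixel (Euclidean distance to every SOM unit)
        dists = np.linalg.norm(image_spectra[:, None, :] - weights[None, :, :], axis=2)
        bmus = dists.argmin(axis=1)
        # BMU histogram, normalized by the number of pixels in the image
        hist = np.bincount(bmus, minlength=len(weights)).astype(float)
        return hist / len(image_spectra)

    # repeated for all images in the spectral image database:
    # histogram_db = np.stack([bmu_histogram(img, weights) for img in images])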

11
Searching Technique: making a search
  • Choose an image and generate its histogram.
  • Calculate the distances between the generated histogram and the histograms in the existing database (see the sketch below).
  • Order the images by these distances.

The results of the search are shown to the user as RGB images.
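A sketch of the search step under the same assumptions, using the Euclidean distance between histograms (any of the measures from the Distance Calculations slide could be substituted):

    import numpy as np

    def search(query_hist, histogram_db):
        # histogram_db: (n_images, n_units); query_hist: (n_units,)
        dists = np.linalg.norm(histogram_db - query_hist, axis=1)  # distance to every stored histogram
        return np.argsort(dists)                                   # image indices, closest first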
12
Searching techniques
One-dimensional SOM
13
Searching techniques
Two-dimensional histogram-trained SOM
14
Distance Calculations
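The slide's equations are not in the transcript. For two normalized BMU histograms H = (h_1, ..., h_n) and K = (k_1, ..., k_n), the standard forms of three of the measures used in the results slides (Euclidean distance, Kullback-Leibler divergence and Jeffrey divergence) are shown below; the Energy, Peak and DPD measures are not defined here:

    d_E(H, K)  = sqrt( sum_i (h_i - k_i)^2 )
    d_KL(H, K) = sum_i h_i * log(h_i / k_i)
    d_J(H, K)  = sum_i [ h_i * log(h_i / m_i) + k_i * log(k_i / m_i) ],  where m_i = (h_i + k_i) / 2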
15
Experiments
(Unweighted images)
(Unweighted and weighted images)
16
The Used Database
17
Training of the SOMs
  • 10 000 spectra were selected randomly from each image.
  • 2 000 000 and 4 000 000 epochs in the ordering and fine-tuning phases, respectively.
  • Unit sizes:
    • 50, chosen empirically
    • 49, to have results comparable with the 1D-SOM
    • a 14×14 map in the case of the histogram-trained SOM

18
Results: 1D-SOM, Unweighted images
Multiplied data
Pure data
The distance measure: Euclidean distance
19
Results: 1D, Unweighted images
  • Energy
  • K-L
  • Peak
  • DPD
  • JD

20
Results: 1D, Weighted images
  • Energy
  • K-L
  • Peak
  • DPD
  • JD

21
Conclusions I
  • The structure of the database is different for weighted and unweighted images.
  • The best results were obtained by using the Euclidean distance and the Jeffrey divergence.
  • Importance of normalization??
    • Better results with the Euclidean distance and DPD
    • Worse results with the Jeffrey divergence

22
Results: 2D, Unweighted spectral data
  • Euclidean
  • Energy
  • K-L
  • Peak
  • DPD
  • JD

23
Results: 2D, Weighted spectral data
  • Euclidean
  • Energy
  • K-L
  • Peak
  • DPD
  • JD

24
Conclusions II
  • In the case of the two-dimensional SOM, better results are achieved by using non-weighted images.
  • When weighted images are used, the use of the 1D-SOM seems to be more reasonable.

25
Results: histogram-trained 2D-SOM
  • Euclidean
  • Energy
  • K-L
  • Peak
  • DPD
  • JD

26
Connections between images and histograms
non-weighted
weighted
27
Past, Present and Future
  • Past: what you have seen so far...
  • Present: texture features in addition to color features
  • Future: testing the effect of different metrics in the ordering and fine-tuning phases (during the training of the SOM)

28
Questions