Feedback Networks and Associative Memories - PowerPoint PPT Presentation

About This Presentation
Title: Feedback Networks and Associative Memories
Description: Content: Introduction, Discrete Hopfield NNs, Continuous Hopfield ...
Slides: 61
Provided by: TaiWe9

Transcript and Presenter's Notes


1
Feedback Networks and Associative Memories
2
Content
  • Introduction
  • Discrete Hopfield NNs
  • Continuous Hopfield NNs
  • Associative Memories
  • Hopfield Memory
  • Bidirectional Memory

3
Feedback Networks and Associative Memories
  • Introduction

4
Feedforward/Feedback NNs
  • Feedforward NNs
  • The connections between units do not form cycles.
  • Usually produce a response to an input quickly.
  • Most feedforward NNs can be trained using a wide
    variety of efficient algorithms.
  • Feedback or recurrent NNs
  • There are cycles in the connections.
  • In some feedback NNs, each time an input is
    presented, the NN must iterate for a potentially
    long time before it produces a response.
  • Usually more difficult to train than feedforward
    NNs.

5
Supervised-Learning NNs
  • Feedforward NNs
  • Perceptron
  • Adaline, Madaline
  • Backpropagation (BP)
  • Artmap
  • Learning Vector Quantization (LVQ)
  • Probabilistic Neural Network (PNN)
  • General Regression Neural Network (GRNN)
  • Feedback or recurrent NNs
  • Brain-State-in-a-Box (BSB)
  • Fuzzy Cognitive Map (FCM)
  • Boltzmann Machine (BM)
  • Backpropagation through time (BPTT)

6
Unsupervised-Learning NNs
  • Feedforward NNs
  • Learning Matrix (LM)
  • Sparse Distributed Associative Memory (SDM)
  • Fuzzy Associative Memory (FAM)
  • Counterpropagation (CPN)
  • Feedback or Recurrent NNs
  • Binary Adaptive Resonance Theory (ART1)
  • Analog Adaptive Resonance Theory (ART2, ART2a)
  • Discrete Hopfield (DH)
  • Continuous Hopfield (CH)
  • Discrete Bidirectional Associative Memory (BAM)
  • Kohonen Self-organizing Map/Topology-preserving
    map (SOM/TPM)

7
The Hopfield NNs
  • In 1982, Hopfield, a Caltech physicist,
    mathematically tied together many of the ideas
    from previous research.
  • A fully connected, symmetrically weighted network
    where each node functions both as input and
    output node.
  • Used for
  • Associative memories
  • Combinatorial optimization

8
Associative Memories
  • An associative memory is a content-addressable
    structure that maps a set of input patterns to a
    set of output patterns.
  • Two types of associative memory: autoassociative
    and heteroassociative.
  • Auto-association
  • retrieves a previously stored pattern that most
    closely resembles the current pattern.
  • Hetero-association
  • the retrieved pattern is, in general, different
    from the input pattern not only in content but
    possibly also in type and format.

9
Associative Memories
Auto-association: a degraded "A" recalls the stored "A".
Hetero-association: "Niagara" recalls "Waterfall".
10
Optimization Problems
  • Associate costs with energy functions in Hopfield
    Networks
  • The cost needs to be in quadratic form.
  • A Hopfield network finds local, satisfactory
    solutions; it doesn't choose solutions from a set.
  • It finds local optima, not global ones.
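The quadratic-form requirement can be written out explicitly. Assuming a weight matrix W and a bias vector b (symbols mine, not from the slides), the Hopfield energy used as a cost is:

```latex
E(\mathbf{v}) \;=\; -\tfrac{1}{2}\,\mathbf{v}^{\top} W \mathbf{v} \;-\; \mathbf{b}^{\top}\mathbf{v}
```

Any cost that can be put in this form maps directly onto weights and biases, which is why only quadratic costs fit the network without reformulation.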

11
Feedback Networks and Associative Memories
  • Discrete Hopfield NNs

12
The Discrete Hopfield NNs
13
The Discrete Hopfield NNs
wij = wji,  wii = 0
14
The Discrete Hopfield NNs
wij = wji,  wii = 0
15
State Update Rule
  • Asynchronous mode
  • Update rule

Stable?
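The asynchronous update rule can be sketched in code. This is a minimal NumPy sketch assuming bipolar ±1 states and a sign activation; the function names are my own, not from the slides:

```python
import numpy as np

def async_update(W, v, order=None):
    """One asynchronous sweep: update neurons one at a time.

    W : symmetric weight matrix with zero diagonal (wij = wji, wii = 0)
    v : bipolar state vector with entries in {-1, +1}
    """
    n = len(v)
    v = v.copy()
    for k in (order if order is not None else range(n)):
        net = W[k] @ v               # net input to neuron k
        if net != 0:                 # keep the previous state when net == 0
            v[k] = 1 if net > 0 else -1
    return v

def run_until_stable(W, v, max_sweeps=100):
    """Iterate sweeps until the state stops changing (a stable point)."""
    for _ in range(max_sweeps):
        new_v = async_update(W, v)
        if np.array_equal(new_v, v):
            return new_v
        v = new_v
    return v
```

Asynchronous means each neuron sees the partially updated state of the others, which is what the stability argument on the following slides relies on.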
16
Energy Function
Fact: E is lower bounded (resp. upper bounded).
If E is monotonically decreasing (resp. increasing),
the system is stable.
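The energy function itself is not rendered in the transcript; for the discrete model with symmetric weights and zero diagonal it is conventionally written as (a threshold term is sometimes added):

```latex
E \;=\; -\tfrac{1}{2} \sum_{i} \sum_{j} w_{ij}\, v_i v_j
```

E is lower bounded because the weights are finite and each v_i ∈ {−1, +1}.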
17
The Proof
Suppose that at time t+1, the kth neuron is
selected for update.
18
The Proof
Suppose that at time t+1, the kth neuron is
selected for update.
19
The Proof
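The algebra of the proof is not rendered in the transcript, but it can be reconstructed from the setup: if only neuron k changes at time t+1 (and wkk = 0), the change in energy is

```latex
\Delta E \;=\; E(t{+}1) - E(t)
\;=\; -\bigl(v_k(t{+}1) - v_k(t)\bigr) \sum_{j \neq k} w_{kj}\, v_j(t)
\;=\; -\Delta v_k \cdot \mathrm{net}_k
```

Since the update rule makes Δv_k either zero or of the same sign as net_k, every update gives ΔE ≤ 0.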
20
The Proof
In each case ΔE ≤ 0, so the network is stable:

v_k(t)   net_k   v_k(t+1)    ΔE
  1       ≥ 0       1         0
  1       < 0      −1       < 0
 −1       ≥ 0       1       < 0
 −1       < 0      −1         0
21
Feedback Networks and Associative Memories
  • Continuous Hopfield NNs

22
The Neuron of Continuous Hopfield NNs
23
The Dynamics
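The dynamics equation is not rendered in the transcript; for the standard continuous Hopfield neuron with membrane capacitance C_i, leak conductance G_i = 1/R_i, external current I_i, and activation a(·) (notation assumed), it reads:

```latex
C_i \frac{du_i}{dt} \;=\; \sum_{j} w_{ij}\, v_j \;-\; G_i\, u_i \;+\; I_i,
\qquad v_i = a(u_i)
```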
24
The Continuous Hopfield NNs
25
The Continuous Hopfield NNs
Stable?
26
Equilibrium Points
  • Consider the autonomous system
  • Equilibrium Points Satisfy
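The missing equations are standard: for an autonomous system, the equilibrium points ȳ are where the vector field vanishes,

```latex
\dot{\mathbf{y}} = f(\mathbf{y}), \qquad f(\bar{\mathbf{y}}) = \mathbf{0}
```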

27
Lyapunov Theorem
Call E(y) the energy function.
The system is asymptotically stable if the
following holds:
There exists a positive-definite function E(y)
s.t.
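In symbols (with the equilibrium shifted to the origin), the condition the slide states in words is:

```latex
E(\mathbf{y}) > 0 \ \ \text{for } \mathbf{y} \neq \mathbf{0},
\qquad
\frac{dE}{dt} \;=\; \nabla E(\mathbf{y})^{\top} f(\mathbf{y}) \;<\; 0
```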
28
Lyapunov Energy Function
29
Lyapunov Energy Function
(Circuit diagram: neurons 1..n, each with capacitance Ci, leak
conductance gi, external current Ii, internal potential ui, and
output vi; every output is fed back to the other neurons through
symmetric weights wij.)
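The Lyapunov energy for the continuous network is not rendered in the transcript; conventionally, with a(·) the neuron activation and G_i = 1/R_i the leak conductance, it is:

```latex
E \;=\; -\tfrac{1}{2} \sum_{i}\sum_{j} w_{ij}\, v_i v_j
\;+\; \sum_i G_i \int_0^{v_i} a^{-1}(v)\, dv
\;-\; \sum_i I_i\, v_i
```

Differentiating along trajectories and using the symmetry of the weights gives dE/dt ≤ 0, which is the stability argument of the next slides.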
30
Stability of Continuous Hopfield NNs
Dynamics
31
Stability of Continuous Hopfield NNs
Dynamics
> 0
32
Stability of Continuous Hopfield NNs
Stable
33
Basins of Attraction
34
Basins of Attraction
35
Local/Global Minima
Energy Landscape
36
Feedback Networks and Associative Memories
  • Associative Memories

37
Associative Memories
  • Also known as content-addressable memory.
  • Autoassociative Memory
  • Hopfield Memory
  • Heteroassociative Memory
  • Bidirectional Associative Memory (BAM)

38
Associative Memories
Stored Patterns
Autoassociative
Heteroassociative
39
Feedback Networks and Associative Memories
  • Associative Memories
  • Hopfield Memory
  • Bidirectional Memory

40
Hopfield Memory
Fully connected
14,400 weights
12×10 neurons
41
Example
42
Example
Memory Association
43
Example
How to Store Patterns?
Memory Association
44
The Storage Algorithm
Suppose the stored patterns are of dimension n.
45
The Storage Algorithm
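The storage rule itself is not rendered in the transcript; a minimal sketch of the outer-product (Hebbian) rule, assuming bipolar patterns (some presentations also subtract p·I or divide by n, which only rescales the weights):

```python
import numpy as np

def store_patterns(patterns):
    """Outer-product (Hebbian) storage rule for a discrete Hopfield memory.

    patterns : sequence of bipolar vectors (entries in {-1, +1}), length n each.
    Returns the n x n weight matrix W = sum_k x_k x_k^T with the diagonal
    zeroed, which makes W symmetric (wij = wji) with wii = 0.
    """
    X = np.asarray(patterns)
    W = X.T @ X                 # sum of outer products x_k x_k^T
    np.fill_diagonal(W, 0)      # enforce wii = 0
    return W
```

For patterns that are nearly orthogonal, each stored pattern is then a fixed point of the sign-activation update.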
46
Analysis
Suppose that x = xi.
47
Example
48
Example
49
Example
E = 4
Stable
E = 0
E = −4
50
Example
E = 4
Stable
E = 0
E = −4
51
Problems of Hopfield Memory
  • Complement Memories
  • Spurious stable states
  • Capacity

52
Capacity of Hopfield Memory
  • The number of storable patterns w.r.t. the size
    of the network.
  • Study methods
  • Experiments
  • Probability
  • Information theory
  • Radius of attraction (ρ)

53
Capacity of Hopfield Memory
  • The number of storable patterns w.r.t. the size
    of the network.

Hopfield (1982) demonstrated that the maximum
number of patterns that can be stored in the
Hopfield model of n nodes before the error in the
retrieved pattern becomes severe is around
0.15n.  The memory capacity of the Hopfield
model can be increased as shown by Andrecut
(1972).
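A small empirical check of the ~0.15n estimate (a sketch with assumed random bipolar patterns; n = 120 matches the 12×10 grid earlier): well below capacity, every stored pattern should survive a one-step recall.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120                        # network size, e.g. the 12x10 grid above
capacity_estimate = 0.15 * n   # Hopfield's rule of thumb: ~18 patterns

# Store a set of random bipolar patterns well below capacity
p = 5
X = rng.choice([-1, 1], size=(p, n))
W = X.T @ X                    # outer-product storage
np.fill_diagonal(W, 0)         # zero self-connections

# One synchronous recall step starting from each stored pattern
recalled = np.sign(W @ X.T).T
accuracy = np.mean(recalled == X)   # fraction of bits recalled correctly
```

Pushing p toward 0.15n and beyond makes the accuracy degrade, which is the "severe error" regime the slide describes.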
54
Radius of attraction (0 ≤ ρ ≤ 1/2)
55
Feedback Networks and Associative Memories
  • Associative Memories
  • Hopfield Memory
  • Bidirectional Memory

56
Bidirectional Memory
Y Layer
Forward Pass
Backward Pass
W = [wij] (n × m)
X Layer
57
Bidirectional Memory
Stored Patterns
Y Layer
Forward Pass
Backward Pass
W = [wij] (n × m)
X Layer
58
The Storage Algorithm
Stored Patterns
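A minimal sketch of BAM storage and bidirectional recall, assuming bipolar patterns and sign activation (the function names and the m×n weight orientation are my choices; the slide writes W = [wij] with shape n × m, i.e. the transpose):

```python
import numpy as np

def bam_store(pairs):
    """Storage rule for a discrete Bidirectional Associative Memory.

    pairs : sequence of (x, y) bipolar vectors, x of length n, y of length m.
    Returns W = sum_k y_k x_k^T (shape m x n), used in both directions.
    """
    return sum(np.outer(y, x) for x, y in pairs)

def bam_recall(W, x, steps=10):
    """Bounce between the X and Y layers until the pair stops changing.

    Ties (net input exactly 0) are left as 0 here for simplicity.
    """
    y = np.sign(W @ x)                 # forward pass: X layer -> Y layer
    for _ in range(steps):
        x_new = np.sign(W.T @ y)       # backward pass: Y layer -> X layer
        y_new = np.sign(W @ x_new)     # forward pass again
        if np.array_equal(x_new, x) and np.array_equal(y_new, y):
            break
        x, y = x_new, y_new
    return x, y
```

The same W serves both passes: forward uses W, backward uses its transpose, which is what makes the memory bidirectional.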
59
Analysis
Suppose xk is one of the stored vectors.
60
Energy Function