Transcript and Presenter's Notes

Title: Memory


1
Memory
2
Hopfield Network
  • Content addressable
  • Attractor network
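A minimal sketch of both properties, assuming a standard Hebbian Hopfield network with binary (+/-1) units; the network size, the number of stored patterns and the amount of cue corruption below are arbitrary illustrations:

  import numpy as np

  rng = np.random.default_rng(0)
  N = 100                                        # number of +/-1 units
  patterns = rng.choice([-1, 1], size=(3, N))    # three random patterns to store

  # Hebbian storage: symmetric weights, zero self-connections.
  W = sum(np.outer(p, p) for p in patterns) / N
  np.fill_diagonal(W, 0)

  # Content-addressable recall: start from a corrupted cue and let the
  # asynchronous dynamics settle into the nearest stored attractor.
  state = patterns[0].copy()
  flip = rng.choice(N, size=20, replace=False)   # corrupt 20 of the 100 bits
  state[flip] *= -1

  for _ in range(10):                            # a few asynchronous update sweeps
      for i in rng.permutation(N):
          state[i] = 1 if W[i] @ state >= 0 else -1

  print("overlap with stored pattern:", state @ patterns[0] / N)   # close to 1.0 if recalled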

3
Hopfield Network
4
Hopfield Network
  • General Case
  • Lyapunov function
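The slide's own expression is not reproduced in this transcript; the standard energy function of a Hopfield network with symmetric weights w_{ij} = w_{ji}, zero self-connections and biases b_i is

  E(s) = -\tfrac{1}{2} \sum_{i \neq j} w_{ij} s_i s_j - \sum_i b_i s_i

Each asynchronous update s_i <- sign(\sum_j w_{ij} s_j + b_i) can only decrease E, so E is a Lyapunov function of the dynamics and the network settles into a local minimum (an attractor / stored pattern).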

5
Neurophysiology
6
Mean Field Approximation
7
Null Cline Analysis
[Diagram: excitatory (E) and inhibitory (I) populations coupled by connections CE and CI]
  • What are the fixed points?

8
Null Cline Analysis
  • What are the fixed points?

9
Null Cline Analysis
[Plot: null cline along E with a stable and an unstable fixed point]
10
Null Cline Analysis
11
Null Cline Analysis
12
Null Cline Analysis
13
Null Cline Analysis
14
Null Cline Analysis
[Plot: excitatory null cline in the (E, I) plane, with two stable branches and an unstable branch]
15
Null Cline Analysis
16
Null Cline Analysis
[Plot: inhibitory null cline with a stable fixed point]
17
Null Cline Analysis
18
Null Cline Analysis
19
Null Cline Analysis
[Plot: excitatory and inhibitory null clines in the (E, I) plane; their intersections are the fixed points]
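A minimal sketch of this analysis, assuming a Wilson-Cowan-style two-population rate model; the sigmoid, the coupling constants and the background inputs below are illustrative choices, not values from the slides:

  import numpy as np
  from scipy.optimize import fsolve

  def f(x):
      return 1.0 / (1.0 + np.exp(-x))            # sigmoidal rate function

  wEE, wEI, wIE, wII = 16.0, 12.0, 15.0, 3.0     # coupling strengths (assumed)
  hE, hI = -4.0, -10.0                           # background inputs (assumed)

  def rhs(state):
      E, I = state
      dE = -E + f(wEE * E - wEI * I + hE)        # dE = 0 defines the excitatory null cline
      dI = -I + f(wIE * E - wII * I + hI)        # dI = 0 defines the inhibitory null cline
      return [dE, dI]

  # Fixed points are the intersections of the two null clines; search for
  # them by running a root finder from a grid of initial guesses.
  found = set()
  for E0 in np.linspace(0, 1, 11):
      for I0 in np.linspace(0, 1, 11):
          sol, _, ok, _ = fsolve(rhs, [E0, I0], full_output=True)
          if ok == 1:
              found.add((round(sol[0], 3), round(sol[1], 3)))

  print("approximate fixed points (E, I):", sorted(found))

Stability of each fixed point can then be read off from the eigenvalues of the Jacobian of rhs at that point.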
20
Binary Memory
21
Binary Memory
Storing: decrease inhibition (CI)
22
Binary Memory
Storing: inhibition back to rest
23
Binary Memory
Reset: increase inhibition
24
Binary Memory
Reset: inhibition back to rest
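A sketch of this storing / reset protocol, reusing the illustrative E-I rate model from the null cline sketch above; all parameter values, pulse timings and the 20 ms time constant are assumptions, and whether the network is actually bistable depends on those choices:

  import numpy as np

  def f(x):
      return 1.0 / (1.0 + np.exp(-x))

  wEE, wIE, wII = 16.0, 15.0, 3.0                # fixed couplings (assumed)
  hE, hI = -4.0, -10.0                           # background inputs (assumed)
  tau, dt = 0.020, 0.001                         # 20 ms time constant, 1 ms Euler step
  E, I = 0.0, 0.0

  for step in range(int(4.0 / dt)):
      t = step * dt
      wEI = 12.0                                 # baseline inhibition (CI)
      if 1.0 < t < 1.2:
          wEI = 6.0                              # store: transiently decrease inhibition
      if 3.0 < t < 3.2:
          wEI = 24.0                             # reset: transiently increase inhibition
      E += dt / tau * (-E + f(wEE * E - wEI * I + hE))
      I += dt / tau * (-I + f(wIE * E - wII * I + hI))
      if step % 500 == 0:
          print(f"t = {t:.1f} s   E = {E:.2f}   I = {I:.2f}")

The intent is that E stays high after the store pulse even though inhibition returns to rest, and drops back to the resting state after the reset pulse, mirroring slides 21-24.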
25
Networks of Spiking Neurons
  • Problems with the previous approach:
  • Spiking neurons have monotonic I-f curves (which
    saturate, but only at very high firing rates)
  • How do you store more than one memory?
  • What is the role of spontaneous activity?

26
Networks of Spiking Neurons
27
Networks of Spiking Neurons
[Plot: firing rate R(Ij) as a function of input current Ij]
28
Networks of Spiking Neurons
29
Networks of Spiking Neurons
  • A memory network must be able to store a value in
    the absence of any input

30
Networks of Spiking Neurons
31
Networks of Spiking Neurons
[Plot: recurrent feedback cR(Ii) as a function of Ii, with afferent input Iaff]
32
Networks of Spiking Neurons
  • With a non-saturating activation function and no
    inhibition, the neurons must be spontaneously
    active for the network to admit a non-zero stable
    state

[Plot: cR(Ii) as a function of Ii]
33
Networks of Spiking Neurons
  • To get several stable fixed points, we need
    inhibition

[Plot: cR(Ii) as a function of Ii, with stable fixed points separated by an unstable fixed point]
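For a single recurrently connected population this can be made explicit. Assuming the usual first-order rate dynamics behind this kind of plot (the specific form is an assumption, not shown on the slide):

  \tau \frac{dI_i}{dt} = -I_i + c\,R(I_i) + I_{aff},
  \qquad \text{fixed points: } I^* = c\,R(I^*) + I_{aff},
  \qquad \text{stable iff } c\,R'(I^*) < 1.

With R(0) = 0 and I_aff = 0 the origin is always a fixed point, so a non-zero stable state requires either spontaneous activity (R(0) > 0, slide 32) or an effective gain c R'(I) that rises above 1 and then falls back below 1, which is what saturation or recurrent inhibition provides (slide 33).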
34
Networks of Spiking Neurons
  • Clamping the input: inhibitory Iaff

35
Networks of Spiking Neurons
  • Clamping the input: excitatory Iaff

[Plot: cR(Ii) as a function of Ii, with excitatory afferent input Iaff]
36
Networks of Spiking Neurons
[Plot: firing rate R(Ij) as a function of input current Ij]
37
Networks of Spiking Neurons
  • Major problem: the memory state has a high firing
    rate and the resting state is at zero. In reality,
    there is spontaneous activity at 0-10 Hz and the
    memory state is around 10-20 Hz (not 100 Hz).
  • Solution: you don't want to know (but it involves
    a careful balance of excitation and inhibition)

38
Line Attractor Networks
  • Continuous attractor: line attractor or
    N-dimensional attractor
  • Useful for storing analog values
  • Unfortunately, it's virtually impossible to get a
    neuron to store a value proportional to its
    activity

39
Line Attractor Networks
  • Storing analog values: difficult with this
    scheme.

[Plot: cR(Ii) as a function of Ii]
40
Line Attractor Networks
  • Implication for transmitting rate and
    integration

[Plot: cR(Ii) as a function of Ii]
41
Line Attractor Networks
  • Head direction cells

[Plot: activity as a function of preferred head direction (deg)]
42
Line Attractor Networks
  • Attractor network with population code
  • Translation invariant weights

[Plot: activity as a function of preferred head direction (deg)]
43
Line Attractor Networks
  • Computing the weights

44
Line Attractor Networks
  • The problem with the previous approach is that
    the weights tend to oscillate. Instead, we
    minimize
  • The solution is
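Neither the objective on the slide nor its closed-form solution is reproduced in this transcript. Purely as an illustration of the kind of construction involved, the sketch below fits translation-invariant (circulant) weights for a ring of units by regularized least squares, so that a bump of activity at every position is mapped approximately onto itself; the tuning curve, ridge parameter and network size are all assumptions:

  import numpy as np

  N = 64
  theta = np.linspace(-np.pi, np.pi, N, endpoint=False)   # preferred directions

  def bump(center, width=0.5):
      # von-Mises-like tuning curve centred on `center` (illustrative shape)
      return np.exp((np.cos(theta - center) - 1.0) / width**2)

  A = np.stack([bump(c) for c in theta], axis=1)   # columns: a bump at each position
  lam = 1e-2                                       # ridge regularisation (assumed)

  # Minimise ||W A - A||^2 + lam ||W||^2  =>  W = A A^T (A A^T + lam I)^{-1}
  W = A @ A.T @ np.linalg.inv(A @ A.T + lam * np.eye(N))

  # Because the bumps are translates of one another, W comes out (near-)circulant:
  # the weight between two units depends only on the difference of their
  # preferred directions, i.e. the weights are translation invariant.
  err = np.linalg.norm(W @ A - A) / np.linalg.norm(A)
  print(f"relative reconstruction error: {err:.3f}")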

45
Line Attractor Networks
  • Updating of memory: bias in the weights,
    integration of velocity, etc.

46
Line Attractor Networks
  • How do we know that the fixed points are stable?
    With symmetric weights, the network has a
    Lyapunov function (Cohen, Grossberg 1982)
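One standard form of such a function (the continuous Hopfield energy, a member of the Cohen-Grossberg family), for rate dynamics \tau \dot{u}_i = -u_i + \sum_j w_{ij} r_j + I_i with r_i = f(u_i), f monotonically increasing and w_{ij} = w_{ji}, is

  L = -\tfrac{1}{2} \sum_{i,j} w_{ij} r_i r_j - \sum_i I_i r_i + \sum_i \int_0^{r_i} f^{-1}(\rho)\, d\rho,

and dL/dt <= 0 along trajectories, so the dynamics can only move downhill on L and settle onto the set of fixed points.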

47
Line Attractor Networks
  • Line attractor: the set of stable points forms a
    line in activity space.
  • Limitations: requires symmetric weights
  • Neutrally stable along the attractor: unavoidable
    drift

48
Memorized Saccades


[Diagram: two flashed targets, T1 and T2]
49
Memorized Saccades


[Diagram: retinal positions R1 and R2 of targets T1 and T2, and saccades S1 and S2; S1 = R1, S2 = R2 - S1]
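To make the relation concrete, with hypothetical numbers: suppose both targets are flashed while the eyes are still at fixation, T1 at retinal position R1 = 10 deg and T2 at R2 = 25 deg. The first saccade is simply S1 = R1 = 10 deg, but T2 is no longer at its original retinal position after that movement, so the second saccade must be computed from memory as S2 = R2 - S1 = 15 deg.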
50
Memorized Saccades


[Diagram: targets T1, T2, their retinal positions R1, R2, and the saccades S1, S2]
51
Memorized Saccades


[Diagram: targets T1, T2, their retinal positions R1, R2, and the saccades S1, S2]
52
Memorized Saccades
53
Neural Integrator
  • Oculomotor theory
  • Evidence integrator for decision making
  • Transmitting rates in multilayer networks
  • Maximum likelihood estimator

54
Semantic Memory
  • Memory for words is sensitive to semantics (not
    just spelling)
  • Experiment: subjects are first trained to
    remember a list of words. A few hours later, they
    are presented with a list of words and have to
    pick the ones they were supposed to remember.
    Many mistakes involve words semantically related
    to the remembered words.

55
Semantic Memory
  • Usual solution: semantic networks (nodes = words,
    links = semantic similarities) and spreading
    activation
  • Problem 1: the same word can have several
    meanings (e.g. bank). This is not captured by a
    semantic network.
  • Problem 2: some interactions between words are
    negative, even when the words have no semantic
    relationship (e.g. doctor and hockey).

56
Semantic Memory
  • Usual solution: semantic networks (nodes = words,
    links = semantic similarities) and spreading
    activation

57
Semantic Memory
  • Bayesian approach (Griffiths, Steyvers,
    Tenenbaum, Psych Rev 06)
  • Documents are bags of words (we ignore word
    ordering).
  • Generative model for a document: each document
    has a gist, which is a mixture of topics; a topic
    in turn defines a probability distribution over
    words.

58
Semantic Memory
  • Bayesian approach
  • Generative model for document

[Graphical model: gist g -> topic z -> word w]
59
Semantic Memory
  • z: topics, e.g. finance, english countryside, etc.
  • Gist: a mixture of topics; P(z | g) gives the
    mixing proportions.
  • Some documents might be 0.9 finance, 0.1 english
    countryside (e.g. the wheat market):
    P(z = finance | g1) = 0.9, P(z = english countryside | g1) = 0.1
  • Others might be 0.2 finance, 0.8 english
    countryside (e.g. the Lloyds CEO buys a mansion):
    P(z = finance | g2) = 0.2, P(z = english countryside | g2) = 0.8

60
Semantic Memory
  • Bayesian approach
  • Generative model for document

[Graphical model: gist g -> topic z -> word w]
61
Semantic Memory
  • Topic z1 = finance
  • Words: P(w | z1)
  • 0.01 bank, 0.008 money, 0.0 meadow
  • Topic z2 = english countryside
  • Words: P(w | z2)
  • 0.001 bank, 0.001 money, 0.002 meadow
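A small sketch of the generative model using these numbers; the fourth "other" entry and its probability mass are assumptions added only so that each P(w | z) sums to 1:

  import numpy as np

  vocab = ["bank", "money", "meadow", "other"]
  p_w_given_z = {
      "finance":     np.array([0.010, 0.008, 0.000, 0.982]),
      "countryside": np.array([0.001, 0.001, 0.002, 0.996]),
  }
  # Gists are mixing proportions over topics, P(z | g), as on the earlier slide.
  gists = {
      "g1 (0.9 finance)":     {"finance": 0.9, "countryside": 0.1},
      "g2 (0.8 countryside)": {"finance": 0.2, "countryside": 0.8},
  }

  # Word distribution implied by a gist: P(w | g) = sum_z P(w | z) P(z | g)
  for name, mix in gists.items():
      p_w = sum(p_z * p_w_given_z[z] for z, p_z in mix.items())
      print(name, dict(zip(vocab, np.round(p_w, 4))))

To actually generate a document, one would instead draw a topic z from P(z | g) for each word position and then draw the word from P(w | z); the marginal computed above is what those draws average to.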

62
  • The gist is shared within a document, but the
    topics can vary from one sentence (or even one
    word) to the next.

63
Semantic Memory
  • Problem: we only observe the words, not the
    topics or the gist
  • How do we know how many topics and how many gists
    to pick to account for a corpus of words, and how
    do we estimate their probabilities?
  • To pick the number of topics and gists: Chinese
    restaurant process, Dirichlet process and
    hierarchical Dirichlet process; MCMC sampling.
  • Use techniques like EM to learn the probabilities
    of the latent variables (topics and gists).
  • However, a human is still needed to label the
    topics

64
Semantic Memory
[Figure: example lists of words in Topic 1, Topic 2 and Topic 3]
65
Semantic Memory
  • Bayesian approach
  • Generative model for document

[Graphical model: gist g -> topic z -> word w]
66
Semantic Memory
  • Problems we may want to solve:
  • Prediction: P(w_{n+1} | w). What's the next word?
  • Disambiguation: P(z | w). What is the mixture of
    topics in this document?
  • Gist extraction: P(g | w). What's the probability
    distribution over gists?
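A sketch of the disambiguation computation for a single observed word, using the example P(w | z) values from the earlier slide and assuming, purely for illustration, a uniform prior over the two topics:

  # Bayes' rule: P(z | w) is proportional to P(w | z) P(z)
  p_w_given_z = {"finance":     {"bank": 0.010, "money": 0.008, "meadow": 0.000},
                 "countryside": {"bank": 0.001, "money": 0.001, "meadow": 0.002}}
  prior = {"finance": 0.5, "countryside": 0.5}   # assumed uniform P(z)

  def posterior_over_topics(word):
      unnorm = {z: p_w_given_z[z][word] * prior[z] for z in prior}
      total = sum(unnorm.values())
      return {z: p / total for z, p in unnorm.items()}

  print(posterior_over_topics("bank"))     # 'bank' points mostly to finance
  print(posterior_over_topics("meadow"))   # 'meadow' points to countryside

For a whole document the same likelihoods are combined across words, together with the prior over gists.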

67
Semantic Memory
  • What we need is a representation of P(w,z,g)

68
Semantic Memory
  • P(w,z,g) is given by the generative model.

69
Semantic Memory
  • Explains semantic interference in word lists:
  • The prediction P(w_{n+1} | w) will tend to favor
    words that are semantically related through the
    topics and gists.
  • Captures the fact that a given word can have
    different meanings (topics and gists) depending
    on the context.

70
[Figure: predicted next word given the observed word, under a countryside topic vs. a finance topic; "money" is less likely to be seen if the topic is countryside]