Title: On Bubbles and Drifts: Continuous attractor networks and their relation to working memory, path integration, population decoding, attention, and motor functions
Slide 1: On Bubbles and Drifts: Continuous attractor networks and their relation to working memory, path integration, population decoding, attention, and motor functions
- Thomas Trappenberg
- Dalhousie University, Canada
Slide 2: CANNs can implement motor functions
Stringer, Rolls, Trappenberg & de Araujo, Self-organizing continuous attractor networks and motor functions, Neural Networks 16 (2003).
Slide 3: My plans for this talk
- Basic CANN model
- Idiothetic CANN updates (path integration)
- CANN motor functions
- Limits on NMDA stabilization
Slide 4: Once upon a time ... (my CANN shortlist)
- Wilson & Cowan (1973)
- Grossberg (1973)
- Willshaw & von der Malsburg (1976)
- Amari (1977)
- Droulez & Berthoz (1988)
- Sompolinsky & Hansel (1996)
- Zhang (1997)
- Redish, Touretzky, Skaggs, etc.
- Stringer et al. (2002)
Slide 5: Basic CANN: it's just a Hopfield net
- Recurrent architecture
- Synaptic weights
- Nodes can be scrambled!
Slide 6: In mathematical terms
- Updating network states (network dynamics)
- Gain function
- Weight kernel
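As a reference, these three ingredients can be written in the standard Amari-style field form (a common textbook parameterization, not necessarily the exact one shown on the slide):

```latex
% Network dynamics (leaky integrator over a continuous field)
\tau \frac{\partial u(x,t)}{\partial t} = -u(x,t) + \int w(x-y)\, r(y,t)\, dy + I^{\mathrm{ext}}(x,t)

% Gain function (sigmoidal, slope \beta, threshold \theta)
r(x,t) = \frac{1}{1 + e^{-\beta\,(u(x,t)-\theta)}}

% Weight kernel: local Gaussian excitation minus global inhibition
w(x-y) = A\, e^{-(x-y)^2 / 2\sigma^2} - C
```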
Slide 7: The network can form bubbles of persistent activity (in Oxford English: activity packets)
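A minimal simulation sketch of bubble formation (my own illustration, not the original code; the parameter values A, C, sigma, beta, and theta are assumptions chosen to give a stable packet):

```python
import numpy as np

N, tau, dt = 100, 10.0, 1.0
x = np.arange(N)
# Ring topology: shortest distance between nodes i and j.
d = np.abs(x[:, None] - x[None, :])
d = np.minimum(d, N - d)
# Weight kernel: local Gaussian excitation minus global inhibition.
A, C, sigma = 1.0, 0.3, 10.0
w = A * np.exp(-d**2 / (2 * sigma**2)) - C

def gain(u, beta=0.5, theta=3.0):
    """Sigmoidal gain function (slope beta, threshold theta)."""
    return 1.0 / (1.0 + np.exp(-beta * (u - theta)))

u = np.zeros(N)
for t in range(500):
    # Transient external cue around node 50 for the first 100 steps only.
    I_ext = 10.0 * (np.abs(x - 50) < 5) if t < 100 else 0.0
    u += dt / tau * (-u + w @ gain(u) + I_ext)  # leaky-integrator dynamics

# After the cue is switched off, a bubble (activity packet) persists near node 50.
print("bubble centre:", np.argmax(gain(u)))
```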
Slide 8: Space is represented with activity packets in the hippocampal system
From Samsonovich & McNaughton, Path integration and cognitive mapping in a continuous attractor neural network model, J. Neurosci. 17 (1997).
Slide 9: Various gain functions are used
(figure: end states of the network for different gain functions)
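For concreteness, three gain functions that commonly appear in CANN models (my listing; the slide's exact set is not recoverable):

```python
import numpy as np

def sigmoid(u, beta=1.0, theta=0.0):
    return 1.0 / (1.0 + np.exp(-beta * (u - theta)))  # smooth and saturating

def threshold_linear(u, theta=0.0):
    return np.maximum(u - theta, 0.0)                 # rectified linear

def heaviside(u, theta=0.0):
    return (u > theta).astype(float)                  # binary step (Amari 1977)
```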
Slide 10: The superior colliculus integrates exogenous and endogenous inputs
Slide 11: The superior colliculus is a CANN
Trappenberg, Dorris, Klein & Munoz, A model of saccade initiation based on the competitive integration of exogenous and endogenous inputs, J. Cogn. Neurosci. 13 (2001).
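A sketch of the competitive-integration idea (hypothetical input shapes; the published model is more detailed): exogenous (visual) and endogenous (planned) signals are simply added as external input to the same motor map, and the recurrent dynamics let one site win:

```python
import numpy as np

N = 100
x = np.arange(N)
d = np.minimum(np.abs(x[:, None] - x[None, :]), N - np.abs(x[:, None] - x[None, :]))
w = np.exp(-d**2 / 200.0) - 0.3                        # kernel as in the basic model
gain = lambda u: 1.0 / (1.0 + np.exp(-0.5 * (u - 3.0)))

I_exo = 8.0 * np.exp(-(x - 30.0)**2 / 18.0)            # sharp visual transient
I_endo = 4.0 * np.exp(-(x - 70.0)**2 / 50.0)           # broad, weaker endogenous plan

u = np.zeros(N)
for t in range(300):
    u += 0.1 * (-u + w @ gain(u) + I_exo + I_endo)     # competitive integration
print("winning saccade site:", np.argmax(gain(u)))     # only one packet survives
```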
Slide 12: Weights describe the effective interaction in the superior colliculus
Trappenberg, Dorris, Klein & Munoz, A model of saccade initiation based on the competitive integration of exogenous and endogenous inputs, J. Cogn. Neurosci. 13 (2001).
Slide 13: There are phase transitions in the weight-parameter space
Slide 14: CANNs can be trained with Hebb
(figure: Hebbian learning rule and training pattern)
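A minimal sketch of this training scheme (my reconstruction, assuming the standard setup in which the training patterns are Gaussian bumps placed at every location): the Hebbian rule applied to shifted copies of one bump yields a translation-invariant weight kernel:

```python
import numpy as np

N, eps, sigma = 100, 0.01, 5.0
x = np.arange(N)
w = np.zeros((N, N))
for c in range(N):                                 # one training pattern per location
    d = np.minimum(np.abs(x - c), N - np.abs(x - c))
    r = np.exp(-d**2 / (2 * sigma**2))             # Gaussian bump centred at c
    w += eps * np.outer(r, r)                      # Hebbian update: dw_ij = eps r_i r_j
# w[i, j] now depends only on the ring distance |i - j|: a translation-invariant kernel.
```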
Slide 15: Normalization is important for a convergent method
- Random initial states
- Weight normalization
(figures: weight matrix w(x,y), weight profile w(x,50), and their evolution over training time)
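One common normalization (an assumption here; several norms serve the same purpose) rescales each node's incoming weight vector after every Hebbian update so the weights cannot grow without bound:

```python
import numpy as np

def normalize_rows(w):
    """Rescale each row (the incoming weights of one node) to unit L2 norm."""
    norms = np.linalg.norm(w, axis=1, keepdims=True)
    return w / np.where(norms > 0.0, norms, 1.0)

# Usage after each Hebbian step, e.g.:
# w = normalize_rows(w + eps * np.outer(r, r))
```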
Slide 16: Gradient-descent learning is also possible (Kechen Zhang)
Gradient descent with regularization = Hebb + weight decay
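In equation form (a common reading of this combination, offered as my reconstruction rather than Zhang's exact derivation), the regularized gradient step decomposes into a Hebbian term plus weight decay:

```latex
\Delta w_{ij} \;=\; \underbrace{\epsilon\, r_i r_j}_{\text{Hebb}} \;-\; \underbrace{\lambda\, w_{ij}}_{\text{weight decay}}
```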
Slide 17: CANNs have a continuum of point attractors
- Point attractors and basins of attraction
- Line of point attractors
- Can be mixed: Rolls, Stringer & Trappenberg, A unified model of spatial and episodic memory, Proceedings of the Royal Society B 269:1087-1093 (2002).
Slide 18: CANNs work with spiking neurons
Xiao-Jing Wang, Trends in Neurosci. 24 (2001).
Slide 19: Shutting off also works in the rate model
(figure: node activity over time)
Slide 20: CANNs (integrators) are stiff ...
Slide 21: ... and can drift and jump
Trappenberg, Dynamic cooperation and competition in a network of spiking neurons, ICONIP'98.
Slide 22: Neuroscience applications of CANNs
- Persistent activity (memory) and winner-takes-all (competition)
- Cortical networks (e.g. Wilson & Cowan, Sompolinsky, Grossberg)
- Working memory (e.g. Compte, Wang, Brunel, Amit (?), etc.)
- Oculomotor programming (e.g. Kopecz & Schoener, Trappenberg et al.)
- Attention (e.g. Sompolinsky, Olshausen, Salinas & Abbott (?), etc.)
- Population decoding (e.g. Wu et al., Pouget, Zhang, Deneve, etc.)
- SOM (e.g. Willshaw & von der Malsburg)
- Place and head-direction cells (e.g. Zhang, Redish, Touretzky, Samsonovich, McNaughton, Skaggs, Stringer et al.)
- Motor control (Stringer et al.)
(diagram: the basic CANN extended with path-integration (PI) connections)
Slide 23: A modified CANN solves path integration
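A sketch of one way to implement this (a Zhang-style asymmetric update; the self-organizing scheme of Stringer et al. learns such connections rather than hard-wiring them): an idiothetic velocity signal gates asymmetric recurrent weights, pushing the packet along the ring at a speed set by the velocity:

```python
import numpy as np

N = 100
x = np.arange(N)
d = (x[:, None] - x[None, :] + N // 2) % N - N // 2    # signed ring distance
w_sym = np.exp(-d**2 / 200.0) - 0.3                    # symmetric kernel (as before)
w_asym = np.roll(w_sym, 1, axis=1) - w_sym             # shifted copy ~ spatial derivative
gain = lambda u: 1.0 / (1.0 + np.exp(-0.5 * (u - 3.0)))

u = np.zeros(N)
u[45:55] = 10.0                                        # initialize a packet near node 50
for t in range(1000):
    v = 0.5                                            # idiothetic (velocity) signal
    u += 0.1 * (-u + (w_sym + v * w_asym) @ gain(u))   # velocity-gated asymmetry
print("packet moved to:", np.argmax(gain(u)))          # packet travels along the ring
```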
Slide 24: CANNs can implement motor functions
Stringer, Rolls, Trappenberg & de Araujo, Self-organizing continuous attractor networks and motor functions, Neural Networks 16 (2003).
Slide 25: ... learning motor sequences (e.g. speaking a word)
Experiment 1
(figure: movement selector cells, motor cells, and state cells)
Slide 26: ... from noisy examples
Experiment 2
(figure: state cells learning from noisy examples)
Slide 27: ... and reaching from different initial states
Experiment 3
Stringer, Rolls, Trappenberg & de Araujo, Self-organizing continuous attractor networks and motor functions, Neural Networks 16 (2003).
Slide 28: Drift is caused by asymmetries
Slide 29: CANNs can support multiple packets
Stringer, Rolls & Trappenberg, Self-organising continuous attractor networks with multiple activity packets, and the representation of space, Neural Networks 17 (2004).
Slide 30: How many activity packets can be stable?
Trappenberg, Why is our working memory capacity so large?, Neural Information Processing - Letters and Reviews, Vol. 1 (2003).
Slide 31: Stabilization can be too strong
Trappenberg & Standage, Multi-packet regions in stabilized continuous attractor networks, submitted to CNS'04.
Slide 32: Conclusion
- CANNs are widespread in neuroscience models of the brain
- Short-term memory, feature selectivity (WTA)
- Path integration is an elegant mechanism to generate dynamic sequences (self-organized)
Slide 33: With thanks to
- Cognitive Neuroscience, Oxford Univ.
  - Edmund Rolls
  - Simon Stringer
  - Ivan de Araujo
- Psychology, Dalhousie Univ.
  - Ray Klein
- Physiology, Queen's Univ.
  - Doug Munoz
  - Mike Dorris
- Computer Science, Dalhousie Univ.
  - Dominic Standage
Slide 34: CANNs can discover dimensionality
Slide 35: CANNs with adaptive input strength explain express saccades
Slide 36: CANNs are great for population decoding (a fast pattern-matching implementation)
Slide 37: John Lisman's hippocampus
Slide 38: The model equations
Continuous dynamics (leaky integrator):

    \tau \frac{du_i}{dt} = -u_i + \frac{k}{N_C} \sum_j w_{ij} r_j - w_{\mathrm{inh}} \sum_j r_j + I_i^{\mathrm{vis}}

where u_i is the activity of node i, r_i the firing rate, w_{ij} the synaptic efficacy matrix, w_{\mathrm{inh}} the strength of the global inhibition, I_i^{\mathrm{vis}} the visual input, \tau the time constant, k a scaling factor, and N_C the number of connections per node.

Gain function (slope \beta, threshold \theta):

    r_i = \frac{1}{1 + e^{-\beta (u_i - \theta)}}

NMDA-style stabilization
Hebbian learning:

    \Delta w_{ij} = \epsilon\, r_i r_j