Title: On Bubbles and Drifts: Continuous attractor networks in brain models
- Thomas Trappenberg
- Dalhousie University, Canada
Once upon a time ... (my CANN shortlist)
- Wilson & Cowan (1973)
- Grossberg (1973)
- Amari (1977)
- Sompolinsky & Hansel (1996)
- Zhang (1996)
- Stringer et al. (2002)
It's just a Hopfield net
- Recurrent architecture
- Synaptic weights
In mathematical terms
- Updating network states (network dynamics)
- Gain function
- Weight kernel
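The equations themselves did not survive extraction. A standard form of these three ingredients, as used throughout the CANN literature (the notation here is an assumption, not the slide's own), is:

\begin{align}
\tau \frac{du_i(t)}{dt} &= -u_i(t) + \frac{1}{N}\sum_j w_{ij}\, r_j(t) + I_i^{\mathrm{ext}}(t) && \text{(network dynamics)} \\
r_i(t) &= g(u_i(t)) = \frac{1}{1 + e^{-\beta (u_i(t) - \theta)}} && \text{(gain function)} \\
w_{ij} &= A\, e^{-(x_i - x_j)^2 / 2\sigma^2} - C && \text{(weight kernel)}
\end{align}

Local Gaussian excitation minus a global inhibition constant C gives the Mexican-hat-like interaction profile that the following slides rely on.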
Weights describe the effective interaction profile in Superior Colliculus
TT, Dorris, Klein & Munoz, J. Cog. Neurosci. 13 (2001)
Network can form bubbles of persistent activity (in Oxford English: "activity packets")
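A minimal simulation sketch of bubble formation, using an Amari-style Heaviside gain for simplicity; all parameter values are illustrative assumptions rather than those used in the talk:

```python
# 1D CANN rate model on a ring: a brief localized cue is applied,
# then removed; the recurrent Mexican-hat interaction sustains a
# self-stabilizing activity packet ("bubble").
import numpy as np

N = 100                                        # nodes on a ring
x = np.arange(N)
d = np.abs(x[:, None] - x[None, :])
d = np.minimum(d, N - d)                       # periodic distance
w = 5.0 * np.exp(-d**2 / (2 * 4.0**2)) - 2.0   # local excitation, global inhibition

tau, dt = 10.0, 1.0
u = np.zeros(N)                                # membrane potentials

for t in range(1000):
    r = (u > 0).astype(float)                  # Heaviside gain
    I = np.exp(-(x - 50.0)**2 / 32.0) if t < 100 else 0.0  # transient cue at node 50
    u += dt / tau * (-u + w @ r / N + I)

# The cue is long gone, yet a localized packet of active nodes remains:
print(np.where(u > 0)[0])
```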
Space is represented with activity packets in the hippocampal system
From Samsonovich & McNaughton, Path integration and cognitive mapping in a continuous attractor neural network model, J. Neurosci. 17 (1997)
There are phase transitions in the weight-parameter space
CANNs work with spiking neurons
Xiao-Jing Wang, Trends in Neurosci. 24 (2001)
Shutting off also works in the rate model
(Figure: node activity over time.)
Various gain functions are used
(Figure: end states for different gain functions.)
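Typical choices (the slide's exact list is not recoverable, so these are the usual candidates from the CANN literature) are the sigmoid, threshold-linear, and Heaviside gain:

\[
g(u) = \frac{1}{1 + e^{-\beta(u - \theta)}}, \qquad
g(u) = [\,u - \theta\,]_+ , \qquad
g(u) = \Theta(u - \theta),
\]

which differ mainly in the end states they produce: graded versus binary packet profiles.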
CANNs can be trained with Hebbian learning
(Figure: Hebb rule applied to a training pattern.)
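The rule in question is presumably the standard Hebbian outer-product update driven by Gaussian training patterns swept over all locations (a reconstruction, not the slide's own formula):

\[
\Delta w_{ij} = \eta\, r_i^{\mu} r_j^{\mu}, \qquad
r_i^{\mu} = e^{-(x_i - x_\mu)^2 / 2\sigma^2},
\]

so that after training on all centers $x_\mu$ the weights approximate the translation-invariant kernel $w_{ij} \approx w(x_i - x_j)$ assumed above.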
Normalization is important for a convergent method
- Random initial states
- Weight normalization
(Figure: weight matrix w(x,y) and profile w(x,50) across training time.)
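A minimal training sketch combining the Hebbian rule with per-node weight normalization (pattern shape, learning rate, and the normalization scheme are assumptions for illustration):

```python
# Hebbian training of the weight kernel from Gaussian patterns,
# with multiplicative row-wise normalization after each update.
import numpy as np

N, sigma, eta = 100, 4.0, 0.1
x = np.arange(N)
w = 0.01 * np.random.randn(N, N)          # random initial weights

def pattern(center):
    d = np.abs(x - center)
    d = np.minimum(d, N - d)              # periodic distance on the ring
    return np.exp(-d**2 / (2 * sigma**2))

for epoch in range(20):
    for c in range(N):                    # sweep the training pattern over all locations
        r = pattern(c)
        w += eta * np.outer(r, r)         # Hebbian update
        w /= np.linalg.norm(w, axis=1, keepdims=True)  # weight normalization per node

# Each row now approximates the same kernel centered on its node:
print(np.round(w[50, 44:57], 3))
```

Without the normalization step the weights grow without bound; with it, the rows converge to (nearly) shifted copies of a single kernel w(x - y), which is what the w(x,y) and w(x,50) panels track over training time.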
Gradient-descent learning is also possible (Kechen Zhang)
Gradient descent with regularization ≈ Hebbian learning with weight decay
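One way to see the correspondence stated above (my notation; Zhang's exact cost function may differ): if the cost penalizes the mismatch between recurrent input and the desired pattern plus an $L_2$ regularizer,

\[
E = \frac{1}{2}\sum_\mu \big\| \mathbf{r}^\mu - W \mathbf{r}^\mu \big\|^2 + \frac{\lambda}{2}\, \|W\|^2 ,
\]

then gradient descent yields

\[
\Delta W = \eta\,\big(\mathbf{r}^\mu - W\mathbf{r}^\mu\big)\,(\mathbf{r}^\mu)^{\!\top} - \eta\lambda\, W ,
\]

a Hebbian-like outer-product term plus weight decay.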
CANNs have a continuum of point attractors
- Point attractors and basins of attraction
- Line of point attractors
- Can be mixed: Rolls, Stringer & Trappenberg, A unified model of spatial and episodic memory, Proc. R. Soc. Lond. B 269, 1087-1093 (2002)
Neuroscience applications of CANNs
- Persistent activity (memory) and winner-takes-all (competition)
- Working memory (e.g. Compte, Wang, Brunel, etc.)
- Place and head-direction cells (e.g. Zhang, Redish, Touretzky, Samsonovich, McNaughton, Skaggs, Stringer et al.)
- Attention (e.g. Olshausen, Salinas & Abbott, etc.)
- Population decoding (e.g. Wu et al., Pouget, Zhang, Deneve, etc.)
- Oculomotor programming (e.g. Kopecz & Schöner, Trappenberg)
- etc.
Superior colliculus integrates exogenous and endogenous inputs
Superior Colliculus is a CANN
TT, Dorris, Klein & Munoz, J. Cog. Neurosci. 13 (2001)
CANN with adaptive input strength explains express saccades
CANNs are great for population decoding (a fast implementation of pattern matching)
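A sketch of the idea (network and parameters as in the bubble example; the readout is an assumption): feed a noisy population response in as the initial state, let the network relax to a clean packet, and read out its position with a population vector.

```python
# CANN dynamics as a population decoder: the network cleans up a
# noisy tuning-curve response and the surviving packet marks the estimate.
import numpy as np

rng = np.random.default_rng(0)
N = 100
x = np.arange(N)
d = np.abs(x[:, None] - x[None, :])
d = np.minimum(d, N - d)                       # periodic distance
w = 5.0 * np.exp(-d**2 / (2 * 4.0**2)) - 2.0   # same kernel as the bubble sketch

true_pos = 30.0
dd = np.minimum(np.abs(x - true_pos), N - np.abs(x - true_pos))
noisy = np.exp(-dd**2 / 32.0) + 0.3 * rng.standard_normal(N)   # noisy population response

u = noisy.copy()                   # noisy response as the initial network state
for t in range(500):
    r = (u > 0).astype(float)
    u += 0.1 * (-u + w @ r / N)    # relax with no external input

# Decode by the population vector (circular mean) of the surviving packet
r = (u > 0).astype(float)
ang = 2 * np.pi * x / N
decoded = (np.angle(np.sum(r * np.exp(1j * ang))) % (2 * np.pi)) * N / (2 * np.pi)
print(f"true position {true_pos}, decoded {decoded:.1f}")
```

The relaxation implements a fast template match: the packet survives where the noisy input best overlaps the network's preferred profile, while incoherent activity is suppressed by the global inhibition.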
CANNs (integrators) are stiff
... and drift and jump
TT, ICONIP'98
A modified CANN solves path integration
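A common shift mechanism (e.g. Zhang 1996; the notation is mine): add to the symmetric kernel a component proportional to its spatial derivative, gated by the velocity signal $v(t)$,

\[
w_{ij}(t) = w(x_i - x_j) + \gamma\, v(t)\, w'(x_i - x_j),
\]

which makes the packet move at a speed proportional to $v(t)$, so that its position integrates velocity over time.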
CANNs can learn dynamic motor primitives
Stringer, Rolls, TT & de Araujo, Neural Networks 16 (2003)
Drift is caused by asymmetries
CANNs can support multiple packets
Stringer, Rolls & TT, Neural Networks 17 (2004)
How many activity packets can be stable?
TT, Neural Information Processing - Letters and Reviews, Vol. 1 (2003)
Stabilization can be too strong
TT & Standage, CNS'04
CANNs can discover dimensionality
The model equations
(Equations lost in extraction; the slide labels them: continuous dynamics (leaky integrator), activity of node i, firing rate, synaptic efficacy matrix, global inhibition, visual input, time constant, scaling factor, connections per node, slope, threshold, NMDA-style stabilization, Hebbian learning.)
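A plausible reconstruction that covers every labelled quantity, modelled on Trappenberg's published SC/CANN models (the exact notation is an assumption):

\begin{align}
\tau \frac{du_i(t)}{dt} &= -u_i(t) + \frac{1}{N}\sum_j w_{ij}\, r_j(t) - C + I_i^{\mathrm{vis}}(t) + \psi\, r_i(t) \\
r_i(t) &= \frac{1}{1 + e^{-\beta\,(u_i(t) - \theta)}} \\
\Delta w_{ij} &= \eta\, r_i\, r_j
\end{align}

Here $u_i$ is the activity of node $i$, $r_i$ its firing rate, $w$ the synaptic efficacy matrix, $C$ the global inhibition, $I^{\mathrm{vis}}$ the visual input, $\tau$ the time constant, $1/N$ the scaling factor with $N$ connections per node, $\beta$ and $\theta$ the slope and threshold of the gain function, $\psi\, r_i$ the NMDA-style stabilization, and the last line the Hebbian learning rule.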