1
On Bubbles and Drifts: Continuous attractor networks and their relation to working memory, path integration, population decoding, attention, and motor functions
  • Thomas Trappenberg
  • Dalhousie University, Canada

2
CANNs can implement motor functions
Stringer, Rolls, Trappenberg & de Araujo, Self-organizing continuous attractor networks
and motor functions, Neural Networks 16 (2003).
3
My plans for this talk
  • Basic CANN model
  • Idiothetic CANN updates (path integration)
  • CANN motor functions
  • Limits on NMDA stabilization

4
Once upon a time ... (my CANN shortlist)
  • Wilson & Cowan (1973)
  • Grossberg (1973)
  • Amari (1977)
  • Sompolinsky & Hansel (1996)
  • Zhang (1997)
  • Stringer et al (2002)

Willshaw & von der Malsburg (1976)
Droulez & Berthoz (1988)
Redish, Touretzky, Skaggs, etc
5
Basic CANN: it's just a Hopfield net
Recurrent architecture
Synaptic weights
Nodes can be scrambled!
6
In mathematical terms
Updating network states (network dynamics)
Gain function
Weight kernel
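To make the notation on this slide concrete, here is a minimal numerical sketch of such a network: a 1D ring of nodes, a Gaussian excitatory kernel with uniform (global) inhibition, a sigmoidal gain function, and Euler integration of the leaky-integrator dynamics. All parameter values are illustrative choices, not values from the talk.

```python
import numpy as np

N = 100                        # nodes on a 1D ring (illustrative size)
tau, dt = 10.0, 1.0            # time constant and Euler step
beta, theta = 2.0, 2.0         # slope and threshold of the sigmoidal gain
A, C, sigma = 1.0, 0.3, 5.0    # kernel amplitude, global inhibition, kernel width

# Weight kernel: translation-invariant local excitation minus global inhibition
idx = np.arange(N)
dist = np.abs(idx[:, None] - idx[None, :])
dist = np.minimum(dist, N - dist)              # periodic (ring) distance
w = A * np.exp(-dist**2 / (2 * sigma**2)) - C

def gain(u):
    """Sigmoidal gain function mapping internal state u to firing rate r."""
    return 1.0 / (1.0 + np.exp(-beta * (u - theta)))

def step(u, I_ext=0.0):
    """One Euler step of the network dynamics: tau du/dt = -u + w @ r + I_ext."""
    r = gain(u)
    return u + dt / tau * (-u + w @ r + I_ext)
```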
7
The network can form bubbles of persistent activity (in Oxford English: activity packets)
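Continuing the sketch from the previous slide, a brief localized cue can be used to create such a packet, which then persists after the input is removed (parameter values again illustrative):

```python
# A transient localized cue creates a bubble that outlives the cue.
u = np.zeros(N)
cue = np.zeros(N)
cue[45:55] = 5.0                     # brief localized input around node 50

for _ in range(100):                 # cue period
    u = step(u, cue)
for _ in range(500):                 # delay period: no external input
    u = step(u)

r = gain(u)
print("persistent packet, centre of mass ~", (r * idx).sum() / r.sum())
```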
8
Space is represented with activity packets in the
hippocampal system
From Samsonovich & McNaughton, Path integration and cognitive mapping in a continuous attractor neural network model, J. Neurosci. 17 (1997).
9
Various gain functions are used
End states
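The talk does not spell out which gain functions were compared; a few commonly used choices in rate-based CANN models are, for example (using NumPy as in the earlier sketch):

```python
# Common gain-function choices in rate models (illustrative, not from the talk):
def sigmoid(u, beta=2.0, theta=2.0):
    return 1.0 / (1.0 + np.exp(-beta * (u - theta)))

def threshold_linear(u, theta=0.0):
    return np.maximum(u - theta, 0.0)

def step_function(u, theta=0.0):
    return (u > theta).astype(float)
```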
10
Superior colliculus integrates exogenous and
endogenous inputs
11
Superior Colliculus is a CANN
Trappenberg, Dorris, Klein & Munoz, A model of saccade initiation based on the competitive integration of exogenous and endogenous inputs, J. Cogn. Neurosci. 13 (2001).
12
Weights describe the effective interaction in
Superior Colliculus
Trappenberg, Dorris, Klein & Munoz, A model of saccade initiation based on the competitive integration of exogenous and endogenous inputs, J. Cogn. Neurosci. 13 (2001).
13
There are phase transitions in the
weight-parameter space
14
CANNs can be trained with Hebb
Hebb
Training pattern
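A minimal Hebbian training sketch, reusing N, idx and sigma from the earlier code and assuming the training patterns are Gaussian bumps placed at every location on the ring; the learning rate k is an illustrative choice:

```python
k = 0.01                                       # Hebbian learning rate (illustrative)
w_learned = np.zeros((N, N))
for centre in range(N):
    d_c = np.minimum(np.abs(idx - centre), N - np.abs(idx - centre))
    r_pat = np.exp(-d_c**2 / (2 * sigma**2))   # training pattern: bump at `centre`
    w_learned += k * np.outer(r_pat, r_pat)    # Hebb: dw_ij = k * r_i * r_j
```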
15
Normalization is important for the learning to converge
  • Random initial states
  • Weight normalization

[Figure: learned weights w(x,y) and the profile w(x,50) as a function of training time]
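One simple way to implement the weight normalization mentioned here, assuming row-wise normalization of the learned weights after each update (the slide only states that normalization is needed for convergence, not its exact form):

```python
# Row-wise normalization of the learned weights (assumed form).
row_norms = np.linalg.norm(w_learned, axis=1, keepdims=True)
w_learned /= np.where(row_norms > 0, row_norms, 1.0)
```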
16
Gradient-descent learning is also possible (Kechen Zhang)
Gradient descent with regularization = Hebb + weight decay
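A sketch of the "Hebb + weight decay" reading of regularized gradient descent, continuing the Hebbian training loop above; the decay constant is an illustrative choice:

```python
decay = 0.001                                  # weight-decay (regularization) constant
for centre in range(N):
    d_c = np.minimum(np.abs(idx - centre), N - np.abs(idx - centre))
    r_pat = np.exp(-d_c**2 / (2 * sigma**2))
    # Hebbian term plus weight decay, i.e. regularized gradient descent:
    w_learned += k * np.outer(r_pat, r_pat) - decay * w_learned
```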
17
CANNs have a continuum of point attractors
Point attractors and basin of attraction
Line of point attractors
Can be mixed: Rolls, Stringer & Trappenberg, A unified model of spatial and episodic memory, Proceedings of the Royal Society B 269:1087-1093 (2002).
18
CANNs work with spiking neurons
Xiao-Jing Wang, Trends in Neurosci. 24 (2001)
19
Shutting off also works in the rate model
[Figure: node activity over time]
20
CANNs (integrators) are stiff
21
and can drift and jump
Trappenberg, Dynamic cooperation and competition in a network of spiking neurons, ICONIP'98.
22
Neuroscience applications of CANNs
  • Persistent activity (memory) and winner-takes-all
    (competition)
  • Cortical networks (e.g. Wilson & Cowan, Sompolinsky, Grossberg)
  • Working memory (e.g. Compte, Wang, Brunel, Amit (?), etc.)
  • Oculomotor programming (e.g. Kopecz & Schöner, Trappenberg et al.)
  • Attention (e.g. Sompolinsky, Olshausen, Salinas & Abbott (?), etc.)
  • Population decoding (e.g. Wu et al., Pouget, Zhang, Deneve, etc.)
  • SOM (e.g. Willshaw & von der Malsburg)
  • Place and head direction cells (e.g. Zhang, Redish, Touretzky, Samsonovich, McNaughton, Skaggs, Stringer et al.)
  • Motor control (Stringer et al.)

[Diagram: basic CANN combined with a path-integration (PI) module]
23
Modified CANN solves path-integration
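One common way to add idiothetic updating to the basic model (e.g. in the spirit of Zhang 1997) is to let a velocity signal gate an asymmetric weight component that pushes the packet around the ring. In the sketch below, the gating strength g, the dimensionless velocity v, and the use of a one-node-shifted copy of w are assumptions, not details from the talk:

```python
# Velocity-gated asymmetric weights shift the packet (path-integration sketch).
w_shift = np.roll(w, -1, axis=1)     # same kernel, offset by one node

def step_with_velocity(u, v, g=0.5, I_ext=0.0):
    """Euler step with an idiothetic (velocity) signal v; v > 0 drifts the
    packet toward increasing node index, v = 0 reduces to the static model."""
    r = gain(u)
    h = w @ r + g * v * (w_shift @ r - w @ r)
    return u + dt / tau * (-u + h + I_ext)
```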
24
CANNs can implement motor functions
Stringer, Rolls, Trappenberg & de Araujo, Self-organizing continuous attractor networks
and motor functions, Neural Networks 16 (2003).
25
... learning motor sequences (e.g. speaking a word)
Experiment 1
Movement selector cells, motor cells, state cells
26
... from noisy examples
Experiment 2
State cells learning from noisy examples
27
... and reaching from different initial states
Experiment 3
Stringer, Rolls, Trappenberg & de Araujo, Self-organizing continuous attractor networks and motor functions, Neural Networks 16 (2003).
28
Drift is caused by asymmetries
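A tiny demonstration of this point, continuing the earlier sketch: perturbing the symmetric kernel with random noise (amplitude chosen arbitrarily) breaks the symmetry, and the packet tends to drift away from where the cue left it instead of staying put:

```python
# Asymmetric perturbation of the kernel -> the packet drifts.
rng = np.random.default_rng(0)
w_noisy = w + 0.05 * rng.standard_normal((N, N))

u = np.zeros(N)
u[45:55] = 5.0                                 # start with a packet around node 50
for _ in range(2000):                          # long delay, no input
    r = gain(u)
    u = u + dt / tau * (-u + w_noisy @ r)

r = gain(u)
print("packet centre after delay:", (r * idx).sum() / r.sum())  # compare with ~50
```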
29
CANNs can support multiple packets
Stringer, Rolls & Trappenberg, Self-organising continuous attractor networks with multiple activity packets, and the representation of space, Neural Networks 17 (2004).
30
How many activity packets can be stable?
Trappenberg, Why is our working memory capacity
so large? Neural Information Processing-Letters
and Reviews, Vol. 1 (2003)
31
Stabilization can be too strong
Trappenberg & Standage, Multi-packet regions in stabilized continuous attractor networks, submitted to CNS'04.
32
Conclusion
  • CANNs are widespread in neuroscience models (of the brain)
  • Short-term memory, feature selectivity (WTA)
  • Path integration is an elegant mechanism to generate dynamic sequences (self-organized)

33
With thanks to
  • Cognitive Neuroscience, Oxford Univ.
  • Edmund Rolls
  • Simon Stringer
  • Ivan de Araujo
  • Psychology, Dalhousie Univ.
  • Ray Klein
  • Physiology, Queen's Univ.
  • Doug Munoz
  • Mike Dorris
  • Computer Science, Dalhousie
  • Dominic Standage

34
CANNs can discover dimensionality
35
CANN with adaptive input strength explains
express saccades
36
CANNs are great for population decoding (fast pattern-matching implementation)
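As a concrete illustration of this use, here is a decoding sketch with the network from the earlier slides: a noisy population response is clamped as input, the network relaxes to a clean packet, and the packet position is read out with a circular mean. All details (settling time, readout) are assumptions of this sketch, not the talk's implementation:

```python
def decode(noisy_rates, settle_steps=300):
    """Relax the CANN with the noisy population response as input and
    read out the packet position (in radians) with a circular mean."""
    u = np.zeros(N)
    for _ in range(settle_steps):
        u = step(u, noisy_rates)
    r = gain(u)
    angles = 2 * np.pi * idx / N
    return np.angle(np.sum(r * np.exp(1j * angles))) % (2 * np.pi)
```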
37
John Lisman's hippocampus
38
The model equations
Continuous dynamics (leaky integrator):
\tau \frac{du_i(t)}{dt} = -u_i(t) + \frac{c}{N} \sum_j w_{ij} \, r_j(t) - C^{\mathrm{inh}} + I_i^{\mathrm{ext}}(t)
where u_i is the activity of node i, r_j the firing rate, w_{ij} the synaptic efficacy matrix, C^{\mathrm{inh}} the global inhibition, I_i^{\mathrm{ext}} the visual input, \tau the time constant, c a scaling factor, and N the number of connections per node.
Gain function (slope \beta, threshold \theta):
r_i = \frac{1}{1 + e^{-\beta (u_i - \theta)}}
NMDA-style stabilization
Hebbian learning:
\delta w_{ij} = k \, r_i \, r_j