1
Neural Networks
  • Course: 2L490
  • Lecturer: Rudolf Mak
  • E-mail: r.h.mak@tue.nl
  • Course notes: Neural Networks, by H.M.M. ten Eikelder
  • Webpage: Neurale Netwerken (2L490)

2
Today's topics
  • BNNs versus ANNs
    • computing power and future development
  • BNNs
    • quick overview
  • ANNs
    • correspondence (with BNNs)
    • neuron model
    • learning paradigms
    • models
    • applications

3
Neural Computing
  • Neuroscience
    • The objective is to understand the human brain
    • Biologically realistic models of neurons
    • Biologically realistic connection topologies
  • Neural networks
    • The objective is to develop computational methods
    • Highly simplified artificial neurons
    • Connection topologies aimed at computational effectiveness

4
Man versus Machine (hardware)
Numbers                   Human brain              Von Neumann computer (a.d. 2005)
elements                  10^10 - 10^12 neurons    10^7 - 10^8 transistors
connections / element     10^4                     10
switching frequency       10^3 Hz                  10^9 - 10^10 Hz
energy / operation        10^-16 Joule             10^-6 Joule
power consumption         10 Watt                  100 - 500 Watt
reliability of elements   low                      reasonable
reliability of system     high                     reasonable
5
Man versus Machine (information processing)
Features              Human brain    Von Neumann computer
Data representation   analog         digital
Memory localization   distributed    localized
Control               distributed    localized
Processing            parallel       sequential
Skill acquisition     learning       programming
6
Brain versus computer
  • The following two slides have been taken from a paper by Hans Moravec,
    "When will computer hardware match the human brain?"

7
(No Transcript)
8
(No Transcript)
9
Types of neurons (Kandel et al)
  • Sensory neurons
    • Carry information for the purpose of perception and motor coordination
  • Motor neurons
    • Carry commands to control muscles and glands
  • Interneurons
    • Relay or projection: long-distance signaling
    • Local: information processing

10
Biological Neuron
  • A neuron has four main regions:
    • Cell body (soma)
    • Dendrites
    • Axon
    • Presynaptic terminal (excitatory or inhibitory)

11
Signaling
  • All nerve cells signal in the same way, through a combination of
    electrical and chemical processes
  • Input component produces graded local signals
  • Trigger component initiates the action potential
  • Conductile component propagates the action potential
  • Output component releases neurotransmitter
  • All signaling is unidirectional

12
Spike (width 0.2 - 5 ms)
13
Pulse Trains
14
Some animations
For this topic we visit the website Neurobiology Home Page (Blackwell
Science). Subtopics:
  • Channel gating during the action potential
  • Propagation of the action potential
  • Neurotransmitter action
15
Summary of Neuron Firing Behavior
  • The behavior is binary: a neuron either fires or it does not
  • A neuron doesn't fire if the accumulated activity stays below the
    threshold
  • If the activity is above the threshold, the neuron fires (produces a
    spike)
  • The firing frequency increases with accumulated activity until the
    maximum firing frequency is reached
  • The firing frequency is limited by the refractory period of about
    1-10 ms

16
Organization of the Brain
Levels of organization, from largest to smallest (taken from The
Computational Brain by Churchland and Sejnowski):
  • Central nervous system
  • Interregional circuits
  • Local circuits
  • Neurons
  • Dendritic trees
  • Neural microcircuits
  • Synapses
  • Molecules
17
Neural Network
18
ANNs as a Computational Model
We can distinguish between sequential and parallel models of computation:
  • Sequential
    • Recursive functions (Church)
    • Turing machine (Turing)
    • Random Access Machine (von Neumann)
  • Parallel
    • P(arallel)RAM
    • Cellular automata (von Neumann)
    • Artificial neural nets (McCulloch/Pitts, Wiener)

19
Advantages of ANNs
  • Efficient
    • Inherently massively parallel
  • Robust
    • Can deal with incomplete and/or noisy data
  • Fault-tolerant
    • Still works when part of the net fails
  • User-friendly
    • Learning instead of programming

20
Disadvantages of ANNs
  • Difficult to design
    • There are no clear design rules for arbitrary applications
  • Hard or impossible to train
  • Difficult to assess internal operation
    • It is difficult to find out whether, and if so which, tasks are
      performed by different parts of the net
  • Unpredictable
    • It is difficult to estimate future network performance based on
      current (or training) behavior

21
BNN-ANN Correspondence
  • Nodes stand for the neuron body
  • Linear combiners model accumulation of synaptic
    stimuli
  • Nonlinear activation function models firing
    behavior
  • Connections stand for the dendrites and axons
  • Synapses are modeled by attaching weights to the
    connections
  • Positive weights for excitatory synapses
  • Negative weights for inhibitory synapses

22
Artificial Neuron
(figure: a linear combiner followed by a transfer function)
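The structure above (a linear combiner feeding a transfer function) can be sketched in Python; the sigmoid transfer and the function name are illustrative choices, not taken from the slides:

```python
import math

def artificial_neuron(weights, inputs, bias=0.0):
    """Linear combiner followed by a (sigmoid) transfer function."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias  # linear combiner
    return 1.0 / (1.0 + math.exp(-z))                       # transfer function
```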
23
Discrete asymmetric transfer
Heaviside step function: f(c, x) = (x > c ? 1 : 0)
Transfer functions are also called activation or squashing functions.
24
Discrete symmetric transfer
Sign function: f(x) = (x > 0 ? 1 : -1)
Used with bipolar state encoding.
25
Continuous asymmetric transfer
f(z) = 1 / (1 + e^(-cz))
sigmoid function (logistic function)
26
Continuous symmetric transfer
f(z) = (e^(cz) - e^(-cz)) / (e^(cz) + e^(-cz))
hyperbolic tangent
27
Piecewise-Linear Transfer
f(c, z) = (z < -c ? -1 : (z > c ? 1 : z / c))
28
Local transfer function
  • f(z) = (1 / sqrt(2π)) exp(-z² / 2), the standard normal density N(0, 1)
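The transfer functions of the preceding slides can be collected in a short Python sketch (the function names and the steepness default c = 1 are illustrative):

```python
import math

def heaviside(x, c=0.0):
    """Discrete asymmetric: 1 above threshold c, else 0."""
    return 1 if x > c else 0

def sign(x):
    """Discrete symmetric: used with bipolar state encoding."""
    return 1 if x > 0 else -1

def sigmoid(z, c=1.0):
    """Continuous asymmetric (logistic function)."""
    return 1.0 / (1.0 + math.exp(-c * z))

def tanh_transfer(z, c=1.0):
    """Continuous symmetric (hyperbolic tangent)."""
    return math.tanh(c * z)

def piecewise_linear(z, c=1.0):
    """Clipped ramp: -1 below -c, +1 above c, linear in between."""
    return -1.0 if z < -c else (1.0 if z > c else z / c)

def gaussian(z):
    """Local transfer: standard normal density."""
    return math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)
```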

29
Probabilistic Neurons
  • Neurons are in one of two states: x = 0 or x = 1
  • The transfer function P(z) only determines the probability of finding
    the output node in a certain state:
    • y = 1 with probability P(z)
    • y = 0 with probability 1 - P(z)
  • A common choice for P(z) is
    • P(z) = 1 / (1 + exp(-z / T))
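A probabilistic neuron of this kind can be sketched in Python; the injectable `rng` parameter is an illustrative convenience, not part of the model:

```python
import math
import random

def probabilistic_neuron(z, T=1.0, rng=random.random):
    """Output y = 1 with probability P(z) = 1/(1 + exp(-z/T)), else y = 0."""
    p = 1.0 / (1.0 + math.exp(-z / T))
    return 1 if rng() < p else 0
```

Passing a fixed `rng` makes the stochastic behavior reproducible for testing.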

30
Specific neuron models
  • McCulloch-Pitts neuron
    • Discrete (0/1) inputs
    • Heaviside activation function
    • Only weights +1 (excitatory) and -1 (inhibitory)
  • Adaline (Widrow & Hoff)
    • Continuous inputs
    • Identity (i.e. no) activation function
    • Continuous weights
    • x0 = 1, so w0 acts as the bias
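The two neuron models can be sketched in Python (the function names are illustrative):

```python
def mcculloch_pitts(inputs, weights, threshold):
    """Discrete 0/1 inputs, weights restricted to +1/-1, Heaviside activation."""
    z = sum(w * x for w, x in zip(weights, inputs))
    return 1 if z > threshold else 0

def adaline(inputs, weights):
    """Continuous inputs and weights, identity activation.

    A constant x0 = 1 is prepended so that weights[0] acts as the bias."""
    return sum(w * x for w, x in zip(weights, [1.0] + list(inputs)))
```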

31
Artificial Neural Networks
  • Layered net with
    • n input nodes
    • m output nodes
    • zero or more hidden layers (one shown)
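Such a layered net can be evaluated with a simple forward pass; the sigmoid activation and the list-of-weight-matrices representation are illustrative assumptions:

```python
import math

def forward_layer(x, W):
    """One layer: each weight row drives one sigmoid unit."""
    return [1.0 / (1.0 + math.exp(-sum(w * xi for w, xi in zip(row, x))))
            for row in W]

def feedforward(x, layers):
    """Propagate the input through all (hidden and output) layers."""
    for W in layers:
        x = forward_layer(x, W)
    return x
```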

32
ANN Models
  • Feedforward networks (FANN)
    • Single-layer perceptrons (SLP, SLFF) (Rosenblatt)
    • Multi-layer perceptrons (MLP, MLFF) (Rumelhart, ...)
    • Radial basis function networks (RBFN)
    • Functional link nets (FLN)
    • (Neo-)Cognitron (Fukushima)

33
ANN Models (continued)
  • Recurrent networks (RNN)
    • Hopfield networks (Hopfield, Amari)
    • Boltzmann machines (Hinton, Sejnowski)
    • Bidirectional associative memory (Kosko)
  • Competitive learning networks (CLN)
    • Simple competitive learning networks
    • Self-organizing feature maps (Kohonen)
    • Adaptive resonance theory (Grossberg)

34
Hebb's Postulate of Learning
  • Biological formulation
    • "When an axon of cell A is near enough to excite a cell B and
      repeatedly or persistently takes part in firing it, some growth
      process or metabolic change takes place in one or both cells such
      that A's efficiency, as one of the cells firing B, is increased."
  • Mathematical formulation
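A common mathematical reading of Hebb's postulate is Δw_i = η · x_i · y: a weight grows when presynaptic activity x_i and postsynaptic activity y coincide (the learning rate η and this simplest form are assumptions here). A minimal sketch:

```python
def hebb_update(w, x, y, eta=0.1):
    """Hebb's rule in its simplest form: delta w_i = eta * x_i * y."""
    return [w_i + eta * x_i * y for w_i, x_i in zip(w, x)]
```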

35
Hebb's Postulate Revisited
  • Stent (1973), and Changeux and Danchin (1976), have expanded Hebb's
    rule so that it also models inhibitory synapses
  • If two neurons on either side of a synapse are activated
    simultaneously (synchronously), then the strength of that synapse is
    selectively increased.
  • If two neurons on either side of a synapse are activated
    asynchronously, then that synapse is selectively weakened or
    eliminated.

36
Learning Methods
  • Supervised learning
  • Reinforcement learning
  • Corrective learning
  • Unsupervised learning
  • Competitive learning
  • Self-organizing learning
  • Off-line versus adaptive learning

37
Learning Tasks
  • Association
  • Classification
  • Clustering
  • Pattern recognition
  • Function approximation
  • Control
  • Adaptive filtering
  • Data compression
  • Prediction

38
Application areas (just a few)
  • Finance, Banking, Insurance
  • Loan approval, stock prediction, claim
    prediction, fraud detection
  • Business, Marketing
  • Sale prediction, customer profiling, data mining
  • Medicine
  • Diagnosis and treatment
  • Industry
  • Quality control
  • Machine/plant control
  • Telecommunication
  • Adaptive filtering (equalizing)
  • Speech recognition and synthesis

39
NN for setting target corn yields
40
(Optical) Character Recognition
41
Applications
42
Robocup Four-Legged League
43
Brief history
  • Early stages
  • 1943 McCulloch-Pitts neuron as computing element
  • 1948 Wiener cybernatics
  • 1949 Hebb learning rule
  • 1958 Rosenblatt perceptron
  • 1960 Widrow-Hoff least mean square algorithm
  • Recession
  • 1969 Minsky-Papert limitations perceptron model
  • Revival
  • 1982 Hopfield recurrent network model
  • 1982 Kohonen self-organizing maps
  • 1986 Rumelhart et. al. backpropagation

44
Literature
  • The authoritative text on neural science is
    Principles of Neural Science, fourth edition, eds. E.R. Kandel,
    J.H. Schwartz, T.M. Jessell, McGraw-Hill, 2000.
  • The authoritative text on neural networks is
    Neural Networks: A Comprehensive Foundation, second edition,
    Simon Haykin, Prentice-Hall, 1999.