1
Neural Networks I
  • Karel Berkovec
  • karel.berkovec (at) seznam.cz

Karel Berkovec, 2007
2
Artificial Intelligence
[Diagram: approaches within Artificial Intelligence]
  • Symbolic approach: expert systems, mathematical logic, production systems, Bayesian networks
  • Connectionist approach: neural networks
  • Adaptive approach: stochastic methods
  • Analytic approach: regression, interpolation, frequency analysis, ...
3
Is it really working?
  • Is it a standard mechanism?
  • What is it good for?
  • Does anyone use it for real applications?
  • Can I grasp how it works?
  • Can I use it?

4
This presentation
  • Basic introduction
  • Small history window
  • Model of neuron and neural network
  • Supervised learning (backpropagation)
  • Not covered: biology, mathematical foundations, unsupervised
    learning, stochastic models, neurocomputers, etc.

5
History I
  • 1940s: the von Neumann computer model
  • 1943: Warren McCulloch and Walter Pitts publish a
    mathematical model of the neuron
  • 1946: ENIAC
  • 1949: Donald Hebb, The Organization of Behavior
  • 1951: 1st Czechoslovak computer, SAPO
  • 1951: 1st neurocomputer, Snark
  • 1957: Frank Rosenblatt, the perceptron learning
    algorithm
  • 1958: Rosenblatt and Charles Wightman build the 1st
    practically used neurocomputer, Mark I Perceptron

6
History II
  • 1960s: ADALINE
  • 1st company focused on neurocomputing
  • The potential of the approach seemed exhausted
  • 1969: Marvin Minsky and Seymour Papert, Perceptrons
  • The XOR problem can't be solved by a single perceptron

7
History III
  • 1983: DARPA funding
  • 1982, 1984: John Hopfield, physical models of NNs
  • 1986: David Rumelhart, Geoffrey Hinton, and Ronald
    Williams popularize backpropagation
  • Earlier derivations: 1969 Arthur Bryson and Yu-Chi Ho,
    1974 Paul Werbos, 1985 David Parker
  • 1987: IEEE International Conference on Neural
    Networks
  • Since the 1990s: boom of NNs
  • ART, BAM, RBF, spiking neurons

8
Present
  • Many models of the neuron: perceptron, RBF, spiking
    neuron, ...
  • Many learning approaches: backpropagation, Hopfield
    learning, correlations, competitive learning,
    stochastic learning, ...
  • Many libraries and modules: for Matlab,
    Statistica, Excel, ...
  • Many applications: forecasting, smoothing,
    recognition, classification, data mining,
    compression, ...

9
Pros and cons
  • + Simple to use
  • + Very good results
  • + Fast results
  • + Robust against incomplete or corrupted inputs
  • + Generalization
  • +/- Mathematical background
  • - Not transparent or traceable
  • - Hard to tune parameters (sometimes hair-trigger
    sensitive)
  • - Learning sometimes takes a long time
  • - Some tasks are hard to formulate for NNs

10
Formal neuron - perceptron

The perceptron computes an inner potential as the weighted sum of its inputs and fires when the potential reaches its threshold:

    \xi = \sum_{i=1}^{n} w_i x_i, \qquad y = \begin{cases} 1 & \text{if } \xi \ge \theta \\ 0 & \text{otherwise} \end{cases}

where \xi is the potential, \theta the threshold, and w_i the weights.
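A minimal sketch of this neuron model in Python (the function name and the AND example are illustrative, not from the slides):

    # A perceptron: weighted sum of inputs compared against a threshold.
    def perceptron(inputs, weights, threshold):
        potential = sum(w * x for w, x in zip(weights, inputs))  # xi = sum of w_i * x_i
        return 1 if potential >= threshold else 0                # Heaviside step output

    # Example: weights and threshold chosen by hand to realize logical AND.
    print(perceptron([1, 1], [1.0, 1.0], 1.5))  # -> 1
    print(perceptron([0, 1], [1.0, 1.0], 1.5))  # -> 0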
11
AB problem
12
XOR problem
13
XOR problem
[Diagram: perceptron 1 realizes the first separating line for the XOR inputs]
14
XOR problem
[Diagram: perceptron 2 realizes the second separating line]
15
XOR problem
[Diagram: an AND neuron combines the outputs of perceptrons 1 and 2, solving XOR]
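A sketch of this two-layer construction in Python; the particular weights and thresholds are one possible hand-picked choice, not taken from the slides:

    # XOR from two perceptrons plus an AND output neuron.
    def fire(xs, ws, theta):
        return 1 if sum(w * x for w, x in zip(ws, xs)) >= theta else 0

    def xor(x1, x2):
        p1 = fire([x1, x2], [1.0, 1.0], 0.5)     # line 1: acts as OR
        p2 = fire([x1, x2], [-1.0, -1.0], -1.5)  # line 2: acts as NAND
        return fire([p1, p2], [1.0, 1.0], 1.5)   # output neuron: AND of p1, p2

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", xor(a, b))  # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0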
16
Feed-forward layered network
[Diagram: feed-forward layered network; from bottom to top: input layer, 1st hidden layer, 2nd hidden layer, output layer]
17
Activation functions
  • Heaviside (step) function
  • Saturated linear function
  • Standard sigmoid function
  • Hyperbolic tangent
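The standard definitions of these functions (our notation; \lambda is the sigmoid's slope parameter):

    \Theta(\xi) = \begin{cases} 1 & \xi \ge 0 \\ 0 & \xi < 0 \end{cases}, \qquad
    \mathrm{sat}(\xi) = \min\big(1, \max(0, \xi)\big), \qquad
    \sigma(\xi) = \frac{1}{1 + e^{-\lambda \xi}}, \qquad
    \tanh(\xi) = \frac{e^{\xi} - e^{-\xi}}{e^{\xi} + e^{-\xi}}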
18
NN function
  • An NN maps inputs to outputs
  • A feed-forward NN with one hidden layer and sigmoid
    activation functions can approximate any continuous
    function arbitrarily closely
  • The question is how to set the parameters of the
    network (see the form below)
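Written out, such a one-hidden-layer network computes (our notation, with hidden weights w_{ji}, output weights v_j, and sigmoid \sigma):

    f(x) = \sum_{j=1}^{h} v_j \, \sigma\Big( \sum_{i=1}^{n} w_{ji} x_i - \theta_j \Big)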

19
NN learning
  • Error function
  • Perceptron adaptation rule (see below)
  • An algorithm with this learning rule converges in
    finite time (if the classes A and B are linearly separable)

    w' = w + x \quad \text{if } y = 0,\ d = 1, \qquad w' = w - x \quad \text{if } y = 1,\ d = 0

where y is the perceptron's output and d the desired output for the pattern x.
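A sketch of the rule as a training loop in Python (the dataset and stopping criterion are illustrative):

    # Perceptron learning: add the pattern to the weights when the output is too
    # low (y = 0, d = 1), subtract it when too high (y = 1, d = 0).
    def train_perceptron(patterns, n_inputs, epochs=100):
        w = [0.0] * n_inputs
        theta = 0.0
        for _ in range(epochs):
            errors = 0
            for x, d in patterns:
                y = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0
                if y != d:
                    w = [wi + (d - y) * xi for wi, xi in zip(w, x)]
                    theta -= (d - y)  # the threshold moves opposite to the weights
                    errors += 1
            if errors == 0:  # finite-time convergence on separable data
                break
        return w, theta

    # Example: learn logical AND, which is linearly separable.
    data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
    print(train_perceptron(data, 2))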
20
AB problem
21
Backpropagation
  • The most frequently used learning algorithm for NNs
    (circa 80% of applications)
  • Fast convergence
  • Good results
  • Many modifications

22
Energy function
  • How do we adapt the weights of neurons in the hidden
    layers?
  • We would like to find a minimum of the error
    function
  • Why not use the derivative?
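A common form of this error function, summed over training patterns p and output neurons j (our notation):

    E(w) = \frac{1}{2} \sum_{p} \sum_{j} \big( y_j^{(p)} - d_j^{(p)} \big)^2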

23
Error gradient
  • Adaptation rule: move each weight a small step down
    the error gradient (see below)
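In formula form, with learning rate \eta (our notation):

    w_{ij}' = w_{ij} - \eta \frac{\partial E}{\partial w_{ij}}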

24
Output layer
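For an output neuron j, the gradient takes the standard backpropagation form (our notation, assuming the quadratic error above and an activation function f with potential \xi_j):

    \frac{\partial E}{\partial w_{ij}} = \delta_j y_i, \qquad \delta_j = (y_j - d_j)\, f'(\xi_j)

where y_i is the output of neuron i feeding into neuron j.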
25
Hidden layer
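For a hidden neuron j, the error signal is collected from the layer above (again our notation; k runs over the neurons that j feeds into):

    \delta_j = f'(\xi_j) \sum_{k} \delta_k w_{jk}, \qquad \frac{\partial E}{\partial w_{ij}} = \delta_j y_i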
26
Implementation of BP
  • initialize the network
  • repeat
  •   for all patterns
  •     compute the output
  •     compute the error
  •     compute the weight changes
  •   update the weights
  • until the error is small enough
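A compact sketch of this loop in Python/NumPy for the XOR task (one hidden layer of two sigmoid units; the layer sizes, learning rate, and stopping threshold are our choices):

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    D = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Initialize the network: 2 inputs -> 2 hidden -> 1 output; row 0 holds the bias.
    W1 = rng.normal(0.0, 1.0, (3, 2))
    W2 = rng.normal(0.0, 1.0, (3, 1))
    eta = 0.5  # learning rate

    for epoch in range(50000):
        # Forward pass over all four patterns at once (bias input fixed at 1).
        Xb = np.hstack([np.ones((4, 1)), X])
        H = sigmoid(Xb @ W1)              # hidden layer outputs
        Hb = np.hstack([np.ones((4, 1)), H])
        Y = sigmoid(Hb @ W2)              # network outputs
        # Backward pass: output deltas, then hidden deltas (sigmoid' = y(1 - y)).
        dY = (Y - D) * Y * (1 - Y)
        dH = (dY @ W2[1:].T) * H * (1 - H)
        # Update the weights a step down the error gradient.
        W2 -= eta * Hb.T @ dY
        W1 -= eta * Xb.T @ dH
        if np.mean((Y - D) ** 2) < 1e-3:  # until the error is small enough
            break

    print(np.round(Y, 2))  # approaches [[0], [1], [1], [0]]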

27
Improvements of BP
  • Momentum
  • Adaptive learning parameters
  • Other variants of BP: SuperSAB, QuickProp,
    Levenberg-Marquardt algorithm
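The momentum modification, for instance, reuses part of the previous weight change (standard form; \alpha is the momentum parameter, our notation):

    \Delta w_{ij}(t) = -\eta \frac{\partial E}{\partial w_{ij}} + \alpha \, \Delta w_{ij}(t-1)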

28
Overfitting