INFO331 Machine learning. Neural networks. Supervised learning in neural networks. MLP and BP - PowerPoint PPT Presentation

Description:

... Kasabov, Foundations of Neural Networks, Fuzzy Systems, ... in neural networks ... The neural network changes its connection weights during training. ...

Number of Views: 261
Avg rating: 3.0/5.0
Slides: 19
Provided by: comme52
Learn more at: http://sclab.yonsei.ac.kr
Transcript and Presenter's Notes



1
INFO331 Machine learning. Neural networks.
Supervised learning in neural networks. MLP and BP
  • (Textbook: section 2.11, pp. 146-155; section
    3.7.3, pp. 218-221; section 4.2, pp. 267-282;
    catch-up reading pp. 251-266)

2
Machine learning
  • Issues in machine learning
  • Learning from static versus learning from dynamic
    data
  • Incremental learning
  • On-line learning, adaptive learning
  • Life-long learning
  • Cognitive learning processes in humans

3
Inductive learning
  • learning from examples
  • Inductive decision trees and the ID3 algorithm
  • Information gain evaluation
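
The information-gain criterion used by ID3 can be made concrete in a few lines. A minimal sketch — the toy "windy"/"play" attribute and all names are illustrative choices, not from the textbook:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(examples, attribute, labels):
    """Entropy reduction obtained by splitting `examples` on `attribute`."""
    n = len(labels)
    gain = entropy(labels)
    # Partition the labels by the attribute's value, then subtract the
    # weighted entropy of each partition
    groups = {}
    for ex, y in zip(examples, labels):
        groups.setdefault(ex[attribute], []).append(y)
    for subset in groups.values():
        gain -= (len(subset) / n) * entropy(subset)
    return gain

# Toy data: "windy" perfectly predicts the class, so the gain is maximal (1 bit)
examples = [{"windy": "yes"}, {"windy": "yes"}, {"windy": "no"}, {"windy": "no"}]
labels = ["no", "no", "yes", "yes"]
print(information_gain(examples, "windy", labels))  # → 1.0
```

ID3 evaluates this gain for every candidate attribute and splits on the one with the largest value.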

4
Other methods of machine learning
  • Learning by doing
  • Learning from advice
  • Learning by analogy
  • Case-based learning and reasoning
  • Template-based learning (Kasabov and Clarke) -
    Iris example

5
Learning fuzzy rules from data
  • Cluster-based methods
  • Fuzzy template-based method (Kasabov, 1996),
    pp. 218-219
  • Wang's method (pp. 220-221)
  • Advantages and disadvantages

6
Supervised learning in neural networks
  • Supervised learning in neural networks
  • Perceptrons
  • Multilayer perceptrons (MLP) and the
    backpropagation algorithm
  • MLP as universal approximators
  • Problems and features of the MLP

7
Supervised learning in neural networks
  • The learning principle is to provide the input
    values and the desired output values for each of
    the training examples.
  • The neural network adjusts its connection weights
    during training.
  • Two errors are calculated:
  • training error - how well the NN has learned the
    training data
  • test error - how well the trained NN generalises
    to new input data.
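
Both errors are usually computed with the same error function, just over different example sets. A minimal sketch — the mean-squared-error choice and all numbers are illustrative, not from the text:

```python
def mean_squared_error(desired, actual):
    """Average squared difference between desired and actual outputs."""
    return sum((d - a) ** 2 for d, a in zip(desired, actual)) / len(desired)

# Training error: network outputs on the examples it was trained on
train_err = mean_squared_error([0.0, 1.0, 1.0], [0.1, 0.9, 0.8])
# Test error: outputs on new examples, measuring generalisation
test_err = mean_squared_error([1.0, 0.0], [0.6, 0.3])
```

A large gap between a low training error and a high test error is the usual symptom of overfitting, discussed later in these slides.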

8
Perceptrons
  • fig. 4.8

9
Perceptrons
  • fig. 4.9

10
Perceptrons
  • fig. 4.10
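
The figures above come from the textbook. As a companion, here is a hedged sketch of the classical perceptron learning rule (Rosenblatt's rule); the AND task, learning rate, and epoch count are illustrative assumptions:

```python
def step(x):
    """Threshold activation: fire (1) when the weighted sum is non-negative."""
    return 1 if x >= 0 else 0

def train_perceptron(samples, lr=0.1, epochs=20):
    """Perceptron rule: w += lr * (target - output) * x, plus a bias update."""
    n = len(samples[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, t in samples:
            y = step(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = t - y
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Logical AND is linearly separable, so the rule is guaranteed to converge
and_data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(and_data)
```

After training, `step(w·x + b)` reproduces AND on all four inputs. On a non-separable task such as XOR the rule never converges, which motivates the multilayer networks on the next slides.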

11
MLP and the backpropagation algorithm
  • fig. 4.11

12
MLP and the backpropagation algorithm
  • fig. 4.12

13
MLP and the backpropagation algorithm
  • fig. 4.13
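
Alongside the textbook figures, a hedged sketch of backpropagation on a tiny 2-2-1 MLP with sigmoid units may help. The network size, learning rate, and two-example task are illustrative assumptions, not the textbook's setup:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(0)
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # input -> hidden
b1 = [0.0, 0.0]
W2 = [random.uniform(-1, 1) for _ in range(2)]                      # hidden -> output
b2 = 0.0

def forward(x):
    """Forward pass: hidden activations h, then the single output y."""
    h = [sigmoid(sum(W1[j][i] * x[i] for i in range(2)) + b1[j]) for j in range(2)]
    y = sigmoid(sum(W2[j] * h[j] for j in range(2)) + b2)
    return h, y

def train_step(x, t, lr=0.5):
    """One forward pass, then propagate the output error back through both layers."""
    global b2
    h, y = forward(x)
    dy = (y - t) * y * (1 - y)                               # delta at the output unit
    dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(2)]  # deltas at the hidden units
    for j in range(2):
        W2[j] -= lr * dy * h[j]
        for i in range(2):
            W1[j][i] -= lr * dh[j] * x[i]
        b1[j] -= lr * dh[j]
    b2 -= lr * dy

data = [([0.0, 1.0], 1.0), ([1.0, 0.0], 0.0)]  # a tiny separable training set

def total_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

before = total_error()
for _ in range(500):
    for x, t in data:
        train_step(x, t)
after = total_error()
```

Repeated gradient steps drive the total squared error down; the hidden-layer deltas are what distinguish backpropagation from the single-layer perceptron rule.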

14
MLPs as statistical tools
  • An MLP with one hidden layer can approximate any
    continuous function to any desired accuracy
    (Hornik et al., 1989)
  • MLPs are multivariate non-linear regression models
  • MLPs can learn conditional probabilities
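
The Hornik et al. (1989) result concerns one-hidden-layer networks of the following form (standard notation, not necessarily the textbook's):

```latex
f(\mathbf{x}) \;=\; \sum_{j=1}^{m} v_j \,\sigma\!\left(\mathbf{w}_j^{\top}\mathbf{x} + b_j\right)
```

For any continuous target function $g$ on a compact set and any $\varepsilon > 0$, there exist a number of hidden units $m$ and weights $v_j, \mathbf{w}_j, b_j$ such that $|f(\mathbf{x}) - g(\mathbf{x})| < \varepsilon$ everywhere on that set. The theorem guarantees such weights exist; it does not say how many hidden units are needed or that backpropagation will find them.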

15
Problems and features of the MLP
  • How to choose the number of hidden nodes
  • Catastrophic forgetting
  • Introducing hints in neural networks
  • Overfitting (overlearning)
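
Overfitting is commonly detected by monitoring the error on held-out data and stopping training when that error starts to rise. A minimal early-stopping sketch — the patience parameter and the error sequence are illustrative assumptions:

```python
def early_stopping_epoch(val_errors, patience=3):
    """Epoch with the best validation error, stopping once it has not
    improved for `patience` consecutive epochs."""
    best, best_epoch, waited = float("inf"), 0, 0
    for epoch, e in enumerate(val_errors):
        if e < best:
            best, best_epoch, waited = e, epoch, 0
        else:
            waited += 1
            if waited >= patience:
                break
    return best_epoch

# Validation error falls, then rises as the network starts to overfit
val = [0.9, 0.6, 0.4, 0.35, 0.37, 0.41, 0.48]
print(early_stopping_epoch(val))  # → 3, the minimum before overfitting sets in
```

Training error typically keeps decreasing past that point, which is exactly the training/test gap described earlier.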

16
Problems and features of the MLP
  • Catastrophic forgetting
  • fig. 4.14

17
Problems and features of the MLP
  • Introducing hints
  • fig. 4.15

18
Problems and features of the MLP
  • Overfitting
  • fig. 4.16