CS621: Artificial Intelligence, Lecture 17: Feedforward Network (lecture 16 was on Adaptive Hypermedia: Debraj, Kekin and Raunak)

1
CS621: Artificial Intelligence, Lecture 17
Feedforward Network (lecture 16 was on Adaptive Hypermedia: Debraj, Kekin and Raunak)
  • Pushpak Bhattacharyya
  • Computer Science and Engineering Department
  • IIT Bombay

2
Machine Learning Basics
  • Learning from examples
  • e1, e2, e3 are +ve examples
  • f1, f2, f3 are -ve examples

3
Classification of Learning Paradigms
[Tree diagram: Learning branches into Statistical and Knowledge Based approaches; under these fall Learning From Analogy and Learning From Examples, with Neural Networks and Decision Trees as instances of Learning From Examples.]
4
Example: Loan Reliability Detection (Feature Vector)
  • Features for deciding if a person is reliable for granting a loan:
  • Age: numerical
  • Gender: categorical
  • Education: categorical
  • Salary: numerical
  • Family background: categorical
  • Loan history: categorical
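Such a record has to become a numeric vector before a network can use it. A minimal sketch, assuming one-hot encoding for the categorical fields (the category values below are illustrative, not from the lecture):

```python
# Sketch: encoding a loan applicant as a numeric feature vector.
# The category values below are illustrative, not from the lecture.
GENDERS = ["female", "male"]
EDUCATIONS = ["school", "graduate", "postgraduate"]

def one_hot(value, categories):
    """One-hot encode a categorical value against a fixed category list."""
    return [1.0 if value == c else 0.0 for c in categories]

def encode(age, gender, education, salary):
    # Numerical features pass through; categorical ones become one-hot.
    return ([float(age), float(salary)]
            + one_hot(gender, GENDERS)
            + one_hot(education, EDUCATIONS))

print(encode(35, "male", "graduate", 50000))
# [35.0, 50000.0, 0.0, 1.0, 0.0, 1.0, 0.0]
```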

5
Kolmogorov Theorem (1965)
  • (Informal statement) A Yes/No function can be computed by a 3-layer network of simple Yes/No computing elements.

6
3-Layer NN for XOR
[Network diagram: the output unit (threshold 0.5) computes XY' + X'Y by OR-ing two hidden units; each hidden unit (threshold 0.5) receives weights +1 and -1 from the inputs X and Y, computing XY' and X'Y respectively.]
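The network in the diagram can be checked mechanically. A small sketch using step (threshold) units with the slide's weights (+1/-1) and thresholds (0.5), where a unit fires when its weighted sum reaches its threshold:

```python
# Sketch: the slide's 3-layer XOR network with step (threshold) units.
# A unit fires (outputs 1) when its weighted sum reaches its threshold.
def fires(weighted_sum, threshold):
    return 1 if weighted_sum >= threshold else 0

def xor_net(x, y):
    h1 = fires(1*x - 1*y, 0.5)   # computes X AND NOT Y  (XY')
    h2 = fires(-1*x + 1*y, 0.5)  # computes NOT X AND Y  (X'Y)
    return fires(h1 + h2, 0.5)   # ORs the hidden units: XY' + X'Y

print([xor_net(x, y) for x in (0, 1) for y in (0, 1)])  # [0, 1, 1, 0]
```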
7
A popular universal dataset for testing learning algorithms: IRIS Data

Sepal Length  Sepal Width  Petal Length  Petal Width  Class
5.1           3.5          1.4           0.2          setosa
4.9           3.0          1.4           0.2          setosa
6.3           2.9          5.6           1.8          virginica
6.9           3.1          4.9           1.5          versicolor
5.5           2.3          4.0           1.3          versicolor
5.7           2.8          4.1           1.3          versicolor
6.3           3.3          6.0           2.5          virginica
5.7           2.8          4.1           1.3          versicolor
6.3           3.3          6.0           2.5          virginica
8
Machine Learning Basics (contd.)
  • Training: arrive at a hypothesis h based on the data seen.
  • Testing: present new data to h and test its performance.

[Diagram: the hypothesis h approximates the target concept c.]
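The training/testing regime can be sketched with a few of the IRIS rows from the earlier slide; the 80/20 split ratio is an illustrative choice, not from the lecture:

```python
import random

# Sketch: hold out part of the data for testing, fit the hypothesis h
# on the rest. Rows are IRIS samples from the earlier slide; the 80/20
# ratio is an illustrative choice.
rows = [
    (5.1, 3.5, 1.4, 0.2, "setosa"),
    (4.9, 3.0, 1.4, 0.2, "setosa"),
    (6.3, 2.9, 5.6, 1.8, "virginica"),
    (6.9, 3.1, 4.9, 1.5, "versicolor"),
    (5.5, 2.3, 4.0, 1.3, "versicolor"),
]
random.seed(0)                 # fixed seed for reproducibility
random.shuffle(rows)           # shuffle before splitting
split = int(0.8 * len(rows))   # 80% for training
train, test = rows[:split], rows[split:]
print(len(train), len(test))   # 4 1
```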
9
Feedforward Network
10
Limitations of the Perceptron
  • Non-linear separability is all-pervading.
  • A single perceptron does not have enough computing power.
  • E.g., XOR cannot be computed by a perceptron.

11
Solutions
  • Tolerate error (e.g., the pocket algorithm used by connectionist expert systems): try to get the best possible hyperplane using only perceptrons.
  • Use higher-dimensional surfaces, e.g., degree-2 surfaces like the parabola.
  • Use a layered network.

12
Pocket Algorithm
  • Evolved in 1985; essentially uses the PTA (Perceptron Training Algorithm).
  • Basic idea:
  • Always preserve the best weights obtained so far in the "pocket".
  • Replace the pocketed weights only if better ones are found (i.e., the changed weights result in reduced error).
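The idea above can be sketched in a few lines. This is a minimal illustration of the pocket idea on a toy AND dataset, not a faithful reproduction of the original formulation:

```python
# Sketch of the pocket idea: run the perceptron training rule, but keep
# ("pocket") the best weight vector seen so far, judged by training error.
# Toy AND dataset; not the original 1985 formulation.
def dot(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

def errors(w, data):
    """Number of misclassified examples under threshold-at-zero prediction."""
    return sum(1 for x, y in data if (1 if dot(w, x) >= 0 else 0) != y)

def pocket(data, epochs=100):
    w = [0.0] * len(data[0][0])
    best_w, best_err = list(w), errors(w, data)
    for _ in range(epochs):
        for x, y in data:
            pred = 1 if dot(w, x) >= 0 else 0
            if pred != y:
                # Standard perceptron update on a misclassified example.
                sign = 1 if y == 1 else -1
                w = [wi + sign * xi for wi, xi in zip(w, x)]
                err = errors(w, data)
                if err < best_err:                 # pocket the improvement
                    best_w, best_err = list(w), err
    return best_w

# AND of two inputs; the first component of each x is a bias feature.
data = [([1, 0, 0], 0), ([1, 0, 1], 0), ([1, 1, 0], 0), ([1, 1, 1], 1)]
w = pocket(data)
print(errors(w, data))  # 0 (AND is linearly separable)
```

For linearly separable data like AND the pocket ends up error-free; on non-separable data it returns the best hyperplane encountered, which is the point of the algorithm.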

13
XOR Using 2 Layers
  • A non-linearly-separable function expressed as a linearly separable function of individual linearly separable functions.

14
Example: XOR
[Diagram, calculation of XOR: an output unit with threshold 0.5 and weights w1 = 1, w2 = 1 combines two hidden units computing x1x2' and x1'x2.]

Calculation of x1'x2:

x1  x2  x1'x2
0   0   0
0   1   1
1   0   0
1   1   0

[Diagram: a unit with threshold 1 and input weights w1 = -1 (from x1), w2 = 1.5 (from x2) computes x1'x2.]
15
Example: XOR (complete network)
[Diagram: the output unit (threshold 0.5, weights w1 = 1, w2 = 1) takes its inputs from two hidden units, each with threshold 1: one computes x1x2' (weights 1.5 from x1, -1 from x2), the other x1'x2 (weights -1 from x1, 1.5 from x2).]
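The weights in this example can be verified mechanically. A short check, assuming a unit fires when its weighted sum reaches its threshold:

```python
# Sketch: verifying the 2-layer XOR weights from the slides.
# Hidden units (threshold 1) compute x1.x2' and x1'.x2; the output
# unit (threshold 0.5, weights 1 and 1) ORs them, giving XOR.
def fires(weighted_sum, threshold):
    return 1 if weighted_sum >= threshold else 0

def xor2(x1, x2):
    h1 = fires(1.5 * x1 - 1.0 * x2, 1)   # x1 AND NOT x2
    h2 = fires(-1.0 * x1 + 1.5 * x2, 1)  # NOT x1 AND x2
    return fires(1 * h1 + 1 * h2, 0.5)   # OR of the hidden units

print([xor2(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 0]
```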
16
Some Terminology
  • A multilayer feedforward neural network has:
  • an input layer,
  • an output layer, and
  • hidden layers (which assist computation).
  • Output units and hidden units are called computation units.
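A forward pass through such a network can be sketched as follows; the sigmoid activation and the toy 2-2-1 weights are illustrative assumptions, not from the slides:

```python
import math

# Sketch of a feedforward pass: each computation layer multiplies its
# input by a weight matrix, adds biases, and applies an activation.
# The input layer just holds data; hidden and output units compute.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def layer(inputs, weights, biases):
    # One computation layer: weighted sums followed by a sigmoid.
    return [sigmoid(sum(w * i for w, i in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

def forward(x, network):
    # network is a list of (weights, biases) pairs, one per layer.
    for weights, biases in network:
        x = layer(x, weights, biases)
    return x

# Illustrative 2-2-1 network; the weights are made up, not trained.
net = [([[1.0, -1.0], [-1.0, 1.0]], [0.0, 0.0]),  # hidden layer
       ([[1.0, 1.0]], [-0.5])]                    # output layer
print(forward([1.0, 0.0], net))  # a single value in (0, 1)
```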