1
Topics in Machine Learning
  • 4th lecture
  • Perceptron

2
Definition
  • motivated by the biological neuron

3
Definition
  • the basic model

[Diagram: the inputs are multiplied by the weights w1, w2, w3, ..., wn and summed to the activation w^T x; the threshold/bias b is subtracted and the result passed through H, giving the output H(w^T x - b). H, the perceptron function, is the Heaviside function.]
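A minimal Python sketch of this basic model (an illustration, not from the slides; the function names are mine):

```python
import numpy as np

def heaviside(a):
    # Heaviside step function: 1 if a >= 0, else 0
    return np.where(a >= 0, 1, 0)

def perceptron_output(w, b, x):
    # perceptron output H(w^T x - b)
    return heaviside(np.dot(w, x) - b)

# example: a perceptron computing the logical AND of two binary inputs
w = np.array([1.0, 1.0])
b = 1.5
print(perceptron_output(w, b, np.array([1, 1])))  # 1
print(perceptron_output(w, b, np.array([0, 1])))  # 0
```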
4
Definition
  • geometry: linear separation boundary

[Figure: the linear separation boundary w^T x = b in two dimensions, with normal vector w and axis intercepts b/w1 and b/w2.]
5
Task
  • learn a binary classification f: R^n → {0,1}
  • given examples (x,y) in R^n × {0,1},
    positive/negative examples
  • evaluation: mean number of misclassifications on
    a test set
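A small sketch of this evaluation measure (illustration only; the function name is mine):

```python
import numpy as np

def test_error(predictions, labels):
    # mean number of misclassifications on a test set
    return np.mean(np.asarray(predictions) != np.asarray(labels))

print(test_error([1, 0, 1, 1], [1, 1, 1, 0]))  # 0.5
```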

6
Some basics
  • we can simulate the bias by an on-neuron
  • H(w^T x - b) = H((w,-b)^T (x,1) - 0)
  • for any finite set, we can assume that no point
    lies on the boundary
  • we can assume that a solution classifies all
    points correctly with margin 1
  • margin = min_x |w^T x|
  • we know |w^T x| ≥ ε,
  • hence |(w/ε)^T x| ≥ 1

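A short Python sketch (illustration, not from the slides) of simulating the bias by an on-neuron, i.e. appending a constant input 1 and folding -b into the weight vector:

```python
import numpy as np

def heaviside(a):
    return np.where(a >= 0, 1, 0)

w = np.array([2.0, -1.0, 0.5])
b = 0.7
x = np.array([0.3, 1.2, -0.4])

# original formulation: H(w^T x - b)
out1 = heaviside(np.dot(w, x) - b)

# bias simulated by an on-neuron: augment x with 1 and w with -b
w_aug = np.append(w, -b)
x_aug = np.append(x, 1.0)
out2 = heaviside(np.dot(w_aug, x_aug))

print(out1 == out2)  # True: both formulations give the same output
```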
7
Perceptron learning algorithm
Rosenblatt, 1962
  • simulate the bias as an on-neuron
  • define the error signal δ(w,x)

init w
repeat while some x with δ(w,x) ≠ 0 exists:
    w ← w + δ(w,x)·x
example → blackboard
Hebbian learning
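A runnable Python version of this algorithm (a sketch under the standard assumption that the error signal is δ(w,x) = y - H(w^T x) for a labelled example (x,y) with y ∈ {0,1}; the slides leave the exact definition to the blackboard):

```python
import numpy as np

def heaviside(a):
    return 1 if a >= 0 else 0

def perceptron_train(X, y, max_iter=1000):
    """Perceptron learning on examples X (m x n array) with 0/1 labels y."""
    # simulate the bias by an on-neuron: append a constant 1 to every example
    X = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(X.shape[1])                        # init w
    for _ in range(max_iter):
        updated = False
        for xi, yi in zip(X, y):
            delta = yi - heaviside(np.dot(w, xi))   # error signal delta(w, x)
            if delta != 0:                          # misclassified example
                w = w + delta * xi                  # w <- w + delta(w, x) * x
                updated = True
        if not updated:                             # no misclassified x is left
            break
    return w

# example: the logical OR problem is linearly separable
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 1])
print(perceptron_train(X, y))
```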
8
General
Hebbian learning (→ psychology, D.O. Hebb): increase
the connection strength for similar signals and
decrease the strength for dissimilar signals
  • weight adaptation for the perceptron learning
    rule for misclassified examples

w ← w + δ(w,x)·x
9
Perceptron convergence theorem
  • Theorem: The perceptron algorithm converges after
    a finite number of steps if a solution exists.
  • Proof: Assume w is a solution with w^T x ≥ 1 for
    all x. Denote by w_k the weights in the kth step
    of the algorithm.
  • Show by induction:
  • w^T w_k ≥ w^T w_0 + k (scalar product with the
    solution becomes larger)
  • ||w_k||^2 ≤ ||w_0||^2 + k · max_x ||x||^2 (length
    is restricted) → blackboard
  • Hence
  • w^T w_0 + k ≤ w^T w_k ≤ ||w|| · ||w_k||
    ≤ ||w|| · (||w_0||^2 + k · max_x ||x||^2)^(1/2)

Cauchy-Schwarz
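Combining the two bounds gives an explicit limit on the number of update steps. A short derivation (my addition, assuming for simplicity the initialisation w_0 = 0 and writing R = max_x ||x||):

```latex
% With w_0 = 0 the two induction results read
%   w^T w_k \ge k   and   \|w_k\|^2 \le k R^2 .
% Chaining them via Cauchy-Schwarz:
\begin{align*}
  k \;\le\; w^T w_k \;\le\; \|w\|\,\|w_k\| \;\le\; \|w\|\,\sqrt{k}\,R
  \quad\Longrightarrow\quad
  k \;\le\; \|w\|^2 R^2 ,
\end{align*}
% so the algorithm can perform at most \|w\|^2 R^2 weight updates.
```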
10
Perceptron convergence theorem
  • This yields two graphs

[Plot over the number of steps k: the linear lower bound w^T w_0 + k and the square-root upper bound ||w|| · (||w_0||^2 + k · max_x ||x||^2)^(1/2); past their intersection the bounds contradict each other, so the algorithm must have converged before this point.]
11
Perceptron - theory
  • For a solvable training problem:
  • the perceptron algorithm converges,
  • the number of steps can be exponential,
  • alternative formulation: linear programming (find
    x which solves Ax ≥ b) → polynomial algorithms
    exist (Khachiyan/Karmarkar; on average, the
    simplex method is also good); see the sketch
    after this list
  • generalization ability scales with the input
    dimension (→ learning theory, later session)
  • Only linearly separable problems can be solved
    with the perceptron → linear classification
    boundary.
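As an illustration of the linear-programming reformulation (my sketch; the concrete constraint form A w ≥ 1 over sign-flipped, bias-augmented examples is an assumption about the setup), using scipy.optimize.linprog as an off-the-shelf solver:

```python
import numpy as np
from scipy.optimize import linprog

# rows a_i with the requirement a_i^T w >= 1, where a_i = (x_i, 1) for positive
# examples and -(x_i, 1) for negative ones (here: the logical OR data)
A = np.array([
    [ 0.0,  0.0, -1.0],   # negative example (0,0), sign-flipped
    [ 0.0,  1.0,  1.0],   # positive example (0,1)
    [ 1.0,  0.0,  1.0],   # positive example (1,0)
    [ 1.0,  1.0,  1.0],   # positive example (1,1)
])

# pure feasibility problem: zero objective, constraints A w >= 1  <=>  (-A) w <= -1
res = linprog(c=np.zeros(3), A_ub=-A, b_ub=-np.ones(4), bounds=[(None, None)] * 3)
print(res.success, res.x)   # a separating weight vector (last entry: on-neuron bias)
```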

12
Perceptron - theory
  • Problems which are not linearly separable
  • e.g. XOR
  • the perceptron algorithm cannot find a solution,
    but a cycle will be observed (perceptron-cycle
    theorem, i.e. the same weight will be observed
    twice during the algorithm)
  • if the examples are chosen randomly, a solution
    that is as good as possible is found after some
    time → pocket algorithm: store the best solution
    seen so far (see the sketch below)
  • finding an optimum solution in the presence of
    errors is NP-hard (it cannot even be approximated
    to within any given constant)

example → blackboard
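A minimal Python sketch of the pocket idea (an illustration; the function names are mine): run perceptron updates on randomly chosen examples and keep in the "pocket" the weight vector with the fewest training misclassifications seen so far.

```python
import numpy as np

def heaviside(a):
    return 1 if a >= 0 else 0

def errors(w, X, y):
    # number of misclassified examples under weights w
    return sum(heaviside(np.dot(w, xi)) != yi for xi, yi in zip(X, y))

def pocket(X, y, steps=1000, seed=0):
    rng = np.random.default_rng(seed)
    X = np.hstack([X, np.ones((len(X), 1))])    # on-neuron for the bias
    w = np.zeros(X.shape[1])
    best_w, best_err = w.copy(), errors(w, X, y)
    for _ in range(steps):
        i = rng.integers(len(X))                # choose an example at random
        delta = y[i] - heaviside(np.dot(w, X[i]))
        w = w + delta * X[i]                    # ordinary perceptron update
        err = errors(w, X, y)
        if err < best_err:                      # keep the best weights in the pocket
            best_w, best_err = w.copy(), err
    return best_w, best_err

# XOR is not linearly separable; the pocket still returns the best weights found
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])
w, err = pocket(X, y)
print(err)   # misclassifications of the pocket weights (optimum for XOR: 1 of 4)
```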
13
Perceptron - history
  • '43 McCulloch/Pitts propose artificial neurons
    and show the universal computation ability of
    circuits of neurons
  • '49 Hebb paradigm proposed
  • '58 Rosenblatt-perceptron (= perceptron + fixed
    preprocessing with masks), learning algorithm,
    used for picture recognition
  • '60 Widrow/Hoff: Adaline, company
    Memistor Corporation
  • '69 Minsky/Papert show the restrictions of the
    Rosenblatt-perceptron with respect to its
    representational abilities
  • → we need more powerful systems
  • Werbos: backpropagation, Vapnik: support vector
    machine