Derivation of Recursive Least Squares
(Transcript of a PowerPoint presentation)
1
Derivation of Recursive Least Squares

Given that $y_i = \phi_i^T \theta + e_i$, and that $\Phi_n = [\phi_1, \ldots, \phi_n]^T$, $Y_n = [y_1, \ldots, y_n]^T$ is the collection of the first $n$ observations, the least squares solution is

\hat{\theta}_n = (\Phi_n^T \Phi_n)^{-1} \Phi_n^T Y_n

Now what happens when we increase $n$ by 1? When a new data point comes in, we need to re-estimate $\hat{\theta}_{n+1}$; this requires repeating the calculations and recalculating the inverse (expensive in computer time and storage).
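The batch solution can be sketched numerically; the data, noise level, and variable names below are illustrative choices, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = np.array([2.0, -1.0])

# Regressor matrix Phi (n x 2) and observations y = Phi @ theta + noise
Phi = rng.standard_normal((100, 2))
y = Phi @ theta_true + 0.01 * rng.standard_normal(100)

# Batch least squares: theta_hat = (Phi^T Phi)^{-1} Phi^T y
theta_hat = np.linalg.solve(Phi.T @ Phi, Phi.T @ y)

# When one new data point arrives, the batch approach must redo the
# whole normal-equation solve from scratch:
phi_new, y_new = rng.standard_normal(2), 0.0
Phi2 = np.vstack([Phi, phi_new])
y2 = np.append(y, y_new)
theta_hat2 = np.linalg.solve(Phi2.T @ Phi2, Phi2.T @ y2)
```

The repeated solve is what the recursion in the following slides avoids.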
2
Let's look at the expressions $\Phi_n^T \Phi_n = \sum_{i=1}^{n} \phi_i \phi_i^T$ and $\Phi_n^T Y_n = \sum_{i=1}^{n} \phi_i y_i$, and define

P_n = (\Phi_n^T \Phi_n)^{-1}, \qquad B_n = \Phi_n^T Y_n
3
(1)  P_n^{-1} = P_{n-1}^{-1} + \phi_n \phi_n^T
(2)  B_n = B_{n-1} + \phi_n y_n

The least squares estimate at data $n$:

(3)  \hat{\theta}_n = P_n B_n
(4)  B_{n-1} = P_{n-1}^{-1} \hat{\theta}_{n-1}

( Substitute (4) into (3), using (2) )
\hat{\theta}_n = P_n (B_{n-1} + \phi_n y_n) = P_n (P_{n-1}^{-1} \hat{\theta}_{n-1} + \phi_n y_n)

( Applying (1) )
\hat{\theta}_n = P_n \big( (P_n^{-1} - \phi_n \phi_n^T) \hat{\theta}_{n-1} + \phi_n y_n \big) = \hat{\theta}_{n-1} + P_n \phi_n (y_n - \phi_n^T \hat{\theta}_{n-1})
4
The RLS equations are

\hat{\theta}_n = \hat{\theta}_{n-1} + P_n \phi_n (y_n - \phi_n^T \hat{\theta}_{n-1})
P_n^{-1} = P_{n-1}^{-1} + \phi_n \phi_n^T    (8)

But we still require a matrix inverse to be calculated in (8).

Matrix Inversion Lemma: if $A$, $C$, and $A + BCD$ are nonsingular square matrices (the inverses exist), then

(A + BCD)^{-1} = A^{-1} - A^{-1} B (C^{-1} + D A^{-1} B)^{-1} D A^{-1}
5
The best way to prove this is to multiply both sides by $A + BCD$ and check that the product is the identity.

Now, in (8), identify $A = P_{n-1}^{-1}$, $B = \phi_n$, $C = 1$, $D = \phi_n^T$.
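The lemma can be checked numerically for the rank-one case used in (8); the particular matrices below are arbitrary illustrations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Rank-one update case of the lemma: A = P^{-1}, B = phi, C = 1, D = phi^T
P = 5.0 * np.eye(3)                # stands in for P_{n-1}, positive definite
phi = rng.standard_normal((3, 1))  # new regressor, as a column vector

A = np.linalg.inv(P)
lhs = np.linalg.inv(A + phi @ phi.T)  # direct inverse of P_n^{-1}
# Right-hand side of the lemma: no matrix inverse, only a scalar division
rhs = P - (P @ phi @ phi.T @ P) / (1.0 + (phi.T @ P @ phi).item())
```

Both sides agree to machine precision, confirming that the scalar denominator replaces the matrix inversion.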
6
The matrix inversion lemma is very important in converting LS into RLS. To prove the above, expand the product:

(A + BCD)\big[A^{-1} - A^{-1} B (C^{-1} + D A^{-1} B)^{-1} D A^{-1}\big]
= I + BCDA^{-1} - (B + BCDA^{-1}B)(C^{-1} + DA^{-1}B)^{-1} DA^{-1}
= I + BCDA^{-1} - BC(C^{-1} + DA^{-1}B)(C^{-1} + DA^{-1}B)^{-1} DA^{-1}
= I + BCDA^{-1} - BCDA^{-1} = I
7
RLS equations are
In practice, this recursive formula can be
initiated by setting to a large diagonal
matrix, and by letting be your best first
guess.
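A minimal sketch of the recursion, assuming a simple linear regression with invented data; `rls_step` is a name chosen here, not from the slides:

```python
import numpy as np

def rls_step(theta, P, phi, y):
    """One RLS update: P_n via the inversion-lemma form, then the estimate."""
    P = P - np.outer(P @ phi, phi @ P) / (1.0 + phi @ P @ phi)
    theta = theta + P @ phi * (y - phi @ theta)
    return theta, P

rng = np.random.default_rng(2)
theta_true = np.array([1.5, -0.5])
Phi = rng.standard_normal((200, 2))
y = Phi @ theta_true + 0.01 * rng.standard_normal(200)

# Initialise with a large diagonal P_0 and a zero first guess, as the slide suggests
theta, P = np.zeros(2), 1e6 * np.eye(2)
for phi_i, y_i in zip(Phi, y):
    theta, P = rls_step(theta, P, phi_i, y_i)

# After all points, the recursive estimate matches the batch solution
batch = np.linalg.solve(Phi.T @ Phi, Phi.T @ y)
```

The large $P_0$ makes the recursion start almost uninformed, so after all the data the recursive and batch estimates agree closely.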
8
RLS with forgetting We would like to modify the
recursive least squares algorithm so that older
data has less effect on the coefficient
estimation. This could be done by biasing the
objective function that we are trying to minimise
(i.e. the squared error) This same weighting
function when used on an ARMAX model can be used
to bias the calculation of the Pn matrix giving
more recent values greater prominence, as
follows. where ? is chosen to be between 0
and 1.
9
When ? is 1 all time steps are of equal
importance but as ? smaller less emphasis is
given to older values. We can use this expression
to derive a recursive form of weighted
The Matrix inversion lemma will then give a
method of calculating given to
get
10
RLS Algorithm with forgetting factor

K_n = \frac{P_{n-1} \phi_n}{\lambda + \phi_n^T P_{n-1} \phi_n}
\hat{\theta}_n = \hat{\theta}_{n-1} + K_n (y_n - \phi_n^T \hat{\theta}_{n-1})
P_n = \frac{1}{\lambda} \left( P_{n-1} - K_n \phi_n^T P_{n-1} \right)
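The forgetting-factor algorithm can be sketched as follows; the drifting-parameter data is invented here to show why forgetting helps, and `rls_forget` is an illustrative name:

```python
import numpy as np

def rls_forget(theta, P, phi, y, lam=0.98):
    """One RLS update with forgetting factor lam (0 < lam <= 1)."""
    K = P @ phi / (lam + phi @ P @ phi)    # gain K_n
    theta = theta + K * (y - phi @ theta)  # estimate update
    P = (P - np.outer(K, phi @ P)) / lam   # covariance update
    return theta, P

rng = np.random.default_rng(3)
theta, P = np.zeros(2), 1e6 * np.eye(2)

# The true parameters jump halfway through; discounting old data
# lets the estimate follow the change instead of averaging over it.
for i in range(400):
    true = np.array([1.0, 2.0]) if i < 200 else np.array([-1.0, 0.5])
    phi = rng.standard_normal(2)
    y = phi @ true + 0.01 * rng.standard_normal()
    theta, P = rls_forget(theta, P, phi, y)
```

With $\lambda = 0.98$ the effective memory is roughly $1/(1-\lambda) = 50$ samples, so 200 steps after the jump the estimate reflects the new parameters.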