Title: An Introduction to Adaptive Filtering and Its Applications
1 An Introduction to Adaptive Filtering and Its Applications
- By
- Asst. Prof. Dr. Thamer M. Jamel
- Department of Electrical Engineering
- University of Technology
- Baghdad, Iraq
2 Introduction
- Linear filters
- the filter output is a linear function of the filter input
- Design methods
- The classical approach
- frequency-selective filters such as low-pass / band-pass / notch filters, etc.
- Optimal filter design
- mostly based on minimizing the mean-square value of the error signal
3 Wiener filter
- Based on the work of Wiener in 1942 and Kolmogorov in 1939
- It relies on a priori statistical information
- When such a priori information is not available, which is usually the case, it is not possible to design a Wiener filter in the first place.
4 Adaptive filter
- The signal and/or noise characteristics are often nonstationary, and the statistical parameters vary with time
- An adaptive filter has an adaptation algorithm that is meant to monitor the environment and vary the filter transfer function accordingly
- Based on the actual signals received, it attempts to find the optimum filter design
5 Adaptive filter
- The basic operation now involves two processes
- 1. a filtering process, which produces an output signal in response to a given input signal
- 2. an adaptation process, which aims to adjust the filter parameters (filter transfer function) to the (possibly time-varying) environment
- Often, the (average) square value of the error signal is used as the optimization criterion
6 Adaptive filter
- Because of the complexity of the optimizing algorithms, most adaptive filters are digital filters that perform digital signal processing
- When processing analog signals, the adaptive filter is then preceded by A/D and D/A converters.
7 Adaptive filter
- The generalization to adaptive IIR filters leads to stability problems
- It is common to use an FIR digital filter with adjustable coefficients
8 Applications of Adaptive Filters: Identification
- Used to provide a linear model of an unknown plant
- Applications
- System identification
9 Applications of Adaptive Filters: Inverse Modeling
- Used to provide an inverse model of an unknown plant
- Applications
- Equalization (communications channels)
10 Applications of Adaptive Filters: Prediction
- Used to provide a prediction of the present value of a random signal
- Applications
- Linear predictive coding
11 Applications of Adaptive Filters: Interference Cancellation
- Used to cancel unknown interference from a primary signal
- Applications
- Echo / noise cancellation
- hands-free car phone, aircraft headphones, etc.
12 Example: Acoustic Echo Cancellation
15 LMS Algorithm
- The most popular adaptation algorithm is LMS
- Define the cost function as the mean-squared error
- Based on the method of steepest descent
- Move towards the minimum on the error surface to get to the minimum
- The gradient of the error surface is estimated at every iteration
16 LMS Adaptive Algorithm
- Introduced by Widrow and Hoff in 1959
- Simple; no matrix calculations involved in the adaptation
- In the family of stochastic gradient algorithms
- An approximation of the steepest descent method
- Based on the MMSE (Minimum Mean Square Error) criterion
- Adaptive process containing two input signals
- 1.) Filtering process, producing the output signal
- 2.) Desired signal (training sequence)
- Adaptive process: recursive adjustment of the filter tap weights
17 LMS Algorithm Steps
- Filter output: y(n) = w^T(n) u(n)
- Estimation error: e(n) = d(n) - y(n)
- Tap-weight adaptation: w(n+1) = w(n) + mu u(n) e(n)
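The three steps above can be sketched in plain Python; this is a minimal real-valued sketch, and the function name, signal names, and filter length below are illustrative, not from the slides:

```python
def lms_step(w, u, d, mu):
    """One LMS iteration for a real-valued FIR adaptive filter.

    w  : current tap weights [w0(n), ..., wM-1(n)]
    u  : most recent inputs [u(n), u(n-1), ..., u(n-M+1)]
    d  : desired response d(n)
    mu : step size
    """
    # 1. Filter output: y(n) = sum_k w_k(n) u(n-k)
    y = sum(wk * uk for wk, uk in zip(w, u))
    # 2. Estimation error: e(n) = d(n) - y(n)
    e = d - y
    # 3. Tap-weight adaptation: w_k(n+1) = w_k(n) + mu e(n) u(n-k)
    w = [wk + mu * e * uk for wk, uk in zip(w, u)]
    return w, y, e
```

Repeated over the incoming samples, the weights drift toward the values that minimise the mean-squared error.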
18 Stability of LMS
- The LMS algorithm is convergent in the mean square if and only if the step-size parameter satisfies 0 < mu < 2 / lambda_max
- Here lambda_max is the largest eigenvalue of the correlation matrix of the input data
- A more practical test for stability is 0 < mu < 2 / (tap-input power)
- Larger values of the step size:
- Increase the adaptation rate (faster adaptation)
- Increase the residual mean-squared error
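The practical bound can be estimated directly from the data; a small sketch (the helper name and the sample-mean estimate of the input power are assumptions, not from the slides):

```python
def max_step_size(u, num_taps):
    """Practical LMS stability bound: mu must satisfy
    0 < mu < 2 / (tap-input power),
    where the tap-input power is num_taps * E[u^2], and E[u^2]
    is estimated here by the sample mean square of the input u.
    """
    power_per_tap = sum(x * x for x in u) / len(u)
    return 2.0 / (num_taps * power_per_tap)
```

For a unit-power input and an 8-tap filter this gives an upper limit of 0.25.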
19 STEEPEST DESCENT EXAMPLE
- Given the following function, we need to obtain the vector that would give us the absolute minimum.
- It is obvious which values of C1 and C2 give us the minimum.
- (This figure is a quadratic error function (quadratic bowl).)
20 STEEPEST DESCENT EXAMPLE
- We start by assuming (C1 = 5, C2 = 7)
- We select the constant mu. If it is too big, we miss the minimum. If it is too small, it takes us a lot of time to reach the minimum. We select mu = 0.1.
- The gradient vector is evaluated at the current point.
21 STEEPEST DESCENT EXAMPLE
As we can see, the vector (c1, c2) converges to the value which yields the function minimum, and the speed of this convergence depends on mu.
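The iteration on the last three slides can be reproduced in a few lines. Since the original cost function is not shown here, a simple quadratic bowl f(c1, c2) = c1^2 + c2^2 with its minimum at the origin is assumed, together with the slides' starting point (5, 7) and mu = 0.1:

```python
def steepest_descent(c1, c2, mu, iters):
    """Minimise the assumed bowl f(c1, c2) = c1**2 + c2**2 by
    steepest descent: c <- c - mu * grad f(c)."""
    for _ in range(iters):
        g1, g2 = 2 * c1, 2 * c2      # gradient of the assumed f
        c1 -= mu * g1
        c2 -= mu * g2
    return c1, c2

c1, c2 = steepest_descent(5.0, 7.0, mu=0.1, iters=100)
# each step scales the vector by (1 - 2 * mu) = 0.8, so (c1, c2)
# converges geometrically towards the minimum at (0, 0)
```

With a larger mu the factor (1 - 2 mu) grows in magnitude and the iteration overshoots or diverges, which is the trade-off the slide describes.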
22 LMS CONVERGENCE GRAPH
Example for an unknown channel of 2nd order.
Desired: a combination of taps.
This graph illustrates the LMS algorithm. First we start from a guess of the tap weights. Then we step opposite the gradient vector to calculate the next taps, and so on, until we reach the MMSE, meaning the MSE is 0 or very close to it. (In practice we cannot get an error of exactly 0 because the noise is a random process; we can only decrease the error below a desired minimum.)
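The experiment behind the graph can be imitated in code; the second-order channel taps, the white input, and the step size below are made-up stand-ins for the unshown values:

```python
import random

def lms_identify(channel, mu, num_samples, seed=0):
    """Identify an unknown FIR channel with an LMS filter of the same length."""
    rng = random.Random(seed)
    w = [0.0] * len(channel)        # initial guess for the tap weights
    u = [0.0] * len(channel)        # delay line of recent input samples
    for _ in range(num_samples):
        u = [rng.uniform(-1.0, 1.0)] + u[:-1]
        d = sum(c * x for c, x in zip(channel, u))    # unknown channel output
        y = sum(wk * x for wk, x in zip(w, u))        # adaptive filter output
        e = d - y                                     # estimation error
        w = [wk + mu * e * x for wk, x in zip(w, u)]  # step against the gradient
    return w

w = lms_identify([0.9, -0.4, 0.2], mu=0.05, num_samples=5000)
# in this noiseless setting w converges towards the channel taps
```

With measurement noise added to d, the weights would instead hover around the true taps with a residual error set by mu, matching the slide's remark that the MSE cannot reach exactly zero.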
34 Adaptive Array Antenna
SMART ANTENNAS
35 Adaptive Array Antenna
37
- Applications are many
- Digital communications (OFDM, MIMO, CDMA, and RFID)
- Channel equalisation
- Adaptive noise cancellation
- Adaptive echo cancellation
- System identification
- Smart antenna systems
- Blind system equalisation
- And many, many others
38 Adaptive Equalization
39 Introduction
- Wireless communication is the most interesting field of communication these days because it supports mobility (mobile users). However, many applications of wireless communication now require high-speed communication (high data rates).
40
- What is ISI?
- Inter-symbol interference takes place when a given transmitted symbol is distorted by other transmitted symbols.
- Cause of ISI
- ISI is imposed by the band-limiting effect of a practical channel, and also by multi-path effects (delay spread).
43
- Definition of the equalizer
- The equalizer is a digital filter that provides an approximate inverse of the channel frequency response.
- Need for equalization
- To mitigate the effects of ISI and decrease the probability of error that occurs without suppression of ISI; this reduction of ISI effects has to be balanced against preventing noise power enhancement.
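A minimal training-based LMS equalizer along these lines; the channel, the BPSK training sequence, the equalizer length, and the half-length decision delay are illustrative assumptions, not from the slides:

```python
import random

def train_equalizer(channel, n_taps, mu, n_train, seed=1):
    """Adapt an LMS equalizer so that its cascade with the channel
    approximates a pure delay of n_taps // 2 samples."""
    rng = random.Random(seed)
    delay = n_taps // 2
    w = [0.0] * n_taps
    x_hist = [0.0] * (n_taps + delay)   # transmitted training symbols
    r_hist = [0.0] * n_taps             # received (ISI-distorted) samples
    for _ in range(n_train):
        s = rng.choice([-1.0, 1.0])     # BPSK training symbol
        x_hist = [s] + x_hist[:-1]
        r = sum(h * x for h, x in zip(channel, x_hist))  # channel output
        r_hist = [r] + r_hist[:-1]
        y = sum(wk * rk for wk, rk in zip(w, r_hist))    # equalizer output
        e = x_hist[delay] - y           # error against the delayed symbol
        w = [wk + mu * e * rk for wk, rk in zip(w, r_hist)]
    return w

w = train_equalizer([1.0, 0.4], n_taps=11, mu=0.01, n_train=20000)
```

After training, convolving the channel with w gives a response close to a unit impulse at the chosen delay, which is exactly the approximate inverse described above.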
45 Types of Equalization Techniques
- Linear equalization techniques
- Simple to implement, but greatly enhance noise power because they work by inverting the channel frequency response.
- Non-linear equalization techniques
- More complex to implement, but have much less noise enhancement than linear equalizers.
46 Equalization Techniques
Fig. 3: Classification of equalizers
47
- Linear equalizer with N taps and (N - 1) delay elements.
48 Table of various algorithms and their trade-offs

  Algorithm     Multiply-operation complexity   Convergence   Tracking
  LMS           Low                             Slow          Poor
  MMSE          Very high                       Fast          Good
  RLS           High                            Fast          Good
  Fast Kalman   Fairly low                      Fast          Good
  RLS-DFE       High                            Fast          Good
53 Adaptive noise cancellation
54 Adaptive Filter Block Diagram
55 The LMS Equation
- The Least Mean Squares (LMS) algorithm updates each coefficient on a sample-by-sample basis based on the error e(n).
- This update equation minimises the power in the error e(n).
56 The Least Mean Squares Algorithm
- The value of µ (mu) is critical.
- If µ is too small, the filter reacts slowly.
- If µ is too large, the filter resolution is poor.
- The selected value of µ is a compromise.
57 LMS Convergence vs. µ
58 Audio Noise Reduction
- A popular application of acoustic noise reduction is headsets for pilots. This uses two microphones.
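A toy version of that two-microphone arrangement; the "voice" sine, the white noise, and the two-tap room response between the microphones are synthetic assumptions. The primary microphone hears voice plus filtered noise, the reference microphone hears the noise alone, and the LMS error output is the cleaned voice:

```python
import math
import random

def noise_canceller(n_taps=8, mu=0.02, n=20000, seed=2):
    rng = random.Random(seed)
    w = [0.0] * n_taps
    ref = [0.0] * n_taps            # reference-microphone delay line
    noise_path = [0.5, 0.3]         # assumed path, reference mic -> primary mic
    cleaned = []
    for k in range(n):
        s = math.sin(2 * math.pi * 0.01 * k)   # "voice" at the primary mic
        v = rng.uniform(-1.0, 1.0)             # noise at the reference mic
        ref = [v] + ref[:-1]
        primary = s + sum(h * x for h, x in zip(noise_path, ref))
        y = sum(wk * xk for wk, xk in zip(w, ref))   # noise estimate
        e = primary - y                        # error = cleaned signal
        w = [wk + mu * e * xk for wk, xk in zip(w, ref)]
        cleaned.append(e)
    return cleaned

cleaned = noise_canceller()
```

Because the voice is uncorrelated with the reference noise, the filter converges towards the noise path and the error that remains is mostly the voice.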
59 The Simulink Model
60 Setting the Step Size (mu)
- The rate of convergence of the LMS algorithm is controlled by the step size (mu).
- This is the critical variable.
61 Trace of Input to Model
62 Trace of LMS Filter Output
- The output starts at zero and grows.
63 Trace of LMS Filter Error
- The error contains the noise.
64 Typical C6713 DSK Setup
- USB to PC
- 5 V supply
- Headphones
- Microphone
65 Adaptive Echo Cancellation
70 Acoustic Echo Canceller
80 New Trends in Adaptive Filtering
- Partial updating of weights
- Sub-band adaptive filtering
- Adaptive Kalman filtering
- Affine projection method
- Space-time adaptive processing
- Non-linear adaptive filtering:
- Neural networks
- The Volterra series algorithm
- Genetic fuzzy systems
- Blind adaptive filtering
81 Thank You