LMS ALGORITHM FOR NON-STATIONARY INPUTS FOR THE PIPELINED IMPLEMENTATION OF ADAPTIVE ANTENNAS – PowerPoint PPT Presentation


Transcript and Presenter's Notes



1
  • LMS ALGORITHM FOR NON-STATIONARY INPUTS FOR THE PIPELINED IMPLEMENTATION OF ADAPTIVE ANTENNAS
  •  
  • Prof. Yu Hen Hu, Arjun Arunachalam
  • Department of Electrical and Computer
    Engineering
  • University Of Wisconsin-Madison
  • E-mail: arunacha_at_cae.wisc.edu
  • ECE-734 Spring 2002

2
ABSTRACT
  •  
  • In this project, a modified LMS algorithm is proposed that is capable of handling non-stationary inputs. The algorithm is then adapted for the pipelined implementation of adaptive antennas, which offers higher throughput. A number of recently proposed approximation techniques, when applied to the standard LMS algorithm, can result in instability because they change the functionality of the algorithm, so careful selection of the step size and the depth of pipelining is needed. This project also studies the issues involved with the various approximation techniques used to reduce the overhead incurred when the look-ahead technique is applied to the standard LMS algorithm.

3
Structure of an Adaptive Filter
  • [Block diagram: the tap input U(n) feeds a transversal filter with weights W(n), producing the output Y(n); Y(n) is subtracted from the desired response D(n) at a summing junction, and the resulting error E(n) drives the adaptive weight control mechanism that updates W(n). A minimal code sketch of this structure follows below.]
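As an illustration of the structure above, here is a minimal NumPy sketch of a standard (non-pipelined) LMS adaptive filter: the transversal filter computes Y(n), the error E(n) = D(n) − Y(n) is formed against the desired response, and the weight control mechanism updates W(n). The function name `lms_filter` and the step-size argument `mu` are illustrative choices, not names taken from the report.

```python
import numpy as np

def lms_filter(u, d, num_taps, mu):
    """Standard LMS: transversal filter plus adaptive weight control (sketch)."""
    w = np.zeros(num_taps)               # filter weights W(n)
    y = np.zeros(len(u))                 # filter output Y(n)
    e = np.zeros(len(u))                 # error E(n)
    for n in range(num_taps, len(u)):
        u_n = u[n - num_taps:n][::-1]    # tap-input vector U(n), newest sample first
        y[n] = w @ u_n                   # transversal filter output
        e[n] = d[n] - y[n]               # compare with desired response D(n)
        w = w + mu * e[n] * u_n          # adaptive weight control (LMS update)
    return y, e, w
```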
4
Equations governing the Behavior of a pipelined
Adaptive Filter
  • Pipelined LMS computation
  • For n = 0, 1, 2, 3, ...
  • E(n) = D(n) − W^H(n−M) U(n)
  • W(n+1) = W(n−M) + μ Σ E(n−i) U(n−i), i = 1, 2, ..., M. This is a look-ahead only if the error and input are wide-sense stationary; otherwise it cannot be treated as a look-ahead.
  • Note: this approximation is applicable to stationary inputs only (a code sketch of the update follows below).
  • A different LMS has been proposed for non-stationary inputs in the report.
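Below is a minimal sketch of this pipelined update, assuming real-valued signals (so the Hermitian transpose W^H reduces to an ordinary dot product). The weights used to form the error are delayed by M samples, and the correction sums the last M error/input products. The names `pipelined_lms`, `mu` and `M` are illustrative, not taken from the report.

```python
import numpy as np

def pipelined_lms(u, d, num_taps, mu, M):
    """Pipelined LMS with weight delay M (real-valued sketch):
       E(n)   = D(n) - W(n-M)^T U(n)
       W(n+1) = W(n-M) + mu * sum_{i=1..M} E(n-i) U(n-i)
    """
    N = len(u)
    # precompute the tap-input vectors U(n)
    u_vecs = [np.zeros(num_taps) for _ in range(N)]
    for n in range(num_taps, N):
        u_vecs[n] = u[n - num_taps:n][::-1]

    e = np.zeros(N)
    w_hist = [np.zeros(num_taps)] * (M + 1)        # holds W(n-M) ... W(n)
    for n in range(num_taps, N):
        w_delayed = w_hist[0]                      # delayed weights W(n-M)
        e[n] = d[n] - w_delayed @ u_vecs[n]        # error formed with delayed weights
        # look-ahead correction: sum of the last M error/input products
        corr = sum(e[n - i] * u_vecs[n - i] for i in range(1, M + 1) if n - i >= 0)
        w_hist = w_hist[1:] + [w_delayed + mu * corr]   # append W(n+1)
    return e, w_hist[-1]
```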

5
Approximation Technique for Stationary LMS
  • Replace the summation
  • μ Σ E(n−i) U(n−i), i = 1, 2, ..., M
  • with the expression
  • μ M E(n) U(n).
  • This stabilizes the pipelined implementation of an adaptive filter. Approximation techniques such as the RLA technique change the functionality of the LMS algorithm, which may cause extreme instability and necessitates proper selection of the step size, pipeline depth, etc. (The single-term substitution is sketched in code below.)
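A sketch of the same pipelined loop with the single-term substitution, again assuming real-valued signals; compared with the previous sketch, the only change is that the M-term sum is replaced by μ·M·E(n)·U(n). Names remain illustrative.

```python
import numpy as np

def pipelined_lms_approx(u, d, num_taps, mu, M):
    """Pipelined LMS where the M-term look-ahead sum is replaced by the
    single product mu * M * E(n) * U(n) (real-valued sketch)."""
    N = len(u)
    e = np.zeros(N)
    w_hist = [np.zeros(num_taps)] * (M + 1)       # holds W(n-M) ... W(n)
    for n in range(num_taps, N):
        u_n = u[n - num_taps:n][::-1]             # tap-input vector U(n)
        w_delayed = w_hist[0]                     # delayed weights W(n-M)
        e[n] = d[n] - w_delayed @ u_n
        w_new = w_delayed + mu * M * e[n] * u_n   # single-term approximation
        w_hist = w_hist[1:] + [w_new]             # append W(n+1)
    return e, w_hist[-1]
```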

6
  • The aims of the simulations above were:
  • 1. To show that the approximations introduced are at least as effective as RLA in pipelining an adaptive LMS filter.
  • 2. To show that, without the additional hardware overhead, we can still obtain the same accuracy as an RLA approximation and also support a greater range of pipeline depths.
  • 3. To avoid drastic changes in the functionality of the algorithm while still obtaining significant pipelining depths.
  •  
  • Results
  • 1. I achieved the same accuracy as an RLA approximation, and in some cases the accuracy is better.
  • 2. The depth of pipelining can be greater for this approximation.
  • 3. For a pipeline depth of 20, instability seems to be creeping in, but the algorithm is still very accurate.
  • 4. No additional hardware overhead is involved, and the change in functionality is not as drastic as in RLA.
  •  
  •  

7
Non-Stationary LMS: Changes Introduced to the Original Algorithm
  • 1. Forgetting factor.
  • 2. Epoch length for staggered update of the weights (one possible reading of these two modifications is sketched in code below).
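The slide only names these two modifications, so the following is a hypothetical sketch of one plausible reading, not the report's exact formulation: older error/input products are discounted by a forgetting factor `lam`, and the accumulated correction is applied once per epoch of `epoch_len` samples as a staggered weight update. All names (`nonstationary_lms`, `lam`, `epoch_len`) are illustrative.

```python
import numpy as np

def nonstationary_lms(u, d, num_taps, mu, lam, epoch_len):
    """Hypothetical non-stationary LMS variant: forgetting factor plus
    staggered (per-epoch) weight updates. Illustrative sketch only."""
    N = len(u)
    w = np.zeros(num_taps)
    e = np.zeros(N)
    acc = np.zeros(num_taps)                  # accumulated correction term
    for n in range(num_taps, N):
        u_n = u[n - num_taps:n][::-1]         # tap-input vector U(n)
        e[n] = d[n] - w @ u_n
        acc = lam * acc + e[n] * u_n          # forgetting factor discounts older terms
        if (n + 1) % epoch_len == 0:          # staggered update once per epoch
            w = w + mu * acc
            acc = np.zeros(num_taps)
    return e, w
```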

8
  • Results and Analysis
  • The aims of the non-stationary LMS simulations were:
  • 1. To pipeline a non-stationary-input LMS algorithm and then test its stability.
  • 2. To change the epoch length and test for various pipeline depths.
  • Results
  • The non-stationary algorithm looks much more stable at greater pipeline depths than the stationary algorithm. However, at greater pipeline depths with a fixed epoch length, the algorithm takes a long time to converge, so we can vary the epoch length or the pipeline depth to ensure faster convergence. A balance between throughput and convergence time should therefore be struck, depending on the type of application for which the filter is used.
  • When we reduce the epoch length, the incoming data is broken into smaller blocks and the algorithm takes longer to converge. This can be seen by comparing the plots of the learning curve for a pipelining depth of 5 with epoch lengths of 10 and 7 respectively. We therefore need to reduce the pipeline depth in order to reduce the convergence time; for a pipeline depth of 3, as shown above, the convergence time is much lower. The conclusion is that, depending on the application, we need to strike a balance between the depth of pipelining and the epoch length.

9
Conclusion and Discussion
  • These courses provided me with the knowledge base
    to pursue this work.
  • I believe a lot more work can be done with adaptive filters to ensure faster computation for a variety of different applications. For example, changing epoch lengths for different applications can ensure faster computation at a much higher depth of pipelining. I believe that further work on adaptive filters needs to be more application-specific.
  •  This project is an implementation-type project undertaken by one person.
  • ARJUN ARUNACHALAM
  •  