RLS Presentation
By
1. Gidey Leul
Recursive Least Square @EITM 2/4/2020
CONTENT
• Adaptive Filters
• Stability
• IIR Adaptive Filters (Recursive Filters)
• FIR Adaptive Filters
RECURSIVE LEAST SQUARE
• Objective: maintain a solution which is optimal at each iteration. Differentiation of the error leads to the normal equations (a numeric sketch follows this list). This approach requires convergence and knowledge of the auto- and cross-correlations.
• Minimizing the mean squared error is independent of the particular realization of $x(n)$: it produces the same set of filter coefficients for all sequences, because $w_n$ depends on the ensemble average.
• Least squares, in contrast, works with the measured data: different realizations of $x(n)$ and $d(n)$ will lead to different solutions, i.e. for different signals, a different filter.
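To make the normal equations concrete, here is a minimal sketch, assuming Python with NumPy (the signal setup, filter order, and variable names are illustrative, not from the slides): it forms the deterministic autocorrelation matrix and cross-correlation vector from one realization and solves $R_x w = r_{dx}$ directly.

import numpy as np

rng = np.random.default_rng(0)
p = 3                              # filter order (illustrative)
N = 1000                           # number of samples
x = rng.standard_normal(N)         # one realization of the input x(n)
w_true = np.array([0.5, -0.3, 0.2, 0.1])
d = np.convolve(x, w_true)[:N] + 0.01 * rng.standard_normal(N)  # desired signal d(n)

# Data matrix whose row i is x(i) = [x(i), x(i-1), ..., x(i-p)]^T
X = np.column_stack([np.concatenate([np.zeros(k), x[:N - k]]) for k in range(p + 1)])

R = X.T @ X                        # deterministic autocorrelation matrix R_x
r = X.T @ d                        # deterministic cross-correlation vector r_dx
w = np.linalg.solve(R, r)          # normal equations: R_x w = r_dx
print(w)                           # close to w_true for this realization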
MINIMIZING WEIGHTED LEAST SQUARE ERROR
• $\varepsilon(n) = \sum_{i=0}^{n} \lambda^{n-i}\, |e(i)|^2$, with forgetting factor $0 < \lambda \le 1$.
CONT'D…
• RLS, for $k = 0, 1, 2, \ldots, p$:
  $\mathbf{x}(i) = [x(i), x(i-1), \ldots, x(i-p)]^T$
  $\mathbf{R}_x(n)\,\mathbf{w}_n = \mathbf{r}_{dx}(n)$
• $\mathbf{r}_{dx}(n)$ is the deterministic cross-correlation between $d(i)$ and $x^*(i)$:
  $\mathbf{r}_{dx}(n) = \sum_{i=0}^{n} \lambda^{n-i}\, d(i)\, \mathbf{x}^*(i)$
• Let us evaluate the minimum squared error:
  $\varepsilon(n) = \sum_{i=0}^{n} \lambda^{n-i}\, e(i)\, e^*(i) = \sum_{i=0}^{n} \lambda^{n-i}\, e(i) \Big\{ d(i) - \sum_{l=0}^{p} w_n(l)\, x(i-l) \Big\}^*$
• With $\mathbf{P}(n) = \mathbf{R}_x(n)^{-1}$:
  $\mathbf{w}_n = \mathbf{P}(n)\,\mathbf{r}_{dx}(n)$
  $\mathbf{w}_n = \mathbf{P}(n)\,\lambda\,\mathbf{r}_{dx}(n-1) + \mathbf{P}(n)\,\mathbf{x}^*(n)\, d(n)$
RLS algorithm: recursively update the weight vector $\mathbf{w}_n$ and the inverse autocorrelation matrix $\mathbf{P}(n)$.
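As a minimal check of this recursion, here is a sketch assuming Python/NumPy and real-valued signals (so $x^*(n) = x(n)$; data and names are illustrative): it accumulates $\mathbf{R}_x(n)$ and $\mathbf{r}_{dx}(n) = \lambda\,\mathbf{r}_{dx}(n-1) + d(n)\,\mathbf{x}(n)$ and confirms that $\mathbf{w}_n = \mathbf{P}(n)\,\mathbf{r}_{dx}(n)$ matches the direct exponentially weighted least-squares solution.

import numpy as np

rng = np.random.default_rng(1)
p, N, lam = 2, 200, 0.99
x = rng.standard_normal(N)
d = rng.standard_normal(N)

# Row n is x(n) = [x(n), x(n-1), ..., x(n-p)]^T
X = np.column_stack([np.concatenate([np.zeros(k), x[:N - k]]) for k in range(p + 1)])

Rx = np.zeros((p + 1, p + 1))      # exponentially weighted R_x(n)
rdx = np.zeros(p + 1)              # r_dx(n) = sum_i lam^(n-i) d(i) x(i)
for n in range(N):
    xn = X[n]
    Rx = lam * Rx + np.outer(xn, xn)
    rdx = lam * rdx + d[n] * xn    # r_dx(n) = lam * r_dx(n-1) + d(n) x(n)

P = np.linalg.inv(Rx)              # P(n) = R_x(n)^{-1}
w = P @ rdx                        # w_n = P(n) r_dx(n)

# Direct weighted least-squares solution for comparison
wts = lam ** (N - 1 - np.arange(N))
w_direct = np.linalg.solve(X.T @ (wts[:, None] * X), X.T @ (wts * d))
print(np.allclose(w, w_direct))    # True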
SLIDING WINDOW RLS
• Minimizes the exponentially weighted least squares error
  $\varepsilon(n) = \sum_{i=0}^{n} \lambda^{n-i}\, |e(i)|^2$
  with $\lambda = 1$.
• Growing-window recursions:
  $\mathbf{R}_n = \mathbf{R}_{n-1} + \mathbf{x}_n \mathbf{x}_n^T$
  $\mathbf{g}_n = \mathbf{g}_{n-1} + d(n)\,\mathbf{x}_n$
• We seek solutions of the form:
  $\mathbf{f}_n = \mathbf{R}_n^{-1}\,\mathbf{g}_n$
  $\mathbf{f}_{n+1} = \mathbf{R}_{n+1}^{-1}\,\mathbf{g}_{n+1}$
• We can apply the matrix inversion lemma for computation of the inverse. With $\mathbf{z} = \mathbf{R}_n^{-1}\,\mathbf{x}_{n+1}$:
  $\mathbf{f}_{n+1} = \mathbf{f}_n + \dfrac{d(n+1) - \mathbf{f}_n^T\,\mathbf{x}_{n+1}}{1 + \mathbf{x}_{n+1}^T\,\mathbf{z}}\;\mathbf{z}$
• Define the a priori error as:
  $e(n+1/n) = d(n+1) - \mathbf{f}_n^T\,\mathbf{x}_{n+1}$
  reflecting that this is the error obtained using the old filter and the new data.
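This update can be verified numerically; below is a sketch under assumed random data (Python/NumPy, names illustrative) showing that the matrix-inversion-lemma form, driven by the a priori error, reproduces the directly recomputed solution $\mathbf{f}_{n+1} = \mathbf{R}_{n+1}^{-1}\,\mathbf{g}_{n+1}$.

import numpy as np

rng = np.random.default_rng(2)
L = 4
# Old statistics R_n, g_n and old solution f_n = R_n^{-1} g_n
A = rng.standard_normal((20, L))
Rn = A.T @ A                        # a well-conditioned R_n
gn = rng.standard_normal(L)
fn = np.linalg.solve(Rn, gn)

# New sample (x_{n+1}, d(n+1))
x1 = rng.standard_normal(L)
d1 = rng.standard_normal()

# Matrix-inversion-lemma update driven by the a priori error
z = np.linalg.solve(Rn, x1)         # z = R_n^{-1} x_{n+1}
e_apriori = d1 - fn @ x1            # e(n+1/n) = d(n+1) - f_n^T x_{n+1}
f1 = fn + e_apriori / (1.0 + x1 @ z) * z

# Direct recomputation: R_{n+1} = R_n + x x^T, g_{n+1} = g_n + d(n+1) x
f1_direct = np.linalg.solve(Rn + np.outer(x1, x1), gn + d1 * x1)
print(np.allclose(f1, f1_direct))   # True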
CONT'D…
• Using this definition, we can rewrite the RLS algorithm update equation as:
  $\mathbf{f}_{n+1} = \mathbf{f}_n + \dfrac{e(n+1/n)}{1 + \mathbf{x}_{n+1}^T\,\mathbf{R}_n^{-1}\,\mathbf{x}_{n+1}}\;\mathbf{R}_n^{-1}\,\mathbf{x}_{n+1}$
  where $e(n+1/n) = d(n+1) - \mathbf{f}_n^T\,\mathbf{x}_{n+1}$.
• Define the gain $\alpha(n) = \dfrac{1}{1 + \mathbf{x}_{n+1}^T\,\mathbf{R}_n^{-1}\,\mathbf{x}_{n+1}}$; then
  $\mathbf{f}_{n+1} = \mathbf{f}_n + \alpha(n)\, e(n+1/n)\,\mathbf{R}_n^{-1}\,\mathbf{x}_{n+1}$
  $\mathbf{R}_{n+1}^{-1} = \mathbf{R}_n^{-1} - \alpha(n)\,\mathbf{R}_n^{-1}\,\mathbf{x}_{n+1}\,\mathbf{x}_{n+1}^T\,\mathbf{R}_n^{-1}$
The RLS algorithm can be expected to converge more quickly because of its use of an aggressive, adaptive step size.
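The rank-one update of the inverse autocorrelation is what keeps each step $O(L^2)$; here is a small sketch (Python/NumPy, illustrative data) checking the $\alpha(n)$ form against direct inversion.

import numpy as np

rng = np.random.default_rng(3)
L = 4
A = rng.standard_normal((20, L))
Rn = A.T @ A                        # R_n (symmetric positive definite)
Rn_inv = np.linalg.inv(Rn)          # R_n^{-1}
x1 = rng.standard_normal(L)         # x_{n+1}

# Gain: alpha(n) = 1 / (1 + x^T R_n^{-1} x)
alpha = 1.0 / (1.0 + x1 @ Rn_inv @ x1)

# Rank-one update: R_{n+1}^{-1} = R_n^{-1} - alpha(n) R_n^{-1} x x^T R_n^{-1}
v = Rn_inv @ x1
R1_inv = Rn_inv - alpha * np.outer(v, v)

# Compare against inverting R_{n+1} = R_n + x x^T directly
print(np.allclose(R1_inv, np.linalg.inv(Rn + np.outer(x1, x1))))  # True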
• RLS VS NEWTON METHOD
  $\mathbf{R}_{n+1}^{-1} = \left(\mathbf{R}_n + \mathbf{x}_{n+1}\mathbf{x}_{n+1}^T\right)^{-1} = \mathbf{R}_n^{-1} - \dfrac{\mathbf{R}_n^{-1}\,\mathbf{x}_{n+1}\,\mathbf{x}_{n+1}^T\,\mathbf{R}_n^{-1}}{1 + \mathbf{x}_{n+1}^T\,\mathbf{R}_n^{-1}\,\mathbf{x}_{n+1}}$
  $e(n) = d(n) - \mathbf{f}_n^T\,\mathbf{x}_n$
  $\mathbf{f}_{n+1} = \mathbf{f}_n + \mathbf{R}_n^{-1}\, e(n)\,\mathbf{x}_n$
1) Initialize $\mathbf{f}_1$, $\mathbf{R}_1^{-1}$.
2) Iterate for n = 1, 2, 3, …:
   $e(n/n-1) = d(n) - \mathbf{f}_{n-1}^T\,\mathbf{x}_n$
   $\alpha(n) = \dfrac{1}{1 + \mathbf{x}_n^T\,\mathbf{R}_{n-1}^{-1}\,\mathbf{x}_n}$
   $\mathbf{f}_n = \mathbf{f}_{n-1} + \alpha(n)\, e(n/n-1)\,\mathbf{R}_{n-1}^{-1}\,\mathbf{x}_n$
   $\mathbf{R}_n^{-1} = \mathbf{R}_{n-1}^{-1} - \alpha(n)\,\mathbf{R}_{n-1}^{-1}\,\mathbf{x}_n\,\mathbf{x}_n^T\,\mathbf{R}_{n-1}^{-1}$
   (a complete implementation sketch follows below)
• RLS is computationally more complex than simple LMS because it is O(L²).
• In principle, convergence is independent of the eigenvalue structure of the signal due to the premultiplication by the inverse of the autocorrelation matrix.
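Putting steps 1) and 2) together, here is a compact implementation sketch of the growing-window RLS loop (Python/NumPy; the δ·I initialization of $\mathbf{R}_1^{-1}$ is a common heuristic, and the system-identification usage below is only an illustration, not part of the slides).

import numpy as np

def rls(x, d, L, delta=100.0):
    """Growing-window RLS following the steps above.
    x, d : input and desired signals; L : filter length;
    delta : R_1^{-1} = delta * I initialization (common heuristic)."""
    N = len(x)
    f = np.zeros(L)                # f_1
    R_inv = delta * np.eye(L)      # R_1^{-1}
    errors = np.zeros(N)
    for n in range(N):
        xn = x[max(0, n - L + 1):n + 1][::-1]       # [x(n), ..., x(n-L+1)]
        xn = np.pad(xn, (0, L - len(xn)))           # zero-pad at startup
        e = d[n] - f @ xn                           # a priori error e(n/n-1)
        Rx = R_inv @ xn
        alpha = 1.0 / (1.0 + xn @ Rx)               # alpha(n)
        f = f + alpha * e * Rx                      # weight update
        R_inv = R_inv - alpha * np.outer(Rx, Rx)    # inverse autocorrelation update
        errors[n] = e
    return f, errors

# Usage: identify an unknown FIR system from its input/output
rng = np.random.default_rng(4)
x = rng.standard_normal(2000)
h = np.array([1.0, -0.5, 0.25])
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
f, errors = rls(x, d, L=3)
print(f)                           # close to h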
APPLICATION OF RLS