Kernel adaptive filter

In signal processing, a kernel adaptive filter is a type of nonlinear adaptive filter.[1] An adaptive filter is a filter that adapts its transfer function to changes in signal properties over time by minimizing an error or loss function that characterizes how far the filter deviates from ideal behavior. The adaptation process is based on learning from a sequence of signal samples and is thus an online algorithm. A nonlinear adaptive filter is one in which the transfer function is nonlinear.
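
The general idea of online adaptation can be illustrated with a minimal sketch of a linear least-mean-squares (LMS) filter in Python; the filter length, the step size mu and the signal names below are illustrative choices for the example rather than part of any cited algorithm:

    import numpy as np

    def lms_filter(x, d, num_taps=4, mu=0.05):
        # Adapt a linear FIR filter online so that its output tracks the desired
        # signal d, using a stochastic-gradient step on the squared error.
        w = np.zeros(num_taps)                       # filter weights (the adapted transfer function)
        y = np.zeros(len(x))                         # filter output
        for n in range(num_taps, len(x)):
            u = np.asarray(x[n - num_taps:n])[::-1]  # most recent input samples, newest first
            y[n] = w @ u                             # current filter output
            e = d[n] - y[n]                          # instantaneous error
            w = w + mu * e * u                       # LMS weight update
        return y, w

Each incoming sample updates the weights immediately, which is what makes the procedure an online algorithm; a kernel adaptive filter replaces the linear output w @ u with a nonlinear kernel expansion.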

Kernel adaptive filters implement a nonlinear transfer function using kernel methods.[1] In these methods, the signal is mapped into a high-dimensional linear feature space and the nonlinear function is approximated as a weighted sum of kernels, each of which evaluates an inner product in that feature space. If this is done in a reproducing kernel Hilbert space, a kernel method can be a universal approximator for a nonlinear function. Kernel methods have the advantage of having convex loss functions, with no local minima, and of being only moderately complex to implement.
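
As an illustration of this expansion, the following Python sketch evaluates a function of the form f(x) = Σ_i α_i κ(x_i, x) with a Gaussian kernel; the kernel choice, the bandwidth sigma and all variable names are assumptions made for the example, and in practice the coefficients α_i would be produced by the adaptation algorithm:

    import numpy as np

    def gaussian_kernel(a, b, sigma=1.0):
        # Gaussian (RBF) kernel; its value equals an inner product between the
        # feature-space images of a and b in the associated RKHS.
        a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
        return np.exp(-np.sum((a - b) ** 2) / (2.0 * sigma ** 2))

    def kernel_expansion(x, centres, alphas, sigma=1.0):
        # Evaluate f(x) = sum_i alpha_i * kappa(x_i, x), the functional form
        # shared by kernel adaptive filters.
        return sum(a * gaussian_kernel(c, x, sigma) for c, a in zip(centres, alphas))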

Because the high-dimensional feature space is linear, kernel adaptive filters can be thought of as a generalization of linear adaptive filters. As with linear adaptive filters, there are two general approaches to adapting a filter: the least mean squares filter (LMS)[2] and the recursive least squares filter (RLS).[3] A third approach employs a projection-based rationale, namely the kernel adaptive projected subgradient method (APSM), which can accommodate more general (possibly non-differentiable) loss functions.[4][5] Within these three general approaches a number of variants have been developed, including the Naive Online Regularized Risk Minimization Algorithm (NORMA), Quantized KLMS (QKLMS),[6] Approximate Linear Dependency KRLS (ALD-KRLS), Sliding-Window KRLS (SW-KRLS), Fixed-Budget KRLS (FB-KRLS), the KRLS Tracker (KRLS-T) algorithm,[7] and the Quantized APSM.[4] There are also variants that can handle complex-valued data, such as the Complex Kernel LMS,[8] the widely linear (or augmented) Complex Kernel LMS[9] and the Complex Kernel APSM.[10]
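
For example, the kernel least-mean-square (KLMS) algorithm[2] can be sketched in Python as follows; this is a simplified illustration that assumes a Gaussian kernel, a fixed step size and no sparsification, so the set of stored kernel centres grows with every processed sample:

    import numpy as np

    def klms(X, d, mu=0.5, sigma=1.0):
        # Kernel LMS: predict with the current kernel expansion, then store the
        # new sample as a centre whose coefficient is the step size times the
        # instantaneous error.
        centres, alphas, errors = [], [], []
        for x, target in zip(X, d):
            x = np.asarray(x, dtype=float)
            y = sum(a * np.exp(-np.sum((c - x) ** 2) / (2.0 * sigma ** 2))
                    for c, a in zip(centres, alphas))
            e = target - y              # prediction error on the new sample
            centres.append(x)           # the sample becomes a new kernel centre
            alphas.append(mu * e)       # its expansion coefficient
            errors.append(e)
        return centres, alphas, errors

Variants such as QKLMS, ALD-KRLS and the sliding-window or fixed-budget schemes listed above differ mainly in how they limit this growing set of centres or in the criterion used to update the coefficients.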

Source code (in MATLAB) for some of the aforementioned algorithms, together with experiments on synthetic data, is available online.

References

  1. Liu, Weifeng; Principe, José C.; Haykin, Simon (March 2010). Kernel Adaptive Filtering: A Comprehensive Introduction (PDF). Wiley. pp. 12–20. ISBN 978-0-470-44753-6.
  2. Liu, Weifeng; Pokharel, P.P.; Principe, J.C. (2008-02-01). "The Kernel Least-Mean-Square Algorithm". IEEE Transactions on Signal Processing 56 (2): 543–554. doi:10.1109/TSP.2007.907881. ISSN 1053-587X.
  3. Engel, Y.; Mannor, S.; Meir, R. (2004-08-01). "The kernel recursive least-squares algorithm". IEEE Transactions on Signal Processing 52 (8): 2275–2285. doi:10.1109/TSP.2004.830985. ISSN 1053-587X.
  4. Slavakis, Konstantinos; Bouboulis, Pantelis; Theodoridis, Sergios. "Online Learning in Reproducing Kernel Hilbert Spaces". Chapter 17 in Academic Press Library in Signal Processing. ISBN 978-0-12-397226-2.
  5. Slavakis, K.; Bouboulis, P.; Theodoridis, S. (2012-02-01). "Adaptive Multiregression in Reproducing Kernel Hilbert Spaces: The Multiaccess MIMO Channel Case". IEEE Transactions on Neural Networks and Learning Systems 23 (2): 260–276. doi:10.1109/TNNLS.2011.2178321. ISSN 2162-237X.
  6. Chen, Badong; Zhao, Songlin; Zhu, Pingping; Principe, J.C. (2012-01-01). "Quantized Kernel Least Mean Square Algorithm". IEEE Transactions on Neural Networks and Learning Systems 23 (1): 22–32. doi:10.1109/TNNLS.2011.2178446. ISSN 2162-237X.
  7. Van Vaerenbergh, Steven; Santamaría, Ignacio. "A Comparative Study of Kernel Adaptive Filtering Algorithms" (PDF). University of Cantabria. Retrieved 20 March 2014.
  8. Bouboulis, P.; Theodoridis, S. (2011-03-01). "Extension of Wirtinger's Calculus to Reproducing Kernel Hilbert Spaces and the Complex Kernel LMS" (PDF). IEEE Transactions on Signal Processing 59 (3): 964–978. doi:10.1109/TSP.2010.2096420. ISSN 1053-587X.
  9. Bouboulis, P.; Theodoridis, S.; Mavroforakis, M. (2012-09-01). "The Augmented Complex Kernel LMS". IEEE Transactions on Signal Processing 60 (9): 4962–4967. doi:10.1109/TSP.2012.2200479. ISSN 1053-587X.
  10. Bouboulis, P.; Slavakis, K.; Theodoridis, S. (2012-03-01). "Adaptive Learning in Complex Reproducing Kernel Hilbert Spaces Employing Wirtinger's Subgradients". IEEE Transactions on Neural Networks and Learning Systems 23 (3): 425–438. doi:10.1109/TNNLS.2011.2179810. ISSN 2162-237X.