Information Theoretic Learning

Renyi's Entropy and Kernel Perspectives

  • Jose C. Principe

Part of the Information Science and Statistics book series (ISS)

Table of contents

  1. Front Matter
    Pages i-xxii
  2. Dongxin Xu, Deniz Erdogmus
    Pages 47-102
  3. Deniz Erdogmus, Seungju Han, Abhishek Singh
    Pages 141-179
  4. Deniz Erdogmus, Rodney Morejon, Weifeng Liu
    Pages 181-218
  5. Deniz Erdogmus, Dongxin Xu, Kenneth Hild II
    Pages 219-261
  6. Robert Jenssen, Sudhir Rao
    Pages 263-298
  7. Sudhir Rao, Deniz Erdogmus, Dongxin Xu, Kenneth Hild II
    Pages 299-349
  8. Jianwu Xu, Robert Jenssen, Antonio Paiva, Il Park
    Pages 351-384
  9. Weifeng Liu, Puskal Pokharel, Jianwu Xu, Sohan Seth
    Pages 385-413
  10. Puskal Pokharel, Ignacio Santamaria, Jianwu Xu, Kyu-hwa Jeong, Weifeng Liu
    Pages 415-455
  11. Back Matter
    Pages 457-515

About this book

Introduction

This book presents the first cohesive treatment of Information Theoretic Learning (ITL) algorithms for adapting linear or nonlinear learning machines in both supervised and unsupervised paradigms. ITL is a framework in which the conventional concepts of second-order statistics (covariance, L2 distances, correlation functions) are replaced by scalars and functions with information-theoretic underpinnings: entropy, mutual information, and correntropy, respectively.

ITL quantifies the stochastic structure of the data beyond second-order statistics for improved performance, without resorting to full-blown Bayesian approaches that incur a much higher computational cost. This is possible because of a non-parametric estimator of Renyi's quadratic entropy that is a function only of pairwise differences between samples. The book compares the performance of ITL algorithms with their second-order counterparts in many engineering and machine learning applications.
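The pairwise-difference estimator mentioned above can be sketched in a few lines. The sketch below assumes Gaussian Parzen windows for the density estimate of Renyi's quadratic entropy, H2(X) = -log ∫ p(x)² dx; the function and variable names are illustrative, not taken from the book:

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy for 1-D samples.

    With Gaussian kernels, the plug-in estimate of the "information
    potential" integral of p(x)^2 reduces to a double sum over pairwise
    sample differences, with an effective kernel variance of 2*sigma^2
    (the convolution of two Gaussians).
    """
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    n = x.shape[0]
    diffs = x - x.T                      # all pairwise differences x_i - x_j
    s2 = 2.0 * sigma ** 2                # variance after kernel convolution
    kernel = np.exp(-diffs ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    information_potential = kernel.sum() / n ** 2
    return -np.log(information_potential)
```

As a sanity check, a more spread-out sample should yield a larger entropy estimate than a tightly concentrated one, since its information potential (the average pairwise kernel value) is smaller.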

Students, practitioners and researchers interested in statistical signal processing, computational intelligence, and machine learning will find in this book the theory to understand the basics, the algorithms to implement applications, and exciting but still unexplored leads that will provide fertile ground for future research.

José C. Principe is Distinguished Professor of Electrical and Biomedical Engineering and BellSouth Professor at the University of Florida, and the Founder and Director of the Computational NeuroEngineering Laboratory. He is an IEEE and AIMBE Fellow, Past President of the International Neural Network Society, Past Editor-in-Chief of the IEEE Transactions on Biomedical Engineering, and the founding Editor-in-Chief of IEEE Reviews in Biomedical Engineering. He has written an interactive electronic book on neural networks, a book on brain-machine interface engineering, and more recently a book on kernel adaptive filtering, and was awarded the 2011 IEEE Neural Network Pioneer Award.

Keywords

Correntropy, information theoretic learning, non-Gaussian signal processing, RKHS and information theory, robust adaptive filtering, computational intelligence, kernel learning, machine learning

Authors and affiliations

  • Jose C. Principe
    Dept. of Electrical Engineering, University of Florida, Gainesville, U.S.A.

Bibliographic information

  • DOI https://doi.org/10.1007/978-1-4419-1570-2
  • Copyright Information Springer-Verlag New York 2010
  • Publisher Name Springer, New York, NY
  • eBook Packages Computer Science
  • Print ISBN 978-1-4419-1569-6
  • Online ISBN 978-1-4419-1570-2
  • Series Print ISSN 1613-9011
  • Series Online ISSN 2197-4128