Information-Theoretic Linear Feature Extraction Based on Kernel Density Estimators: A Review

publication date

  • November 2012

start page

  • 1180

end page

  • 1189

issue

  • 6

volume

  • 42

International Standard Serial Number (ISSN)

  • 1094-6977

Electronic International Standard Serial Number (EISSN)

  • 1558-2442

abstract

  • In this paper, we provide a unified study of the application of kernel density estimators to supervised linear feature extraction by means of criteria inspired by information and detection theory. We enrich this study by incorporating two novel criteria, namely the mutual information and the likelihood ratio test, and perform both a theoretical and an experimental comparison between the new methods and others previously described in the literature. The impact of the bandwidth selection of the density estimator on the classification performance is discussed. Some theoretical results that bound classification performance as a function of mutual information are also compiled. A set of experiments on different real-world datasets allows us to perform an empirical comparison of the methods, in terms of both accuracy and computational complexity. We show the suitability of these methods to determine the dimension of the subspace that contains the discriminative information.
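
  • To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of one of the criteria the abstract mentions: scoring a candidate linear projection w by a plug-in mutual information estimate I(w^T x; y) built from Gaussian kernel density estimators. The function name mi_score and the bandwidth parameter are illustrative assumptions; the bandwidth argument is exposed because the abstract notes that bandwidth selection affects classification performance.

    # Illustrative sketch only: KDE-based mutual information criterion for
    # scoring a 1-D linear projection in supervised feature extraction.
    import numpy as np
    from scipy.stats import gaussian_kde

    def mi_score(X, y, w, bandwidth=None):
        """Plug-in estimate of I(z; y) for z = X @ w using Gaussian KDEs.

        bandwidth is passed to gaussian_kde's bw_method (None -> Scott's rule).
        """
        z = X @ w                                   # projected 1-D feature
        classes = np.unique(y)

        # Marginal density p(z) and class-conditional densities p(z | c).
        p_z = gaussian_kde(z, bw_method=bandwidth)
        p_z_given_c = [gaussian_kde(z[y == c], bw_method=bandwidth) for c in classes]

        # Resubstitution estimate: I(z; y) ~ (1/n) * sum_i log p(z_i | y_i) / p(z_i).
        mi = 0.0
        for c, cond in zip(classes, p_z_given_c):
            zc = z[y == c]
            mi += np.sum(np.log(cond(zc) / p_z(zc)))
        return mi / len(y)

    # Toy usage: compare two candidate directions on synthetic 2-class data.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
    y = np.repeat([0, 1], 100)
    w1 = np.array([1.0, 0.0])
    w2 = np.array([1.0, 1.0]) / np.sqrt(2)
    print(mi_score(X, y, w1), mi_score(X, y, w2))

    In a feature-extraction setting, a criterion of this kind would be maximized over the projection directions (here only evaluated on two fixed candidates), and the resulting scores can also suggest how many directions carry discriminative information, as discussed in the paper.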