Feature Combiners With Gate-Generated Weights for Classification

authors

  • Omari, Adil
  • Figueiras-Vidal, Aníbal Ramón

publication date

  • January 2013

start page

  • 158

end page

  • 163

issue

  • 1

volume

  • 24

International Standard Serial Number (ISSN)

  • 2162-237X

Electronic International Standard Serial Number (EISSN)

  • 2162-2388

abstract

  • Using functional weights in a conventional linear combination architecture is a way of obtaining expressive power and represents an alternative to classical trainable and implicit nonlinear transformations. In this brief, we explore this way of constructing binary classifiers, taking advantage of the possibility of generating functional weights by means of a gate with fixed radial basis functions. This particular form of the gate permits training the machine directly with maximal margin algorithms. We call the resulting scheme "feature combiners with gate-generated weights for classification." Experimental results show that these architectures outperform support vector machines (SVMs) and Real AdaBoost ensembles in most of the considered benchmark examples. An increase in the computational design effort due to cross-validation demands is the price to be paid for this advantage. Nevertheless, the operational effort is usually lower than that needed by SVMs.
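
  The key point of the abstract is that, because the gate of radial basis functions is fixed, a classifier whose weights are functions of the input remains linear in its trainable parameters and can therefore be fitted directly with a standard maximal-margin solver. The sketch below is only an illustration of that idea, not the authors' implementation; the choice of k-means centers, a single shared RBF width, and scikit-learn's LinearSVC as the hinge-loss trainer are all assumptions.

  ```python
  # Minimal sketch (not the paper's code) of a feature combiner with
  # gate-generated weights: f(x) = sum_d w_d(x) x_d + b, where the functional
  # weights w_d(x) = sum_k a_{dk} phi_k(x) are produced by a FIXED gate of
  # radial basis functions phi_k. Since the phi_k are fixed, f is linear in
  # the coefficients a_{dk}, so an off-the-shelf max-margin (hinge-loss)
  # solver can train it. Centers, width, and LinearSVC are assumptions.
  import numpy as np
  from sklearn.cluster import KMeans
  from sklearn.svm import LinearSVC


  def rbf_gate(X, centers, gamma):
      """Fixed RBF gate activations phi_k(x) = exp(-gamma * ||x - c_k||^2)."""
      d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
      return np.exp(-gamma * d2)


  def gated_features(X, centers, gamma):
      """Expanded representation z(x): outer product of gate activations and x.

      Training a linear max-margin machine on z(x) is equivalent to learning
      the coefficients a_{dk} of the gate-generated weights."""
      Phi = rbf_gate(X, centers, gamma)        # shape (n, K)
      Z = Phi[:, :, None] * X[:, None, :]      # shape (n, K, D)
      return Z.reshape(X.shape[0], -1)         # shape (n, K * D)


  # Toy usage on a nonlinearly separable problem.
  rng = np.random.default_rng(0)
  X = rng.normal(size=(200, 2))
  y = (X[:, 0] * X[:, 1] > 0).astype(int)      # XOR-like target

  K, gamma, C = 8, 1.0, 1.0                    # hyperparameters (assumed; the
                                               # paper selects by cross-validation)
  centers = KMeans(n_clusters=K, n_init=10, random_state=0).fit(X).cluster_centers_

  clf = LinearSVC(C=C).fit(gated_features(X, centers, gamma), y)
  print("training accuracy:", clf.score(gated_features(X, centers, gamma), y))
  ```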

keywords

  • learning (artificial intelligence); pattern classification; radial basis function networks; support vector machines