Learning Radial Basis Neural Networks in a Lazy Way: A Comparative Study
Overview
published in
- NEUROCOMPUTING Journal
publication date
- May 2008
start page
- 2529
end page
- 2537
issue
- 13-15
volume
- 71
Digital Object Identifier (DOI)
International Standard Serial Number (ISSN)
- 0925-2312
Electronic International Standard Serial Number (EISSN)
- 1872-8286
abstract
- Lazy learning methods have been used to deal with problems in which the learning examples are not evenly distributed in the input space. They are based on the selection of a subset of training patterns when a new query is received. Usually, that selection is based on the k closest neighbors and is static, because the number of patterns selected does not depend on the region of the input space in which the new query lies. In this paper, a lazy strategy is applied to train radial basis neural networks. That strategy incorporates a dynamic selection of patterns, based on two different kernel functions: the Gaussian and the inverse function. This lazy learning method is compared with classical lazy machine learning methods and with eagerly trained radial basis neural networks.
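
The dynamic, kernel-based selection described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's exact procedure: a kernel (here the Gaussian; the inverse function would be analogous) weights each training pattern by its distance to the query, and only patterns whose weight exceeds a threshold are retained, so the number selected adapts to the local density around the query. The function name, kernel width, and threshold are assumptions for the example.

```python
import numpy as np

def select_patterns_gaussian(X_train, query, width=1.0, threshold=0.1):
    """Dynamic pattern selection (illustrative sketch).

    Each training pattern is weighted by a Gaussian kernel centred on
    the query; patterns whose weight exceeds `threshold` are selected.
    Unlike a fixed k-nearest-neighbour rule, the number of selected
    patterns depends on how many training examples fall near the query.
    """
    # Squared Euclidean distance from each training pattern to the query
    d2 = np.sum((X_train - query) ** 2, axis=1)
    # Gaussian kernel weight for each pattern
    w = np.exp(-d2 / (2.0 * width ** 2))
    mask = w > threshold
    return X_train[mask], w[mask]

# A query in a dense region keeps several patterns; a distant outlier is dropped.
X = np.array([[0.0], [0.1], [0.2], [3.0]])
selected, weights = select_patterns_gaussian(
    X, np.array([0.1]), width=0.5, threshold=0.5
)
```

The selected subset (and its kernel weights) would then serve as the training set for a radial basis neural network built specifically for that query, which is the lazy strategy the paper compares against eager RBF training.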