Nearest Prototype Classification of Noisy Data

publication date

  • January 2009

start page

  • 53

end page

  • 66

issue

  • 1-4

volume

  • 30

international standard serial number (ISSN)

  • 0269-2821

electronic international standard serial number (EISSN)

  • 1573-7462

abstract

  • Nearest prototype approaches offer a common way to design classifiers. However, when data is noisy, the success of this kind of classifier depends on parameters that the designer needs to tune, such as the number of prototypes. In this work, we study the ENPC technique, based on the nearest prototype approach, on noisy datasets. Previous experimentation with this algorithm had shown that it does not require any parameter tuning to obtain good solutions in problems where class boundaries are well defined and the data is not noisy. Here, we show that the algorithm is able to obtain solutions with high classification accuracy even when the data is noisy. A comparison with optimal (hand-made) solutions and with other classification algorithms demonstrates the good performance of the ENPC algorithm in both accuracy and number of prototypes as the noise level increases. We have performed experiments on four datasets, each with different characteristics.
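To illustrate the classification rule the abstract refers to, the following is a minimal sketch of nearest-prototype prediction: each sample receives the label of its closest prototype. The prototype coordinates, labels, and function name here are illustrative assumptions; ENPC itself evolves the prototype set during training, which is not shown.

```python
import numpy as np

def nearest_prototype_predict(X, prototypes, labels):
    """Assign each sample in X the label of its nearest prototype.

    X          : (n, d) array of samples
    prototypes : (k, d) array of prototype coordinates (illustrative;
                 ENPC evolves this set, which is not modeled here)
    labels     : (k,)   array of class labels, one per prototype
    """
    # Euclidean distance from every sample to every prototype
    dists = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    # Pick the label of the closest prototype for each sample
    return labels[np.argmin(dists, axis=1)]

# Toy example: two prototypes, one per class
prototypes = np.array([[0.0, 0.0], [5.0, 5.0]])
labels = np.array([0, 1])
X = np.array([[0.5, -0.2], [4.8, 5.1]])
print(nearest_prototype_predict(X, prototypes, labels))  # → [0 1]
```

With noisy data, the key design question the paper studies is how many prototypes such a classifier should use; too few underfit the class boundaries, while too many fit the noise.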