Electronic International Standard Serial Number (EISSN)
2162-2388
Abstract
Progressively emphasizing samples that are difficult to classify correctly is the basis for the recognized high performance of Real AdaBoost (RA) ensembles. The corresponding emphasis function can be written as the product of a factor that measures the quadratic error and a factor related to the proximity to the classification border; this fact opens the door to exploring the potential advantages of adjustable combined forms of these factors. In this paper, we introduce a principled procedure for selecting the combination parameter each time a new learner is added to the ensemble, simply by maximizing the associated edge; we call the resulting method the dynamically adapted weighted emphasis RA (DW-RA). A number of application examples illustrate the performance improvements obtained by DW-RA.
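To make the idea concrete, the following is a minimal Python sketch, not the authors' implementation: it assumes a particular exponential mixed-emphasis form (the product of an error-driven factor and a boundary-proximity factor, blended through a mixing parameter lam), uses decision stumps as weak learners, and selects lam by a simple grid search over the edge. The names emphasis_weights, edge, select_lambda_and_learner, and the candidate grid are hypothetical illustrations.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def emphasis_weights(f, y, lam):
    # Assumed mixed-emphasis form: an exponential whose argument blends a
    # quadratic-error term (f - y)^2 with a boundary-proximity term -f^2,
    # so the weight is the product of the two emphasis factors.
    w = np.exp(lam * (f - y) ** 2 - (1.0 - lam) * f ** 2)
    return w / w.sum()

def edge(h_out, y, w):
    # Edge of a weak hypothesis with outputs in {-1, +1} under weights w.
    return float(np.sum(w * y * h_out))

def select_lambda_and_learner(X, y, f, lambdas=np.linspace(0.0, 1.0, 21)):
    # For each candidate mixing value, re-weight the data, fit a stump,
    # and keep the (edge, lam, learner) triple with the largest edge.
    best = (-np.inf, None, None)
    for lam in lambdas:
        w = emphasis_weights(f, y, lam)
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        h_out = stump.predict(X).astype(float)
        e = edge(h_out, y, w)
        if e > best[0]:
            best = (e, lam, stump)
    return best

Here y holds labels in {-1, +1}, f is the current ensemble output on the training set, and the selected stump would then be combined into the ensemble as in standard RA; the grid search is only one possible way to carry out the edge maximization described in the abstract.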