Electronic International Standard Serial Number (EISSN)
1573-0565
abstract
This paper analyzes the application of a particular class of Bregman divergences to design cost-sensitive classifiers for multiclass problems. We show that these divergence measures can be used to estimate posterior probabilities with maximum accuracy for probability values close to the decision boundaries. Asymptotically, the proposed divergence measures provide classifiers that minimize the sum of decision costs in non-separable problems, and maximize a margin in separable MAP problems.
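For reference (this definition is standard and not specific to the paper's construction), the Bregman divergence generated by a strictly convex, differentiable function \phi is

D_\phi(p, q) = \phi(p) - \phi(q) - \nabla\phi(q)^\top (p - q),

which is non-negative and vanishes only when q = p. This property is what makes such divergences usable as surrogate losses whose minimizers recover posterior class probabilities; how the paper weights or shapes \phi for cost sensitivity is not detailed in this abstract.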
Classification
keywords
cost-sensitive learning; Bregman divergence; posterior class probabilities; maximum margin