One-step Bayesian example-dependent cost classification: The OsC-MLP method
Overview
published in
- Neural Networks
publication date
- May 2024
start page
- 1
end page
- 10
article number
- 106168
volume
- 173
Digital Object Identifier (DOI)
- 10.1016/j.neunet.2024.106168
International Standard Serial Number (ISSN)
- 0893-6080
Electronic International Standard Serial Number (EISSN)
- 1879-2782
abstract
-
Example-dependent cost classification problems are those in which the decision costs depend not only on the true and the assigned classes but also on the features of each sample. Discriminative algorithms that carry out such classification tasks must take this dependence into account. In some applications, the decision costs are known for the training set but not in production, which complicates the problem.
In this paper, we introduce a new one-step Bayesian formulation for training neural networks that overcomes the above limitation for binary cases with one-step learning machines, avoiding the drawbacks created by unknown analytical forms of the example-dependent costs. The formulation defines an artificial likelihood ratio that incorporates the available training classification costs, and it leads to a test that does not require the cost values of unseen samples. It also includes Bayesian rebalancing mechanisms to combat the negative effects of class imbalance. Experimental results support the consistency and effectiveness of the corresponding algorithms.
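The abstract does not spell out the OsC-MLP algorithm itself, but the problem setting it describes can be illustrated with a minimal sketch: a plain NumPy MLP trained with a cross-entropy loss in which each example's gradient is weighted by its own decision cost (known only at training time), while the test decision uses a fixed threshold and no cost values at all. Everything below (the synthetic data, the cost function c, the network sizes) is a hypothetical stand-in, not the paper's method, likelihood-ratio construction, or experiments.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary problem: features X, labels y in {0, 1}, and an
# example-dependent misclassification cost c that depends on the features.
# Hypothetical stand-in, not the paper's data or cost model.
n, d = 1000, 2
X = rng.normal(size=(n, d))
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(float)
c = 1.0 + np.abs(X[:, 0])  # decision cost, known at training time only

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One-hidden-layer MLP with a sigmoid output unit.
h = 16
W1 = rng.normal(scale=0.5, size=(d, h))
b1 = np.zeros(h)
W2 = rng.normal(scale=0.5, size=h)
b2 = 0.0

lr = 0.1
for epoch in range(300):
    A = np.tanh(X @ W1 + b1)        # hidden activations, shape (n, h)
    p = sigmoid(A @ W2 + b2)        # estimated P(y = 1 | x), shape (n,)
    # Cost-weighted cross-entropy: each sample's contribution to the
    # gradient is scaled by its own decision cost, so costly mistakes
    # dominate the fit. g is d(loss)/d(output logit) per sample.
    g = c * (p - y) / n
    W2 -= lr * (A.T @ g)
    b2 -= lr * g.sum()
    G1 = np.outer(g, W2) * (1.0 - A ** 2)   # backprop through tanh
    W1 -= lr * (X.T @ G1)
    b1 -= lr * G1.sum(axis=0)

# At test time the costs are unavailable: the decision uses only the
# trained network and a fixed threshold on the estimated posterior.
p_test = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2)
print("training accuracy:", ((p_test > 0.5).astype(float) == y).mean())

The sketch only shows why per-example costs can shape training without being needed at test time; the paper's actual contribution is the one-step Bayesian formulation, the artificial likelihood ratio built from the training costs, and the rebalancing mechanisms layered on top of this basic idea.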
Classification
subjects
- Telecommunications
keywords
- imbalance; Bregman divergences; neural networks; informed re-balancing; sample emphasis