On building ensembles of stacked denoising auto-encoding classifiers and their further improvement

publication date

  • January 2018

start page

  • 41

end page

  • 52


volume

  • 39

International Standard Serial Number (ISSN)

  • 1566-2535

Electronic International Standard Serial Number (EISSN)

  • 1872-6305


abstract

  • Aggregating diverse learners and training deep architectures are the two principal avenues for increasing the expressive capability of neural networks, so their combination merits attention. In this contribution, we study how to apply conventional diversity methods (bagging and label switching) to a general deep machine, the stacked denoising auto-encoding classifier, in order to solve a number of appropriately selected image recognition problems. The main conclusion of our work is that binarizing multi-class problems is the key to obtaining benefit from those diversity methods. Additionally, we check that adding other kinds of performance improvement procedures, such as pre-emphasizing training samples and elastic distortion mechanisms, further increases the quality of the results. In particular, an appropriate combination of all the above methods leads us to a new absolute record in classifying MNIST handwritten digits. These facts reveal clear opportunities for designing more powerful classifiers by combining different improvement techniques. (C) 2017 Elsevier B.V. All rights reserved.
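The pipeline the abstract outlines can be illustrated with a minimal numpy sketch: a multi-class problem is binarized one-vs-rest, and each binary ensemble gains diversity through bagging (bootstrap resampling) and label switching (flipping a fraction of training labels). Note this is only a toy illustration of those two diversity mechanisms; the logistic base learner, the Gaussian-blob data, and all parameter values below are illustrative stand-ins, not the paper's stacked denoising auto-encoding classifiers or its image benchmarks.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap(X, y, rng):
    """Bagging: resample the training set with replacement."""
    idx = rng.integers(0, len(X), size=len(X))
    return X[idx], y[idx]

def switch_labels(y, rate, rng):
    """Label switching: flip a fraction of the binary labels to inject diversity."""
    y = y.copy()
    flip = rng.random(len(y)) < rate
    y[flip] = 1 - y[flip]
    return y

def train_logistic(X, y, lr=0.5, epochs=200):
    """Stand-in base learner (a single logistic unit), trained by gradient descent."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = p - y
        w -= lr * X.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

def predict_proba(X, w, b):
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))

# Toy 3-class problem: three well-separated Gaussian blobs.
centers = [(0.0, 2.0), (-2.0, -1.0), (2.0, -1.0)]
X = np.concatenate([rng.normal(c, 0.5, size=(100, 2)) for c in centers])
y = np.repeat(np.arange(3), 100)

n_learners, switch_rate = 7, 0.1
scores = np.zeros((len(X), 3))
for c in range(3):                                 # binarize: one ensemble per class
    y_bin = (y == c).astype(int)
    for _ in range(n_learners):
        Xb, yb = bootstrap(X, y_bin, rng)          # bagging
        yb = switch_labels(yb, switch_rate, rng)   # label switching
        w, b = train_logistic(Xb, yb)
        scores[:, c] += predict_proba(X, w, b)     # average the binary ensemble votes

pred = scores.argmax(axis=1)
acc = (pred == y).mean()
print(f"training accuracy: {acc:.2f}")
```

The binarization step matters because each binary ensemble can average out the label noise introduced by switching, which is the mechanism the abstract identifies as key to benefiting from these diversity methods.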


keywords

  • augmentation; classification; deep; diversity; learning; pre-emphasis