Tuning Accuracy-Diversity Trade-off in Neural Network Ensemble via Novel Entropy Loss Function

Ali M. A., Şahin Y. H., Özögür-Akyüz S., Ünal G., Otar B. C.

13th International Conference on Electrical and Electronics Engineering, ELECO 2021, Virtual, Bursa, Turkey, 25 - 27 November 2021, pp.365-368

  • Publication Type: Conference Paper / Full Text
  • Doi Number: 10.23919/eleco54474.2021.9677845
  • City: Virtual, Bursa
  • Country: Turkey
  • Page Numbers: pp.365-368
  • Istanbul Technical University Affiliated: Yes


© 2021 Chamber of Turkish Electrical Engineers. Ensemble methods in machine learning combine several models to produce an optimal predictive model. Neural network ensemble learning is a technique that uses multiple individual deep neural networks (DNNs). Ensemble pruning methods are used to reduce the computational complexity of ensemble models. In this study, a novel optimization model is proposed to increase error independence among classifiers via an entropy measure and thus prune the ensemble more effectively. An ensemble of 300 DNNs is trained and tested on the CIFAR-10 dataset, and results show an increase in accuracy while maintaining a level of relative entropy measured by Kullback-Leibler divergence (KL-divergence).
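The abstract measures ensemble diversity by the relative entropy (KL-divergence) between member predictions. As a minimal illustrative sketch (not the paper's actual optimization model), the snippet below computes the mean pairwise KL-divergence between the softmax output distributions of ensemble members for a single input; the function names and the use of averaged pairwise divergence are assumptions for illustration.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q) = sum_i p_i * log(p_i / q_i); clip to avoid log(0)
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def mean_pairwise_kl(probs):
    # probs: (n_models, n_classes) softmax outputs of the ensemble
    # members for one input sample. Averaging KL over all ordered
    # pairs gives one scalar diversity score: higher means the
    # members' predictive distributions disagree more.
    n = len(probs)
    total, count = 0.0, 0
    for i in range(n):
        for j in range(n):
            if i != j:
                total += kl_divergence(probs[i], probs[j])
                count += 1
    return total / count

# Hypothetical output distributions of three classifiers over 3 classes:
# the first two agree, the third disagrees, so diversity is non-zero.
probs = np.array([
    [0.7, 0.2, 0.1],
    [0.6, 0.3, 0.1],
    [0.1, 0.2, 0.7],
])
diversity = mean_pairwise_kl(probs)
```

A pruning procedure in this spirit would keep members whose removal least reduces such a diversity score while preserving ensemble accuracy.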