Tuning Accuracy-Diversity Trade-off in Neural Network Ensemble via Novel Entropy Loss Function


Ali M. A., Şahin Y. H., Özögür-Akyüz S., Ünal G., Otar B. C.

13th International Conference on Electrical and Electronics Engineering, ELECO 2021, Virtual, Bursa, Türkiye, 25 - 27 November 2021, pp.365-368

  • Publication Type: Conference Paper / Full Text
  • DOI Number: 10.23919/eleco54474.2021.9677845
  • City of Publication: Virtual, Bursa
  • Country of Publication: Türkiye
  • Page Numbers: pp.365-368
  • İstanbul Technical University Affiliated: Yes

Abstract

© 2021 Chamber of Turkish Electrical Engineers. Ensemble methods in machine learning combine several models to produce an optimal predictive model. Neural network ensemble learning is a technique that uses multiple individual deep neural networks (DNNs). Ensemble pruning methods are used to reduce the computational complexity of ensemble models. In this study, a novel optimization model is proposed to increase error independence among classifiers via an entropy measure and thus prune the ensemble more effectively. An ensemble of 300 DNNs is trained and tested on the CIFAR-10 dataset, and the results show an increase in accuracy while maintaining a level of relative entropy measured by Kullback-Leibler divergence (KL-divergence).
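The abstract measures diversity between ensemble members via KL-divergence over their predicted class distributions. As a minimal illustrative sketch (not the paper's actual loss function, which is not given in this record), the snippet below computes KL-divergence between two hypothetical softmax outputs over the 10 CIFAR-10 classes; the function name and the example probabilities are assumptions for illustration.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for discrete distributions, e.g. two classifiers'
    softmax outputs over the same set of classes. A small epsilon
    avoids log(0) and division by zero."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()  # renormalize after smoothing
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Hypothetical softmax outputs of two ensemble members on one image
# (10 classes, as in CIFAR-10).
p = [0.70, 0.10, 0.05, 0.05, 0.02, 0.02, 0.02, 0.02, 0.01, 0.01]
q = [0.10, 0.60, 0.10, 0.05, 0.05, 0.02, 0.02, 0.02, 0.02, 0.02]

print(kl_divergence(p, q))  # large: the two members disagree (diverse)
print(kl_divergence(p, p))  # near 0: identical predictions (redundant)
```

In a pruning setting, pairs of members with near-zero divergence are redundant candidates for removal, while high-divergence pairs contribute error independence.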