© 2021 Chamber of Turkish Electrical Engineers.

Ensemble methods combine several machine learning models to produce a single, more accurate predictive model. Neural network ensemble learning is a technique that uses multiple individual deep neural networks (DNNs). Ensemble pruning methods reduce the computational complexity of ensemble models. In this study, a novel optimization model is proposed that increases error independence among classifiers via an entropy measure and thus prunes the ensemble more effectively. An ensemble of 300 DNNs is trained and tested on the CIFAR-10 dataset, and the results show an increase in accuracy while maintaining a level of relative entropy measured by Kullback-Leibler divergence (KL-divergence).
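The relative-entropy measure mentioned in the abstract can be illustrated with a minimal sketch of KL-divergence between the class-probability outputs of two ensemble members. The `kl_divergence` helper and the random probability vectors below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for discrete distributions, clipped for numerical stability."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

# Hypothetical class-probability outputs of two ensemble members
# for the same input (10 classes, as in CIFAR-10).
rng = np.random.default_rng(0)
a = rng.dirichlet(np.ones(10))
b = rng.dirichlet(np.ones(10))

# Identical classifiers give zero divergence; diverse (error-independent)
# classifiers give a strictly positive divergence.
print(kl_divergence(a, a))  # ~0.0
print(kl_divergence(a, b))  # > 0.0
```

A pruning criterion of the kind described would favor keeping members whose pairwise divergence is high, since those members tend to make independent errors.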