Energy and Entropy based Intelligence Metric for Performance Estimation in DNNs

Kartal B., Üstündağ B. B.

5th International Conference on Artificial Intelligence in Information and Communication, ICAIIC 2023, Virtual, Online, Indonesia, 20 - 23 February 2023, pp.468-473

  • Publication Type: Conference Paper / Full Text
  • DOI Number: 10.1109/icaiic57133.2023.10067093
  • City: Virtual, Online
  • Country: Indonesia
  • Page Numbers: pp.468-473
  • Keywords: entropy, hyper-parameter optimization, intelligence, performance, predictability
  • Istanbul Technical University Affiliated: Yes


Model selection and hyper-parameter tuning depend strongly on problem-specific data characteristics in machine learning. Conventional performance evaluation widely relies on metrics such as accuracy, sensitivity, and error rate. Estimating an appropriate hyper-parameter set for a machine learning application on an initially uncharacterized data set requires iterative hyper-parameter updates in addition to training and test cycles. In this paper, an intelligence factor is proposed for rapid evaluation of model consistency and for updating the hyper-parameters. The proposed metric can be used as an alternative to, or together with, traditional post-training performance-oriented success measures. In the proposed method, an additional artificial neural network predicts the performance of the first network. This second network takes as input variations of the intelligence factor acquired from the first, or main, neural network. Living organisms tend to adapt toward minimum energy dissipation in time and space. Inspired by the energy consumption and entropy variation of organisms and systems in nature, we define an intelligence factor that depends on energy consumption and the change of entropy in artificial neural networks. Subsampled data are applied to the first neural network, and the variation of the intelligence factor is used as input to the second neural network. The second network is trained with the accuracy of the first network for different input data set, model, and hyper-parameter combinations. The proposed intelligence-factor-based method enables model and hyper-parameter compliance verification for the first neural network using less than 10% of any new input data set, with accuracy higher than 80%.
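The abstract relates a change of entropy to the energy a network expends. As a minimal sketch of that idea only (the function names, the choice of Shannon entropy over softmax outputs, and the simple ratio form are illustrative assumptions, not the paper's actual definition of the intelligence factor), one might compute:

```python
import numpy as np

def output_entropy(probs, eps=1e-12):
    # Shannon entropy of a batch of softmax output vectors,
    # averaged over the batch (rows = samples, columns = classes).
    p = np.clip(probs, eps, 1.0)
    return float(np.mean(-np.sum(p * np.log(p), axis=1)))

def intelligence_factor(entropy_before, entropy_after, energy):
    # Hypothetical form: entropy reduction achieved per unit of
    # energy consumed (e.g., an operation-count or wall-power proxy).
    # Illustrates relating entropy change to energy consumption only.
    return (entropy_before - entropy_after) / max(energy, 1e-12)

# A uniform two-class output has entropy ln(2); a confident one near 0.
h_uniform = output_entropy(np.array([[0.5, 0.5]]))
h_confident = output_entropy(np.array([[1.0, 0.0]]))
```

In such a setup, a sequence of these factor values measured while subsampled data passes through the first network would form the input features on which the second, performance-predicting network is trained.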