More than Accuracy: A Composite Learning Framework for Interval Type-2 Fuzzy Logic Systems

Beke A., Kumbasar T.

IEEE Transactions on Fuzzy Systems, vol.31, no.3, pp.734-744, 2023 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 31 Issue: 3
  • Publication Date: 2023
  • DOI Number: 10.1109/TFUZZ.2022.3188920
  • Journal Name: IEEE Transactions on Fuzzy Systems
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Aerospace Database, Applied Science & Technology Source, Communication Abstracts, Compendex, Computer & Applied Sciences, INSPEC, Metadex, zbMATH, Civil Engineering Abstracts
  • Page Numbers: pp.734-744
  • Keywords: Deep learning (DL), interval type-2 fuzzy logic systems (IT2-FLS), parameterization tricks, quantile regression (QR), uncertainty
  • Istanbul Technical University Affiliated: Yes


In this paper, we propose a novel composite learning framework for Interval Type-2 (IT2) Fuzzy Logic Systems (FLSs) to train regression models that achieve high accuracy while also being capable of representing uncertainty. In this context, we identify three challenges: (i) the uncertainty handling capability, (ii) the construction of the composite loss, and (iii) a learning algorithm that overcomes the training complexity while respecting the definitions of IT2-FLSs. This paper presents a systematic solution to these problems by exploiting the type-reduced set of the IT2-FLS via fusing quantile regression and Deep Learning (DL) with IT2-FLSs. The uncertainty processing capability of an IT2-FLS depends on the employed center-of-sets calculation method, while its representation capability is defined by the structure of its antecedent and consequent membership functions. Thus, we present various parametric IT2-FLSs and define the learnable parameters of all IT2-FLSs alongside the constraints they must satisfy during training. To construct the loss function, we first define a multi-objective loss and then convert it into a constrained composite loss composed of the log-cosh loss, for accuracy, and a tilted loss, for uncertainty representation, which explicitly uses the type-reduced set. We also present a DL approach to train IT2-FLSs via unconstrained optimizers. In this context, we present parameterization tricks that convert the constrained optimization problem of IT2-FLSs into an unconstrained one without violating the definitions of fuzzy sets. Finally, we provide comprehensive comparative results for hyperparameter sensitivity analysis and an inter/intra-model comparison on various benchmark datasets.
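The building blocks named in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the exact weighting of the two loss terms, the quantile level `tau`, the mixing weight `alpha`, the names `y_left`/`y_right` for the type-reduced set endpoints, and the use of softplus as the constraint-removing parameterization trick are all assumptions made here for illustration.

```python
import numpy as np

def log_cosh_loss(y_true, y_pred):
    """Accuracy term: a smooth, outlier-robust alternative to MSE."""
    e = y_pred - y_true
    # Numerically stable form: log(cosh(e)) = |e| + log1p(exp(-2|e|)) - log(2)
    return np.mean(np.abs(e) + np.log1p(np.exp(-2.0 * np.abs(e))) - np.log(2.0))

def tilted_loss(y_true, y_pred, tau):
    """Tilted (pinball/quantile) loss at level tau: penalizes errors
    asymmetrically, so minimizing it drives y_pred toward the tau-quantile."""
    e = y_true - y_pred
    return np.mean(np.maximum(tau * e, (tau - 1.0) * e))

def composite_loss(y_true, y_crisp, y_left, y_right, tau=0.95, alpha=0.5):
    """Composite loss: log-cosh on the crisp (defuzzified) output for accuracy,
    plus tilted losses that push the type-reduced set endpoints [y_left, y_right]
    toward the (1-tau)- and tau-quantiles to represent uncertainty."""
    accuracy = log_cosh_loss(y_true, y_crisp)
    uncertainty = (tilted_loss(y_true, y_right, tau)
                   + tilted_loss(y_true, y_left, 1.0 - tau))
    return alpha * accuracy + (1.0 - alpha) * uncertainty

def softplus(theta):
    """Parameterization trick (illustrative): map an unconstrained parameter
    to (0, inf), e.g. to keep a membership-function spread strictly positive
    so a plain unconstrained optimizer never violates the fuzzy-set definition."""
    return np.log1p(np.exp(theta))
```

In a training loop, gradients would be taken with respect to the unconstrained parameters (e.g. `theta` with spread `sigma = softplus(theta)`), which is what lets standard DL optimizers be used without projection or clipping steps.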