Continual Learning with Sparse Progressive Neural Networks (Seyrek İlerlemeli Sinir Ağları ile Sürekli Öğrenme)


Ergün E., Töreyin B. U.

28th Signal Processing and Communications Applications Conference (SIU 2020), Gaziantep, Türkiye, 5-7 October 2020

  • Publication Type: Conference Paper / Full-Text Paper
  • DOI: 10.1109/siu49456.2020.9302115
  • City of Publication: Gaziantep
  • Country of Publication: Türkiye
  • İstanbul Technical University Affiliated: Yes

Abstract

© 2020 IEEE. The human brain effectively integrates prior knowledge into new skills by transferring experience across tasks without suffering from catastrophic forgetting. In this study, to continuously learn a sequence of visual classification tasks (PermutedMNIST), we employed a neural network model with lateral connections, sparse group Least Absolute Shrinkage and Selection Operator (LASSO) regularization, and projection regularization to decrease feature redundancy. We show that encouraging feature novelty in progressive neural networks (PNN) prevents a major performance drop under sparsification, and that sparsifying a progressive neural network produces fair results while decreasing the number of learned task-specific parameters on novel tasks.
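
To make the combination of lateral connections and sparse group LASSO concrete, below is a minimal PyTorch sketch, assuming flattened 28x28 inputs as in PermutedMNIST. The class names, the single output-level lateral adapter (a simplification of PNN's layer-wise laterals), the per-input-feature grouping, and the `sparse_group_lasso` helper are illustrative assumptions, not taken from the paper; projection regularization is omitted.

```python
# Hypothetical sketch: a two-column progressive network with one lateral
# connection and a sparse group LASSO penalty on the new column's weights.
# Names, dimensions, and grouping are illustrative, not from the paper.
import torch
import torch.nn as nn

class Column(nn.Module):
    """One task column: input -> hidden -> logits."""
    def __init__(self, in_dim=784, hidden=256, n_classes=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, n_classes)

    def forward(self, x):
        h = torch.relu(self.fc1(x))
        return self.fc2(h), h  # also return hidden features for lateral reuse

class ProgressiveNet(nn.Module):
    """Column 2 receives lateral features from the frozen column 1."""
    def __init__(self, in_dim=784, hidden=256, n_classes=10):
        super().__init__()
        self.col1 = Column(in_dim, hidden, n_classes)
        for p in self.col1.parameters():  # previous column stays frozen
            p.requires_grad_(False)
        self.col2 = Column(in_dim, hidden, n_classes)
        self.lateral = nn.Linear(hidden, n_classes, bias=False)

    def forward(self, x):
        _, h1 = self.col1(x)       # frozen features from the old task
        logits2, _ = self.col2(x)  # new task-specific features
        return logits2 + self.lateral(h1)

def sparse_group_lasso(weight, lam=1e-3, alpha=0.5):
    """(1-alpha) * group L2 over input-feature groups + alpha * elementwise L1."""
    group = weight.norm(dim=0).sum()  # one group per input feature (column)
    l1 = weight.abs().sum()
    return lam * ((1 - alpha) * group + alpha * l1)

net = ProgressiveNet()
x = torch.randn(8, 784)
y = torch.randint(0, 10, (8,))
loss = nn.functional.cross_entropy(net(x), y)
loss += sparse_group_lasso(net.col2.fc1.weight)
loss.backward()  # gradients flow only into column 2 and the lateral adapter
```

In this simplified setting, the group term drives entire input-feature groups of the new column toward zero while the L1 term sparsifies individual weights, which is one way the number of task-specific parameters learned for a novel task can shrink.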