The Impact of Pre-trained Language Models on Turkish Semantic Role Labelling (Ön eğitimli Dil Modellerinin Türkçenin Anlamsal Görev Çözümlemesine Etkisi)


Oral E., Eryiğit G.

30th Signal Processing and Communications Applications Conference, SIU 2022, Safranbolu, Türkiye, 15 - 18 May 2022

  • Publication Type: Conference Paper / Full Text
  • DOI: 10.1109/siu55565.2022.9864933
  • City: Safranbolu
  • Country: Türkiye
  • Keywords: language models, semantic role labeling
  • Istanbul Technical University Affiliated: Yes

Abstract

© 2022 IEEE. Semantic role labeling (SRL) is the task of finding the argument structures of verbs in a sentence. Previous studies on Turkish SRL have focused mostly on syntactic features; context-oriented approaches have not yet been explored in this area. In this paper, we investigate the impact of pre-trained neural language models, which excel at context representation, on Turkish semantic role labeling. The BERT, ConvBERT and ELECTRA language models are adapted to Turkish SRL via parameter tuning. We report an improvement of 10 percentage points over the morphology-focused results, which rely on gold-standard morphological tags and therefore do not contain the errors that a preceding morphological analysis layer would propagate. Since our model has no such dependency, the performance gain would be even larger in a real-world scenario.
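When SRL is cast as token classification over a BERT-style model, one practical step (not detailed in the abstract) is aligning word-level role labels to subword tokens: only the first subword of each word keeps the real label, and continuation pieces receive -100 so the loss ignores them. The sketch below illustrates this common convention with a toy stand-in for a real Turkish WordPiece tokenizer; the function names and the splitter are illustrative assumptions, not the authors' code.

```python
def align_labels(words, labels, tokenize):
    """Expand word-level SRL labels to subword level.

    Only the first subword of each word keeps the real label id;
    continuation pieces get -100, the usual ignore-index convention
    in token-classification fine-tuning.
    """
    sub_tokens, sub_labels = [], []
    for word, label in zip(words, labels):
        pieces = tokenize(word)
        sub_tokens.extend(pieces)
        sub_labels.extend([label] + [-100] * (len(pieces) - 1))
    return sub_tokens, sub_labels


def toy_tokenize(word):
    # Toy WordPiece-like splitter standing in for a real tokenizer:
    # chunks of 4 characters, continuations prefixed with "##".
    if len(word) <= 4:
        return [word]
    return [word[:4]] + ["##" + word[i:i + 4] for i in range(4, len(word), 4)]


words = ["Ali", "kitabı", "okudu"]
labels = [1, 2, 3]  # hypothetical integer label ids, e.g. A0, A1, predicate
tokens, tok_labels = align_labels(words, labels, toy_tokenize)
# tokens and tok_labels now have equal length, ready for a
# token-classification head over the language model's outputs.
```

At evaluation time the -100 positions are simply skipped, so word-level scores such as the reported F1 are unaffected by subword segmentation.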