The Impact of Pre-trained Language Models on Turkish Semantic Role Labelling (Ön eğitimli Dil Modellerinin Türkçenin Anlamsal Görev Çözümlemesine Etkisi)

Oral E., Eryiğit G.

30th Signal Processing and Communications Applications Conference, SIU 2022, Safranbolu, Turkey, 15 - 18 May 2022

  • Publication Type: Conference Paper / Full Text
  • DOI Number: 10.1109/siu55565.2022.9864933
  • City: Safranbolu
  • Country: Turkey
  • Keywords: language models, semantic role labeling
  • Istanbul Technical University Affiliated: Yes


© 2022 IEEE. Semantic role labeling (SRL) is the task of finding the argument structures of verbs in a sentence. Previous studies on Turkish SRL have focused mostly on syntactic features; context-oriented approaches have not yet been explored in this area. In this paper, we investigate the impact of pre-trained neural language models, which are strong at representing context, on the Turkish semantic role labeling task. The BERT, ConvBERT and ELECTRA language models are adapted to Turkish SRL with parameter tuning. We report an improvement of 10 percentage points over the morphology-focused results, which rely on gold-standard morphological tags and therefore do not contain errors propagated from a preceding morphological analysis layer. Since our model has no such dependencies, the performance gain will be even higher in a realistic scenario.
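The fine-tuning setup described above treats SRL as token-level classification over the output of a pre-trained encoder. One practical ingredient of any such pipeline is aligning word-level BIO role labels to the subword pieces these models operate on. The sketch below is purely illustrative (the paper does not publish its code): `align_labels` and `toy_tokenize` are hypothetical names, and `toy_tokenize` stands in for a real WordPiece tokenizer such as the one used by Turkish BERT models.

```python
def align_labels(words, labels, tokenize):
    """Expand word-level BIO labels to the subword level.

    The first subword of each word keeps the original label; continuation
    subwords of a "B-ROLE" word are downgraded to the matching "I-ROLE".
    """
    sub_tokens, sub_labels = [], []
    for word, label in zip(words, labels):
        pieces = tokenize(word)
        sub_tokens.extend(pieces)
        # Continuations of a "B-ROLE" token become "I-ROLE"; "I-*"/"O" repeat.
        cont = "I-" + label[2:] if label.startswith("B-") else label
        sub_labels.extend([label] + [cont] * (len(pieces) - 1))
    return sub_tokens, sub_labels


def toy_tokenize(word):
    """Toy subword splitter standing in for a real WordPiece vocabulary."""
    return [word[:3]] + ["##" + word[i:i + 3] for i in range(3, len(word), 3)]


tokens, labels = align_labels(
    ["Ali", "okula", "gitti"],   # "Ali went to school"
    ["B-A0", "B-A4", "O"],       # predicate arguments in BIO format
    toy_tokenize,
)
print(tokens)  # ['Ali', 'oku', '##la', 'git', '##ti']
print(labels)  # ['B-A0', 'B-A4', 'I-A4', 'O', 'O']
```

After alignment, the subword sequence and label sequence have equal length and can be fed to a token-classification head on top of the encoder; the gold-tag dependence of morphology-based systems is avoided because the subword vocabulary requires no external analyzer.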