Morphosyntactic probing of multilingual BERT models
Ács, Judit; Hamerlik, Endre; Schwartz, Roy; Smith, Noah A.; Kornai, András (2024). Morphosyntactic probing of multilingual BERT models. Natural Language Engineering, 30(4), pp. 753-792. ISSN 1351-3249. DOI: 10.1017/S1351324923000190
Text: Acs_753_33999207_ny.pdf (2MB)
Abstract
We introduce an extensive dataset for multilingual probing of morphological information in language models (247 tasks across 42 languages from 10 families), each consisting of a sentence with a target word and a morphological tag as the desired label, derived from the Universal Dependencies treebanks. We find that pre-trained Transformer models (mBERT and XLM-RoBERTa) learn features that attain strong performance across these tasks. We then apply two methods to locate, for each probing task, where the disambiguating information resides in the input. The first is a new perturbation method that “masks” various parts of context; the second is the classical method of Shapley values. The most intriguing finding that emerges is a strong tendency for the preceding context to hold more information relevant to the prediction than the following context.
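The perturbation method described in the abstract ("masking" parts of the context around the target word) can be sketched as follows. This is a minimal illustrative reimplementation of the masking idea, not the authors' code: the `mask_context` helper and the toy sentence are our own, and in the actual setup each masked variant would be re-encoded by mBERT or XLM-RoBERTa and the probe's accuracy on the morphological tag compared across variants.

```python
# Illustrative sketch (hypothetical helper, not the paper's implementation):
# replace the left or right context of a target token with mask tokens,
# so a probe can be evaluated on each perturbed variant separately.

MASK = "[MASK]"

def mask_context(tokens, target_idx, side):
    """Return a copy of `tokens` with the left or right context of the
    target word replaced by MASK tokens, leaving the target intact."""
    out = list(tokens)
    if side == "left":
        span = range(0, target_idx)          # everything before the target
    elif side == "right":
        span = range(target_idx + 1, len(tokens))  # everything after it
    else:
        raise ValueError("side must be 'left' or 'right'")
    for i in span:
        out[i] = MASK
    return out

sent = ["the", "dogs", "bark", "at", "night"]
print(mask_context(sent, 1, "left"))   # ['[MASK]', 'dogs', 'bark', 'at', 'night']
print(mask_context(sent, 1, "right"))  # ['the', 'dogs', '[MASK]', '[MASK]', '[MASK]']
```

Comparing probe performance on left-masked versus right-masked inputs is one way to operationalize the paper's finding that the preceding context tends to carry more of the disambiguating morphological information than the following context.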
| Item Type: | Article |
|---|---|
| Subjects: | Q Science > QA Mathematics and Computer Science > QA75 Electronic computers. Computer science |
| Divisions: | Informatics Laboratory |
| SWORD Depositor: | MTMT Injector |
| Depositing User: | MTMT Injector |
| Date Deposited: | 26 Sep 2024 05:48 |
| Last Modified: | 26 Sep 2024 05:48 |
| URI: | https://eprints.sztaki.hu/id/eprint/10786 |