Cross-Lingual Transfer in Zero-Shot Cross-Language Entity Linking

Elliot Schumacher, James Mayfield, Mark Dredze

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Cross-language entity linking grounds mentions written in several languages to a monolingual knowledge base. To explore how well a transformer model performs on this task, we use a simple neural ranking architecture that takes multilingual BERT representations of both the mention and its context as input. We find that the multilingual ability of BERT leads to good performance in monolingual and multilingual settings. Furthermore, we explore zero-shot language transfer and find surprisingly robust performance. We conduct several analyses to identify the sources of performance degradation in the zero-shot setting. Results indicate that while multilingual transformer models transfer well between languages, issues remain in disambiguating similar entities unseen in training.
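The abstract describes a neural ranker over multilingual BERT representations of the mention (in context) and candidate entities. The sketch below is a minimal, hedged illustration of that general setup, not the authors' exact architecture: the model name, [CLS] pooling, use of English KB descriptions for candidates, and the feed-forward scorer are all assumptions made for the example.

```python
# Minimal sketch (assumptions noted above, not the paper's exact model):
# multilingual BERT encodes a mention in context and each candidate entity's
# KB description; a small feed-forward layer scores each (mention, candidate) pair.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

class MentionEntityRanker(nn.Module):
    def __init__(self, model_name: str = "bert-base-multilingual-cased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Scores the concatenation of mention and candidate representations.
        self.scorer = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def encode(self, texts, tokenizer, device):
        batch = tokenizer(texts, padding=True, truncation=True,
                          return_tensors="pt").to(device)
        # Use the [CLS] vector as a fixed-size representation (a simple, common choice).
        return self.encoder(**batch).last_hidden_state[:, 0]

    def forward(self, mention_vec, candidate_vecs):
        # mention_vec: (hidden,); candidate_vecs: (num_candidates, hidden)
        expanded = mention_vec.unsqueeze(0).expand_as(candidate_vecs)
        return self.scorer(torch.cat([expanded, candidate_vecs], dim=-1)).squeeze(-1)

if __name__ == "__main__":
    device = "cuda" if torch.cuda.is_available() else "cpu"
    tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
    ranker = MentionEntityRanker().to(device).eval()

    # A mention in context (here Spanish) ranked against English KB descriptions,
    # illustrating the cross-language setting.
    mention = "El presidente [Obama] visitó Madrid en 2016."
    candidates = [
        "Barack Obama, 44th President of the United States.",
        "Obama, a city in Fukui Prefecture, Japan.",
    ]
    with torch.no_grad():
        m = ranker.encode([mention], tokenizer, device)[0]
        c = ranker.encode(candidates, tokenizer, device)
        scores = ranker(m, c)
    # With untrained scorer weights the scores are meaningless; the ranker
    # would need to be fine-tuned on entity-linking data.
    print(scores.tolist())
```

In a zero-shot language-transfer setting of the kind the abstract evaluates, such a ranker would be trained on mentions from one set of languages and applied, unchanged, to mentions in a language unseen during training.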

Original language: English (US)
Title of host publication: Findings of the Association for Computational Linguistics
Subtitle of host publication: ACL-IJCNLP 2021
Editors: Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Publisher: Association for Computational Linguistics (ACL)
Pages: 583-595
Number of pages: 13
ISBN (Electronic): 9781954085541
DOIs
State: Published - 2021
Event: Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 - Virtual, Online
Duration: Aug 1 2021 – Aug 6 2021

Publication series

Name: Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021

Conference

Conference: Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021
City: Virtual, Online
Period: 8/1/21 – 8/6/21

ASJC Scopus subject areas

  • Language and Linguistics
  • Linguistics and Language
