Computer Science > Computation and Language
Title: No Train but Gain: Language Arithmetic for training-free Language Adapters enhancement
(Submitted on 24 Apr 2024)
Abstract: Modular deep learning is the state-of-the-art solution for lifting the curse of multilinguality, mitigating negative interference and enabling cross-lingual transfer in Multilingual Pre-trained Language Models. However, a trade-off of this approach is the reduction in positive transfer from closely related languages. In response, we introduce a novel method called language arithmetic, which enables training-free post-processing to address this limitation. Inspired by the task arithmetic framework, we apply learning via addition to the language adapters, transitioning the framework from a multi-task to a multilingual setup. The effectiveness of the proposed solution is demonstrated on three downstream tasks in a MAD-X-based set of cross-lingual schemes, acting as a post-processing procedure. Language arithmetic consistently improves the baselines, with significant gains in the most challenging cases of zero-shot and low-resource applications. Our code and models are available at this https URL .
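The core idea — "learning via addition" applied to language adapters — can be sketched as a weighted element-wise sum of adapter parameters. The following is a minimal illustration, not the paper's implementation: the function name, the flat-dict adapter representation, and the mixing weight `lam` are all assumptions made for clarity.

```python
# Illustrative sketch of language arithmetic (learning via addition) on
# language adapters. The adapter format (dicts of float lists) and the
# mixing weight `lam` are hypothetical, not taken from the paper's code.

def language_arithmetic(target, related, lam=0.5):
    """Add a related language's adapter parameters to a target adapter.

    target, related: dicts mapping parameter names to lists of floats,
        standing in for the adapter weight tensors of two languages.
    lam: interpolation weight for the related-language contribution.
    """
    return {
        name: [t + lam * r for t, r in zip(target[name], related[name])]
        for name in target
    }

# Toy example: one parameter tensor per language adapter.
target_adapter = {"down_proj": [0.2, -0.1]}   # e.g. a low-resource language
related_adapter = {"down_proj": [0.4, 0.6]}   # e.g. a closely related language
merged = language_arithmetic(target_adapter, related_adapter, lam=0.5)
```

Because the combination is a simple parameter-space addition, it requires no gradient updates, which is what makes the post-processing training-free.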
Submission history
From: Mateusz Klimaszewski
[v1] Wed, 24 Apr 2024 08:52:40 GMT