Computer Science > Computation and Language
Title: ADELT: Transpilation Between Deep Learning Frameworks
(Submitted on 7 Mar 2023 (v1), last revised 22 Apr 2024 (this version, v2))
Abstract: We propose the Adversarial DEep Learning Transpiler (ADELT), a novel approach to source-to-source transpilation between deep learning frameworks. ADELT uniquely decouples code skeleton transpilation and API keyword mapping. For code skeleton transpilation, it uses few-shot prompting on large language models (LLMs), while for API keyword mapping, it uses contextual embeddings from a code-specific BERT. These embeddings are trained in a domain-adversarial setup to generate a keyword translation dictionary. ADELT is trained on an unlabeled web-crawled deep learning corpus, without relying on any hand-crafted rules or parallel data. It outperforms state-of-the-art transpilers, improving pass@1 rate by 17.4 pts and 15.0 pts for PyTorch-Keras and PyTorch-MXNet transpilation pairs respectively. We provide open access to our code at this https URL
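The API keyword mapping step described above can be illustrated with a minimal sketch. The embedding vectors and keyword names below are hypothetical stand-ins; in ADELT they come from a code-specific BERT trained in a domain-adversarial setup, and the translation dictionary is built by matching keywords across frameworks in the shared embedding space:

```python
import math

# Hypothetical contextual embeddings for API keywords (toy values;
# ADELT derives these from a domain-adversarially trained code BERT).
pytorch_keywords = {
    "nn.Linear": [0.9, 0.1, 0.2],
    "nn.ReLU":   [0.1, 0.8, 0.3],
}
keras_keywords = {
    "layers.Dense":     [0.88, 0.12, 0.25],
    "activations.relu": [0.15, 0.79, 0.28],
}

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Build a keyword translation dictionary by nearest-neighbor matching
# in the shared embedding space.
dictionary = {
    src: max(keras_keywords, key=lambda tgt: cosine(vec, keras_keywords[tgt]))
    for src, vec in pytorch_keywords.items()
}
print(dictionary)
# → {'nn.Linear': 'layers.Dense', 'nn.ReLU': 'activations.relu'}
```

This only captures the final dictionary-construction step; the paper's contribution lies in learning embeddings whose geometry aligns across frameworks without parallel data.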
Submission history
From: Linyuan Gong
[v1] Tue, 7 Mar 2023 01:57:10 GMT (234kb,D)
[v2] Mon, 22 Apr 2024 18:18:15 GMT (289kb,D)