Computer Science > Computation and Language

Title: ALMol: Aligned Language-Molecule Translation LLMs through Offline Preference Contrastive Optimisation

Abstract: The intersection of chemistry and Artificial Intelligence (AI) is an active area of research that aims to accelerate scientific discovery. The integration of large language models (LLMs) with scientific modalities has shown significant promise in this endeavour. However, challenges persist in training efficacy and the out-of-distribution problem, particularly as existing approaches rely on ever larger models and datasets. In this context, we focus on machine language-molecule translation and deploy a novel training approach called contrastive preference optimisation, which discourages translations that are merely adequate rather than the best available. To ensure generalisability and mitigate memorisation effects, we conduct experiments using only 10\% of the data. Our results demonstrate that our models achieve up to a 32\% improvement compared to counterpart models. We also introduce a scalable, fine-grained evaluation methodology that accommodates responsible assessment.
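The contrastive preference objective described above can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes the common CPO-style formulation (a sigmoid preference term over paired sequence log-probabilities plus a negative log-likelihood regulariser on the preferred output), and the function name, β value, and weighting are illustrative only.

```python
import math

def cpo_loss(logp_preferred, logp_dispreferred, beta=0.1, nll_weight=1.0):
    """Sketch of a contrastive preference optimisation objective.

    logp_preferred / logp_dispreferred are the model's total log-probabilities
    for the preferred and dispreferred translations of the same input.
    The first term pushes the preferred translation's likelihood above the
    dispreferred one's; the NLL term keeps the model anchored to the
    preferred output so it does not merely widen the gap by degrading both.
    """
    margin = beta * (logp_preferred - logp_dispreferred)
    pref_term = -math.log(1.0 / (1.0 + math.exp(-margin)))  # -log sigmoid(margin)
    nll_term = -logp_preferred
    return pref_term + nll_weight * nll_term
```

In practice the two log-probabilities would come from summing token log-probs of an LLM over each candidate translation; the loss is minimised when the preferred sequence is both likely in absolute terms and clearly preferred over the alternative.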
Subjects: Computation and Language (cs.CL); Multimedia (cs.MM)
Cite as: arXiv:2405.08619 [cs.CL]
  (or arXiv:2405.08619v2 [cs.CL] for this version)

Submission history

From: Dimitris Gkoumas [view email]
[v1] Tue, 14 May 2024 13:59:24 GMT (1087kb,D)
[v2] Wed, 15 May 2024 09:08:40 GMT (1087kb,D)
