Condensed Matter > Materials Science

Title: AtomGPT: Atomistic Generative Pre-trained Transformer for Forward and Inverse Materials Design

Abstract: Large language models (LLMs) such as generative pre-trained transformers (GPTs) have shown potential for various commercial applications, but their applicability to materials design remains underexplored. In this article, we introduce AtomGPT, a transformer-based model developed specifically for materials design, and demonstrate its capability for both atomistic property prediction and structure generation. We show that a combination of chemical and structural text descriptions can efficiently predict material properties with accuracy comparable to graph neural network models, including formation energies, electronic bandgaps obtained with two different methods, and superconducting transition temperatures. Furthermore, we demonstrate that AtomGPT can generate atomic structures for tasks such as designing new superconductors, with the predictions validated through density functional theory calculations. This work paves the way for leveraging LLMs in forward and inverse materials design, offering an efficient approach to the discovery and optimization of materials.
Subjects: Materials Science (cond-mat.mtrl-sci)
Cite as: arXiv:2405.03680 [cond-mat.mtrl-sci]
  (or arXiv:2405.03680v1 [cond-mat.mtrl-sci] for this version)

Submission history

From: Kamal Choudhary [view email]
[v1] Mon, 6 May 2024 17:54:54 GMT (361kb,D)
