
Title: Graph Diffusion Transformer for Multi-Conditional Molecular Generation

Abstract: Inverse molecular design with diffusion models holds great potential for advances in materials and drug discovery. Despite success in unconditional molecule generation, integrating multiple properties, such as synthetic score and gas permeability, into diffusion models as conditional constraints remains unexplored. We present the Graph Diffusion Transformer (Graph DiT) for multi-conditional molecular generation. Graph DiT uses a condition encoder to learn representations of numerical and categorical properties and a Transformer-based graph denoiser to denoise molecular graphs under those conditions. Unlike previous graph diffusion models, which add noise to atoms and bonds separately during the forward diffusion process, we propose a graph-dependent noise model for training Graph DiT, designed to accurately estimate graph-related noise in molecules. We extensively validate Graph DiT on multi-conditional polymer and small-molecule generation. The results demonstrate its superiority across metrics, from distribution learning to condition control of molecular properties. A polymer inverse-design task for gas separation, evaluated with feedback from domain experts, further demonstrates its practical utility.
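The full method is in the linked paper; as a rough illustration of the multi-conditional mechanism the abstract describes, the sketch below shows one common way to wire a condition encoder (mixing numerical and categorical properties) into a Transformer denoiser block via adaptive layer norm (adaLN), the conditioning style associated with DiT. All class names, dimensions, and the adaLN choice are assumptions for illustration, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class ConditionEncoder(nn.Module):
    """Embeds numerical and categorical property conditions into one vector.

    Hypothetical sketch: the real Graph DiT condition encoder may encode
    numerical properties differently (e.g., via discretization or learned bins).
    """
    def __init__(self, num_numerical, categorical_sizes, hidden_dim):
        super().__init__()
        # One linear map over all numerical properties (e.g., gas permeability).
        self.num_proj = nn.Linear(num_numerical, hidden_dim)
        # One embedding table per categorical property (e.g., a synthesizability class).
        self.cat_embeds = nn.ModuleList(
            nn.Embedding(n, hidden_dim) for n in categorical_sizes
        )

    def forward(self, numerical, categorical):
        # numerical: (B, num_numerical) floats; categorical: (B, K) integer ids
        c = self.num_proj(numerical)
        for i, emb in enumerate(self.cat_embeds):
            c = c + emb(categorical[:, i])
        return c

class AdaLNBlock(nn.Module):
    """Transformer block whose layer norms are scaled/shifted/gated by the
    condition vector, in the style of DiT's adaptive layer norm (adaLN)."""
    def __init__(self, hidden_dim, n_heads):
        super().__init__()
        self.norm1 = nn.LayerNorm(hidden_dim, elementwise_affine=False)
        self.attn = nn.MultiheadAttention(hidden_dim, n_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(hidden_dim, elementwise_affine=False)
        self.mlp = nn.Sequential(
            nn.Linear(hidden_dim, 4 * hidden_dim), nn.GELU(),
            nn.Linear(4 * hidden_dim, hidden_dim),
        )
        # Produces per-condition scale/shift/gate parameters for both sublayers.
        self.modulation = nn.Linear(hidden_dim, 6 * hidden_dim)

    def forward(self, x, cond):
        # x: (B, N, H) token features for N atoms; cond: (B, H)
        s1, b1, g1, s2, b2, g2 = self.modulation(cond).chunk(6, dim=-1)
        h = self.norm1(x) * (1 + s1.unsqueeze(1)) + b1.unsqueeze(1)
        x = x + g1.unsqueeze(1) * self.attn(h, h, h)[0]
        h = self.norm2(x) * (1 + s2.unsqueeze(1)) + b2.unsqueeze(1)
        x = x + g2.unsqueeze(1) * self.mlp(h)
        return x

# Toy usage: one numerical and one 3-way categorical condition over 16 atoms.
enc = ConditionEncoder(num_numerical=1, categorical_sizes=[3], hidden_dim=64)
block = AdaLNBlock(hidden_dim=64, n_heads=4)
atoms = torch.randn(2, 16, 64)                                  # (batch, atoms, features)
cond = enc(torch.randn(2, 1), torch.randint(0, 3, (2, 1)))      # (batch, hidden)
out = block(atoms, cond)                                        # (2, 16, 64)
```

Note the paper's stated novelty is the graph-dependent noise model in the forward diffusion process, which is not shown here; the sketch only covers the generic condition-injection pattern.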
Comments: 21 pages, 9 figures, 7 tables
Subjects: Machine Learning (cs.LG); Biomolecules (q-bio.BM)
Cite as: arXiv:2401.13858 [cs.LG]
  (or arXiv:2401.13858v2 [cs.LG] for this version)

Submission history

From: Gang Liu
[v1] Wed, 24 Jan 2024 23:45:31 GMT (10235kb,D)
[v2] Tue, 7 May 2024 01:51:26 GMT (10355kb,D)
