Computer Science > Computation and Language
Title: Transformers in the Service of Description Logic-based Contexts
(Submitted on 15 Nov 2023 (v1), last revised 26 Apr 2024 (this version, v3))
Abstract: Recent advancements in transformer-based models have spurred research interest in their ability to learn to perform reasoning tasks. However, most of the contexts used for this purpose are in practice very simple: generated from short (fragments of) first-order logic sentences with only a few logical operators and quantifiers. In this work, we construct a natural language dataset, DELTA$_D$, using the description logic language $\mathcal{ALCQ}$. DELTA$_D$ contains 384K examples and varies along two dimensions: i) reasoning depth, and ii) linguistic complexity. In this way, we systematically investigate the reasoning ability of a supervised fine-tuned DeBERTa-based model and of two large language models (GPT-3.5, GPT-4) with few-shot prompting. Our results demonstrate that the DeBERTa-based model can master the reasoning task and that the performance of the GPT models improves significantly even when only a small number of in-context samples is provided (9 shots). We open-source our code and datasets.
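For illustration only, the sketch below shows what a DELTA$_D$-style example might look like: description-logic axioms verbalized into English, paired with a question, an entailment label, and a reasoning depth. The data layout, the verbalization scheme, and all names (Example, the toy concepts) are hypothetical guesses based on the abstract, not the authors' released generator.

    # Hypothetical sketch of a DELTA_D-style example (not the paper's code).
    # ALCQ supports qualified number restrictions, which license statements
    # such as ">= 2 hasChild.Doctor" ("has at least two children who are doctors").
    from dataclasses import dataclass

    @dataclass
    class Example:
        context: str   # verbalized ALCQ axioms (the "theory")
        question: str  # a statement whose entailment is to be judged
        label: str     # e.g. "True", "False", or "Unknown"
        depth: int     # reasoning depth: number of inference steps needed

    # A toy ALCQ-flavored theory, verbalized into natural language.
    axioms = [
        "All doctors are educated people.",                   # Doctor subsumed by Educated
        "Alice has at least two children who are doctors.",   # >= 2 hasChild.Doctor for Alice
    ]

    example = Example(
        context=" ".join(axioms),
        question="Alice has at least two children who are educated people.",
        label="True",  # follows in one step by combining the two axioms
        depth=1,
    )

    print(example.context)
    print("Q:", example.question, "->", example.label)

Deeper examples would chain more axioms (larger depth), and linguistic complexity could be varied by rephrasing the same axioms with more elaborate sentence structures, which is how the abstract describes the dataset's two dimensions.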
Submission history
From: Angelos Poulis
[v1] Wed, 15 Nov 2023 13:23:24 GMT (880kb,D)
[v2] Mon, 26 Feb 2024 08:40:13 GMT (880kb,D)
[v3] Fri, 26 Apr 2024 16:32:02 GMT (636kb,D)