Computer Science > Computation and Language

Title: The Impact of Symbolic Representations on In-context Learning for Few-shot Reasoning

Abstract: Pre-trained language models (LMs) have shown remarkable reasoning performance when prompted with explanations (or "chain-of-thought", CoT) for in-context learning. On the other hand, these reasoning tasks are usually presumed to be more approachable with symbolic programming. To make progress towards understanding in-context learning, we curate synthetic datasets containing equivalent (natural, symbolic) data pairs, where the symbolic examples contain first-order logic rules and predicates from knowledge bases (KBs). We then revisit neuro-symbolic approaches and use Language Models as Logic Programmer (LMLP), which learns from demonstrations containing logic rules and corresponding examples to iteratively reason over KBs, recovering Prolog's backward chaining algorithm. Comprehensive experiments systematically compare LMLP with CoT in deductive reasoning settings, showing that LMLP achieves more than 25% higher accuracy than CoT on length-generalization benchmarks, even with fewer parameters.
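
For readers unfamiliar with the procedure the abstract refers to, below is a minimal Python sketch of classical Prolog-style backward chaining over a small knowledge base. It is not the paper's LMLP implementation: it handles ground atoms only (no variable unification), and the predicate and constant names are purely illustrative.

```python
# Minimal backward chaining over ground atoms (illustrative only, not the paper's code).
# A fact is a tuple like ("parent", "alice", "bob"); a rule is (head, [body goals]).

def backward_chain(goal, kb_facts, kb_rules, depth=0, max_depth=10):
    """Try to prove `goal` from facts and Horn-style rules (head, body)."""
    if depth > max_depth:
        return False                          # guard against unbounded recursion
    if goal in kb_facts:                      # base case: goal is a known fact
        return True
    for head, body in kb_rules:               # rule: head holds if every body goal holds
        if head == goal:
            if all(backward_chain(sub, kb_facts, kb_rules, depth + 1, max_depth)
                   for sub in body):
                return True
    return False


# Illustrative KB: grandparent(alice, carol) follows from two parent facts.
facts = {("parent", "alice", "bob"), ("parent", "bob", "carol")}
rules = [
    (("grandparent", "alice", "carol"),
     [("parent", "alice", "bob"), ("parent", "bob", "carol")]),
]

print(backward_chain(("grandparent", "alice", "carol"), facts, rules))  # True
```

A full Prolog engine would additionally unify variables in rule heads and bodies; the sketch fixes ground rules up front to keep the recursion easy to follow.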
Comments: NeurIPS Neuro Causal and Symbolic AI Workshop, 2022
Subjects: Computation and Language (cs.CL)
Cite as: arXiv:2212.08686 [cs.CL]
  (or arXiv:2212.08686v1 [cs.CL] for this version)

Submission history

From: Hanlin Zhang [view email]
[v1] Fri, 16 Dec 2022 19:30:01 GMT (239kb,D)
[v2] Thu, 28 Mar 2024 08:20:12 GMT (255kb,D)
