Condensed Matter > Disordered Systems and Neural Networks
Title: Robustness of the Random Language Model
(Submitted on 26 Sep 2023 (v1), last revised 22 Mar 2024 (this version, v2))
Abstract: The Random Language Model (De Giuli 2019) is an ensemble of stochastic context-free grammars, quantifying the syntax of human and computer languages. The model suggests a simple picture of first language learning as a type of annealing in the vast space of potential languages. In its simplest formulation, it implies a single continuous transition to grammatical syntax, at which the symmetry among potential words and categories is spontaneously broken. Here this picture is scrutinized by considering its robustness against extensions of the original model, and trajectories through parameter space different from those originally considered. It is shown that (i) the scenario is robust to explicit symmetry breaking, an inevitable component of learning in the real world; and (ii) the transition to grammatical syntax can be encountered by fixing the deep (hidden) structure while varying the surface (observable) properties. It is also argued that the transition becomes a sharp thermodynamic transition in an idealized limit. Moreover, comparison with human data on the clustering coefficient of syntax networks suggests that the observed transition is equivalent to that normally experienced by children at age 24 months. The results are discussed in light of the theory of first-language acquisition in linguistics, and recent successes in machine learning.
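
To make the objects in the abstract concrete, the following Python sketch samples a random stochastic context-free grammar in Chomsky normal form, generates sentences from it, and computes the average clustering coefficient of the resulting word-adjacency syntax network. This is a minimal illustration, not the paper's construction: the parameter names (N_HIDDEN, N_SURFACE, DEEP_TEMP), the power-of-uniform weight distribution used as a stand-in for the deep-temperature control, and the halting rule are all assumptions; De Giuli (2019) defines the ensemble precisely. It uses the networkx library for the clustering coefficient.

import random
import networkx as nx

# Illustrative parameters (not from the paper).
N_HIDDEN = 8      # number of hidden categories (nonterminals)
N_SURFACE = 20    # number of surface symbols ("words", terminals)
DEEP_TEMP = 0.3   # smaller -> rule weights concentrate on few rules
SENTENCES = 200   # size of the generated corpus
MAX_DEPTH = 12    # recursion cap so derivations always terminate

def sample_weights(n, temp):
    """Draw n normalized positive weights; low temp concentrates mass."""
    w = [random.random() ** (1.0 / temp) for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

def sample_grammar():
    """For each nonterminal A, weights over binary rules A -> B C and
    lexical rules A -> a (Chomsky normal form)."""
    binary = {a: sample_weights(N_HIDDEN * N_HIDDEN, DEEP_TEMP)
              for a in range(N_HIDDEN)}
    lexical = {a: sample_weights(N_SURFACE, DEEP_TEMP)
               for a in range(N_HIDDEN)}
    return binary, lexical

def expand(a, binary, lexical, depth):
    """Expand nonterminal a into a list of terminals (one sentence)."""
    if depth >= MAX_DEPTH or random.random() < 0.5:
        # Emit a terminal according to the lexical weights of a.
        (word,) = random.choices(range(N_SURFACE), weights=lexical[a])
        return [word]
    # Otherwise apply a binary rule A -> B C chosen by its weight.
    (idx,) = random.choices(range(N_HIDDEN * N_HIDDEN), weights=binary[a])
    b, c = divmod(idx, N_HIDDEN)
    return (expand(b, binary, lexical, depth + 1)
            + expand(c, binary, lexical, depth + 1))

def syntax_network(corpus):
    """Word-adjacency network: nodes are word types; edges join words
    that occur next to each other in some sentence."""
    g = nx.Graph()
    for s in corpus:
        g.add_nodes_from(s)
        g.add_edges_from((u, v) for u, v in zip(s, s[1:]) if u != v)
    return g

if __name__ == "__main__":
    random.seed(0)
    binary, lexical = sample_grammar()
    # Nonterminal 0 plays the role of the start symbol here (a choice
    # made for this sketch).
    corpus = [expand(0, binary, lexical, 0) for _ in range(SENTENCES)]
    g = syntax_network(corpus)
    print("average clustering coefficient:", nx.average_clustering(g))

Sweeping DEEP_TEMP in a sketch like this is one way to see qualitatively how a network statistic such as the clustering coefficient can change as the grammar moves between disordered and ordered regimes, which is the kind of observable the abstract compares against child-language data.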
Submission history
From: Eric De Giuli
[v1] Tue, 26 Sep 2023 13:14:35 GMT (469kb,D)
[v2] Fri, 22 Mar 2024 15:39:24 GMT (545kb,D)