arXiv:2309.14913 [cond-mat.dis-nn]

Robustness of the Random Language Model

Fatemeh Lalegani, Eric De Giuli

Published 2023-09-26 (Version 1)

The Random Language Model (De Giuli 2019) is an ensemble of stochastic context-free grammars, quantifying the syntax of human and computer languages. The model suggests a simple picture of first language learning as a type of annealing in the vast space of potential languages. In its simplest formulation, it implies a single continuous transition to grammatical syntax, at which the symmetry among potential words and categories is spontaneously broken. Here this picture is scrutinized by considering its robustness against explicit symmetry breaking, an inevitable component of learning in the real world. It is shown that the scenario is robust to such symmetry breaking. Comparison with human data on the clustering coefficient of syntax networks suggests that the observed transition is equivalent to that normally experienced by children at age 24 months.
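For orientation, the following is a minimal sketch (in Python) of the kind of object the model describes: a context-free grammar in Chomsky normal form with randomly drawn production weights, used to generate short strings. The parameter names and values, the log-normal weight choice, the emission probability, and the depth cap are illustrative assumptions, not the construction used in the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative sizes and disorder strength (assumed, not from the paper):
    # N hidden symbols, T terminal symbols; sigma controls how far the rule
    # weights deviate from a fully symmetric (uniform) grammar.
    N, T, sigma = 10, 10, 3.0

    # Chomsky-normal-form rules: hidden -> hidden hidden ("deep") and
    # hidden -> terminal ("surface"). Weights are i.i.d. log-normal and
    # normalized per left-hand symbol.
    deep = np.exp(sigma * rng.normal(size=(N, N, N)))
    deep /= deep.sum(axis=(1, 2), keepdims=True)
    surface = np.exp(sigma * rng.normal(size=(N, T)))
    surface /= surface.sum(axis=1, keepdims=True)

    def expand(symbol, depth, max_depth=6, emit_prob=0.3):
        """Expand one hidden symbol into a list of terminal indices."""
        if depth >= max_depth or rng.random() < emit_prob:
            # Emit a terminal according to this symbol's surface weights.
            return [int(rng.choice(T, p=surface[symbol]))]
        # Otherwise branch into two hidden symbols according to the deep weights.
        idx = int(rng.choice(N * N, p=deep[symbol].ravel()))
        left, right = divmod(idx, N)
        return (expand(left, depth + 1, max_depth, emit_prob)
                + expand(right, depth + 1, max_depth, emit_prob))

    # A few generated "sentences" (sequences of terminal indices) from root symbol 0.
    for _ in range(3):
        print(expand(0, 0))

Increasing sigma concentrates weight on a few productions and reduces the symmetry among rules; the paper's question is how robust the resulting transition to grammatical syntax is when such symmetry breaking is imposed explicitly rather than emerging spontaneously.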

Related articles:
arXiv:1809.01201 [cond-mat.dis-nn] (Published 2018-09-04)
Random Language Model: a path to principled complexity
arXiv:1902.07516 [cond-mat.dis-nn] (Published 2019-02-20)
Emergence of order in random languages
arXiv:2505.06902 [cond-mat.dis-nn] (Published 2025-05-11)
Neuromodulation via Krotov-Hopfield Improves Accuracy and Robustness of RBMs