
[2406.04289] What Languages are Easy to Language-Model? A Perspective from Learning Probabilistic Regular Languages


View a PDF of the paper titled What Languages are Easy to Language-Model? A Perspective from Learning Probabilistic Regular Languages, by Nadav Borenstein and 7 other authors

Abstract: What can large language models learn? By definition, language models (LMs) are distributions over strings, so an intuitive way of addressing this question is to formalize it as a matter of the learnability of classes of distributions over strings. While prior work in this direction focused on assessing theoretical limits, we instead seek to understand empirical learnability. Unlike prior empirical work, we evaluate neural LMs on their home turf, learning probabilistic languages, rather than as classifiers of formal languages. In particular, we study the learnability of regular LMs (RLMs) by RNN and Transformer LMs. We empirically test the learnability of RLMs as a function of various complexity parameters of the RLM and the hidden state size of the neural LM. We find that the RLM rank, which corresponds to the size of the linear space spanned by the logits of its conditional distributions, and the expected length of sampled strings are strong and significant predictors of learnability for both RNNs and Transformers. Several other predictors also reach significance, but with differing patterns between RNNs and Transformers.
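To make the rank notion concrete, here is a minimal sketch (not the paper's implementation) of how one might measure it for a toy regular LM: treat each state of a probabilistic finite-state automaton as inducing a conditional next-symbol distribution, stack the logits of those distributions into a matrix, and compute the matrix rank, i.e. the dimension of the linear space the logits span. All numbers below are illustrative assumptions, not values from the paper.

import numpy as np

# Toy probabilistic finite-state automaton over the alphabet {a, b, EOS}.
# Each row is the conditional next-symbol distribution p(y | state);
# the probabilities are made up for illustration, and each row sums to 1.
probs = np.array([
    [0.7, 0.2, 0.1],  # state 0
    [0.1, 0.6, 0.3],  # state 1
    [0.4, 0.4, 0.2],  # state 2
])

# Stack the logits of the conditional distributions and measure the
# dimension of the linear space they span, the quantity the abstract
# calls the RLM rank.
logits = np.log(probs)
rank = np.linalg.matrix_rank(logits)
print(f"RLM logit-matrix rank: {rank}")

A higher rank means the automaton's conditional distributions cannot be embedded in a low-dimensional logit space, which is one intuition for why, per the abstract, rank predicts how hard the RLM is for a neural LM to learn.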

Submission history

From: Nadav Borenstein
[v1] Thu, 6 Jun 2024 17:34:24 UTC (8,395 KB)
[v2] Fri, 7 Jun 2024 08:30:02 UTC (8,366 KB)
[v3] Mon, 10 Jun 2024 21:53:32 UTC (8,366 KB)
[v4] Thu, 21 Nov 2024 14:27:01 UTC (8,366 KB)
