
Scaling Data-Constrained Language Models

Authors
Niklas Muennighoff, Alexander M. Rush, Boaz Barak, Teven Le Scao, Aleksandra Piktus, Nouamane Tazi, Sampo Pyysalo, Thomas Wolf, Colin Raffel
Research Topics
Machine Learning
Paper Information
  • Journal:
    Journal of Machine Learning Research
Abstract

The current trend of scaling language models involves increasing both parameter count and training data set size. Extrapolating this trend suggests that training data set size may soon be limited by the amount of text data available on the internet. Motivated by this limit, we investigate scaling language models in data-constrained regimes. Specifically, we run a large set of experiments varying the extent of data repetition and compute budget, ranging up to 900 billion training tokens and 9 billion parameter models. We find that with constrained data for a fixed compute budget, training with up to 4 epochs of repeated data yields negligible changes to loss compared to having unique data. However, with more repetition, the value of adding compute eventually decays to zero. We propose and empirically validate a scaling law for compute optimality that accounts for the decreasing value of repeated tokens and excess parameters. Finally, we experiment with approaches mitigating data scarcity, including augmenting the training data set with code data or removing commonly used filters. Models and data sets from our 400 training runs are freely available at https://github.com/huggingface/datablations.
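The scaling law described in the abstract treats repeated tokens as progressively less valuable than fresh ones. As a rough illustration of that idea, the Python sketch below plugs an exponentially saturating "effective data" term into a Chinchilla-style loss. The parametric form mirrors the abstract's description, but every constant here (E, A, B, ALPHA, BETA, R_D_STAR) is an illustrative placeholder rather than one of the paper's fitted values, and the paper's analogous treatment of excess parameters is omitted for brevity.

import math

# Illustrative placeholders only -- NOT the paper's fitted constants.
E, A, B = 1.8, 500.0, 1400.0   # irreducible loss and scale terms
ALPHA, BETA = 0.35, 0.35       # parameter and data exponents
R_D_STAR = 15.0                # decay constant for the value of repeated tokens

def effective_data(unique_tokens: float, repetitions: float) -> float:
    """Effective token count: each extra epoch of repeated data adds less value.

    D' = U_D + U_D * R_D* * (1 - exp(-R_D / R_D*)),
    so D' saturates at U_D * (1 + R_D*) as repetitions grow without bound.
    """
    return unique_tokens + unique_tokens * R_D_STAR * (
        1 - math.exp(-repetitions / R_D_STAR)
    )

def loss(params: float, unique_tokens: float, repetitions: float) -> float:
    """Chinchilla-style loss with the effective-data substitution."""
    d_eff = effective_data(unique_tokens, repetitions)
    return E + A / params**ALPHA + B / d_eff**BETA

# Marginal value of repetition decays toward zero as repetitions grow.
for r in [0, 4, 16, 64]:
    print(f"{r:>3} repetitions -> loss {loss(9e9, 100e9, r):.4f}")

Running the loop shows loss dropping for the first few epochs of repetition and then flattening, consistent with the abstract's observation that a handful of epochs of repeated data is nearly as good as unique data while further repetition adds little.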

Citation Information
APA Format
Muennighoff, N., Rush, A. M., Barak, B., Le Scao, T., Piktus, A., Tazi, N., Pyysalo, S., Wolf, T., & Raffel, C. (2025). Scaling data-constrained language models. Journal of Machine Learning Research, 26(53), 1–66. http://jmlr.org/papers/v26/24-1000.html
BibTeX Format
@article{JMLR:v26:24-1000,
  author  = {Niklas Muennighoff and Alexander M. Rush and Boaz Barak and Teven Le Scao and Aleksandra Piktus and Nouamane Tazi and Sampo Pyysalo and Thomas Wolf and Colin Raffel},
  title   = {Scaling Data-Constrained Language Models},
  journal = {Journal of Machine Learning Research},
  year    = {2025},
  volume  = {26},
  number  = {53},
  pages   = {1--66},
  url     = {http://jmlr.org/papers/v26/24-1000.html}
}