
Transformers Can Overcome the Curse of Dimensionality: A Theoretical Study from an Approximation Perspective

Authors
Yuling Jiao, Yanming Lai, Yang Wang, Bokai Yan
Paper Information
  • Journal: Journal of Machine Learning Research
  • Added to Tracker: Mar 03, 2026
Abstract

The Transformer model is widely used across application areas of machine learning, such as natural language processing. This paper investigates the approximation of the Hölder continuous function class $\mathcal{H}_{Q}^{\beta}\left([0,1]^{d\times n},\mathbb{R}^{d\times n}\right)$ by Transformers and constructs several Transformers that can overcome the curse of dimensionality. Each of these Transformers consists of a single self-attention layer with one head and the softmax function as its activation, followed by several feedforward layers. For example, to achieve an approximation accuracy of $\epsilon$, if the activation functions of the feedforward layers are ReLU and floor, only $\mathcal{O}\left(\log\frac{1}{\epsilon}\right)$ feedforward layers are needed, with widths not exceeding $\mathcal{O}\left(\frac{1}{\epsilon^{2/\beta}}\log\frac{1}{\epsilon}\right)$. If other activation functions are allowed in the feedforward layers, their width can be further reduced to a constant. These results demonstrate that Transformers have strong expressive capability. The construction in this paper is based on the Kolmogorov-Arnold Superposition Theorem and does not require the concept of contextual mapping; hence our proof is more intuitive than previous work on Transformer approximation. Additionally, the translation technique proposed in this paper helps transfer existing approximation results for feedforward neural networks to the study of Transformers.
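For readers unfamiliar with the reference, the Kolmogorov-Arnold Superposition Theorem states that every continuous $f\colon[0,1]^{d}\to\mathbb{R}$ can be written as $f(x_{1},\ldots,x_{d})=\sum_{q=0}^{2d}\Phi_{q}\left(\sum_{p=1}^{d}\phi_{q,p}(x_{p})\right)$ for suitable continuous univariate functions $\Phi_{q}$ and $\phi_{q,p}$. As for the architecture itself, the family described in the abstract (one single-head softmax self-attention layer followed by a stack of ReLU feedforward layers) can be sketched in a few lines of NumPy. This is an illustrative sketch only: the $(n, d)$ token layout, all weight shapes, and the helper names are assumptions made for exposition, not the paper's construction, and the floor-activation variant is omitted.

import numpy as np

def softmax(z, axis=-1):
    # Row-wise softmax with max-subtraction for numerical stability.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def single_head_attention(X, Wq, Wk, Wv):
    # One self-attention layer with one head. X has shape (n, d):
    # n tokens with embedding dimension d (the paper arranges inputs
    # as d x n matrices; the transpose is immaterial for this sketch).
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(K.shape[-1]))  # (n, n) attention weights
    return A @ V

def relu_ffn(X, W1, b1, W2, b2):
    # A single feedforward layer with ReLU activation; stacking
    # O(log(1/eps)) of these mirrors the depth bound in the abstract.
    return np.maximum(X @ W1 + b1, 0.0) @ W2 + b2

# Toy forward pass on a random input in [0,1]^{n x d} (hypothetical sizes).
rng = np.random.default_rng(0)
n, d, w = 4, 3, 8  # sequence length, embedding dimension, feedforward width
X = rng.uniform(0.0, 1.0, size=(n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
W1, b1 = rng.standard_normal((d, w)), np.zeros(w)
W2, b2 = rng.standard_normal((w, d)), np.zeros(d)
Y = relu_ffn(single_head_attention(X, Wq, Wk, Wv), W1, b1, W2, b2)
print(Y.shape)  # (4, 3)

Note that the constructions in the paper use the attention layer exactly once; the $\mathcal{O}\left(\log\frac{1}{\epsilon}\right)$ depth and $\mathcal{O}\left(\frac{1}{\epsilon^{2/\beta}}\log\frac{1}{\epsilon}\right)$ width bounds quoted above refer to the feedforward stack.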

Citation Information
APA Format
Jiao, Y., Lai, Y., Wang, Y., & Yan, B. Transformers Can Overcome the Curse of Dimensionality: A Theoretical Study from an Approximation Perspective. Journal of Machine Learning Research.
BibTeX Format
@article{paper964,
  title = {Transformers Can Overcome the Curse of Dimensionality: A Theoretical Study from an Approximation Perspective},
  author = {Yuling Jiao and Yanming Lai and Yang Wang and Bokai Yan},
  journal = {Journal of Machine Learning Research},
  url = {https://www.jmlr.org/papers/v27/25-1214.html}
}