Nonlocal Techniques for the Analysis of Deep ReLU Neural Network Approximations

Authors
Cornelia Schneider, Mario Ullrich, Jan Vybíral
Research Topics
Machine Learning
Paper Information
  • Journal:
    Journal of Machine Learning Research
Abstract

In recent work concerned with the approximation and expressive powers of deep neural networks, Daubechies, DeVore, Foucart, Hanin, and Petrova introduced a system of piecewise linear functions, which can be easily reproduced by artificial neural networks with the ReLU activation function, and showed that it forms a Riesz basis of $L_2([0, 1])$. Their work was subsequently generalized to the multivariate setting by Schneider and Vybíral. In the work at hand, we show that this system serves as a Riesz basis also for Sobolev spaces $W^s([0,1]^d)$ and Barron classes ${\mathbb B}^s([0,1]^d)$ with smoothness $0 < s < 1$. We apply this fact to re-prove some recent results on the approximation of functions from these classes by deep neural networks. Our proof method avoids using local approximations and also allows us to track the implicit constants as well as to show that we can avoid the curse of dimension. Moreover, we also study how well one can approximate Sobolev and Barron functions by neural networks if only function values are known.
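The paper itself is purely mathematical, but the abstract's claim that such piecewise linear functions are "easily reproduced" by ReLU networks can be illustrated concretely. The following minimal sketch (not from the paper; the weights are chosen here for illustration) shows that the univariate hat function, the basic building block of hierarchical piecewise linear systems, is represented exactly, not just approximated, by a one-hidden-layer ReLU network with three neurons.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # One-hidden-layer ReLU network realizing the hat function on [0, 1]:
    #   H(x) = 2*ReLU(x) - 4*ReLU(x - 1/2) + 2*ReLU(x - 1),
    # which rises linearly from 0 to 1 on [0, 1/2], falls back to 0 on
    # [1/2, 1], and vanishes outside [0, 1].
    w = np.array([1.0, 1.0, 1.0])      # input weights, one per hidden neuron
    b = np.array([0.0, -0.5, -1.0])    # biases place breakpoints at 0, 1/2, 1
    a = np.array([2.0, -4.0, 2.0])     # output-layer weights
    return a @ relu(np.outer(w, np.atleast_1d(x)) + b[:, None])

xs = np.linspace(0.0, 1.0, 5)
print(hat(xs))  # → [0.  0.5 1.  0.5 0. ]
```

Deeper networks are obtained by composition: feeding the hat function into itself yields the classical sawtooth functions with exponentially many breakpoints per layer, which is the mechanism behind the depth results discussed in the abstract.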

Citation Information
APA Format
Cornelia Schneider, Mario Ullrich & Jan Vybíral. Nonlocal Techniques for the Analysis of Deep ReLU Neural Network Approximations. Journal of Machine Learning Research.
BibTeX Format
@article{paper1003,
  title   = {Nonlocal Techniques for the Analysis of Deep ReLU Neural Network Approximations},
  author  = {Cornelia Schneider and Mario Ullrich and Jan Vybíral},
  journal = {Journal of Machine Learning Research},
  url     = {https://www.jmlr.org/papers/v27/25-0746.html}
}