Local Linear Recovery Guarantee of Deep Neural Networks at Overparameterization

Authors
Yaoyu Zhang, Leyang Zhang, Zhongwang Zhang, Zhiwei Bai
Research Topics
Machine Learning
Paper Information
  • Journal:
    Journal of Machine Learning Research
  • Added to Tracker:
    Jul 15, 2025
Abstract

Determining whether deep neural network (DNN) models can reliably recover target functions at overparameterization is a critical yet complex issue in the theory of deep learning. To advance understanding in this area, we introduce a concept we term “local linear recovery” (LLR), a weaker form of target function recovery that renders the problem more amenable to theoretical analysis. In the sense of LLR, we prove that functions expressible by narrower DNNs are guaranteed to be recoverable from fewer samples than model parameters. Specifically, we establish upper limits on the optimistic sample sizes, defined as the smallest sample size necessary to guarantee LLR, for functions in the function space of a given DNN. Furthermore, we prove that these upper bounds are achieved in the case of two-layer tanh neural networks. Our research lays solid groundwork for future investigations into the recovery capabilities of DNNs in overparameterized scenarios.
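
To make the abstract's key definition concrete, the following is a minimal sketch, in LaTeX, of how the optimistic sample size might be formalized, based solely on the abstract's wording; the notation $f_\theta$, $\Theta_{f^*}$, and $n_{\mathrm{opt}}$ is illustrative and not necessarily the paper's own.

% Illustrative formalization reconstructed from the abstract's wording;
% the notation below is assumed, not quoted from the paper.
\[
\Theta_{f^*} = \{\theta : f_\theta = f^*\},
\qquad
n_{\mathrm{opt}}(f^*) = \min\bigl\{\, n \in \mathbb{N} : n \text{ samples suffice to guarantee LLR of } f^* \,\bigr\}.
\]

Read this way, the abstract's headline result is that when $f^*$ is expressible by a DNN narrower than the trained model, $n_{\mathrm{opt}}(f^*)$ falls below the trained model's parameter count, and the stated upper bounds are achieved for two-layer tanh networks.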

Citation Information
APA Format
Zhang, Y., Zhang, L., Zhang, Z., & Bai, Z. (2025). Local linear recovery guarantee of deep neural networks at overparameterization. Journal of Machine Learning Research, 26(69), 1–30. http://jmlr.org/papers/v26/24-0192.html
BibTeX Format
@article{JMLR:v26:24-0192,
  author  = {Yaoyu Zhang and Leyang Zhang and Zhongwang Zhang and Zhiwei Bai},
  title   = {Local Linear Recovery Guarantee of Deep Neural Networks at Overparameterization},
  journal = {Journal of Machine Learning Research},
  year    = {2025},
  volume  = {26},
  number  = {69},
  pages   = {1--30},
  url     = {http://jmlr.org/papers/v26/24-0192.html}
}