
On Non-asymptotic Theory of Recurrent Neural Networks in Temporal Point Processes

Authors
Zhiheng Chen, Guanhua Fang, Wen Yu
Research Topics
Machine Learning, Time Series
Paper Information
  • Journal: Journal of Machine Learning Research
  • Added to Tracker: Sep 08, 2025
Abstract

Temporal point processes (TPPs) are an important tool for modeling and predicting irregularly timed events across various domains. Recently, recurrent neural network (RNN)-based TPPs have shown practical advantages over traditional parametric TPP models. However, the theoretical understanding of neural TPPs in the current literature remains nascent. In this paper, we establish excess risk bounds for RNN-TPPs under many well-known TPP settings. In particular, we show that an RNN-TPP with no more than four layers can achieve vanishing generalization error. Our technical contributions include a characterization of the complexity of the multi-layer RNN class, the construction of $\tanh$ neural networks for approximating dynamic event intensity functions, and a truncation technique for alleviating the issue of unbounded event sequences. Our results bridge the gap between TPP applications and neural network theory.
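To make the abstract's objects concrete: an RNN-TPP parameterizes the conditional intensity of a point process with a recurrent network and is fit by maximizing the log-likelihood, which combines log-intensities at event times with an integrated (compensator) term. The sketch below is a minimal toy illustration only, not the paper's construction: it assumes a single-layer tanh recurrence and an intensity held constant on each inter-event interval, so the compensator integral is closed-form; the names rnn_tpp_nll, Wh, Wx, bh, wv, and bv are all hypothetical. The paper's results concern deeper (up to four-layer) tanh architectures and a truncation argument that this toy omits.

    import numpy as np

    def softplus(x):
        # Numerically stable softplus: log(1 + exp(x)), keeps the intensity positive.
        return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0)

    def rnn_tpp_nll(event_times, Wh, Wx, bh, wv, bv):
        """Negative log-likelihood of a toy single-layer tanh-RNN TPP.

        The intensity is piecewise constant: on (t_{i-1}, t_i] it equals
        lambda_i = softplus(wv . h_{i-1} + bv), so the compensator integral
        is lambda_i * (t_i - t_{i-1}). The survival term beyond the last
        event is omitted for brevity.
        """
        h = np.zeros(Wh.shape[0])
        nll, prev_t = 0.0, 0.0
        for t in event_times:
            gap = t - prev_t
            lam = softplus(wv @ h + bv)           # intensity on the current interval
            nll += lam * gap - np.log(lam)        # compensator term minus log-intensity
            h = np.tanh(Wh @ h + Wx * gap + bh)   # tanh recurrence driven by the gap
            prev_t = t
        return nll

    # Example: random weights, 20 events on [0, 10].
    rng = np.random.default_rng(0)
    d = 8
    times = np.sort(rng.uniform(0.0, 10.0, size=20))
    print(rnn_tpp_nll(times, rng.normal(size=(d, d)) / np.sqrt(d),
                      rng.normal(size=d), np.zeros(d),
                      rng.normal(size=d), 0.0))

The excess risk studied in the paper measures how far the expected negative log-likelihood of the fitted RNN-TPP is from that of the true intensity; the unbounded number of events per sequence is what motivates the truncation technique mentioned above.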

Citation Information
APA Format
Zhiheng Chen, Guanhua Fang & Wen Yu. On Non-asymptotic Theory of Recurrent Neural Networks in Temporal Point Processes. Journal of Machine Learning Research.
BibTeX Format
@article{paper501,
  title   = {On Non-asymptotic Theory of Recurrent Neural Networks in Temporal Point Processes},
  author  = {Zhiheng Chen and Guanhua Fang and Wen Yu},
  journal = {Journal of Machine Learning Research},
  url     = {https://www.jmlr.org/papers/v26/24-1953.html}
}