Finite-sample statistical guarantees for learning dynamical systems in state-space form

Thursday, 27 November 2025, 10:15–11:15

Seminar room M.0.1

Mihaly Petreczky

CRIStAL, Univ. Lille

In this talk, I will present an overview of recent results on finite-sample Probably Approximately Correct (PAC) and PAC-Bayesian bounds for learning partially observed dynamical systems in state-space form. For clarity, I will begin with linear stochastic systems in discrete time, learned from a single trajectory, and then discuss extensions to more complex nonlinear settings.
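As an illustrative sketch (the notation below is mine, not necessarily the speaker's), the discrete-time linear stochastic state-space setting and the shape of a finite-sample PAC-type guarantee can be written as:

```latex
\[
x_{t+1} = A x_t + B u_t + w_t, \qquad
y_t = C x_t + D u_t + v_t,
\]
% $x_t$: hidden state, $u_t$: input, $y_t$: observed output,
% $w_t, v_t$: process and measurement noise.
% A PAC-type bound states that, with probability at least $1-\delta$,
\[
\mathcal{L}(\hat\theta) \;\le\; \widehat{\mathcal{L}}_N(\hat\theta) + r(N,\delta),
\]
% where $\widehat{\mathcal{L}}_N$ is the empirical loss on $N$ data points,
% $\mathcal{L}$ the true generalisation error, and $r(N,\delta) \to 0$
% as $N \to \infty$.
```

Here only the input-output pairs $(u_t, y_t)$ are observed, not the state $x_t$, which is what makes the system "partially observed".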

Learning linear systems is a classical topic in statistics and control theory (system identification), but most existing results provide only asymptotic consistency guarantees. In contrast, I focus on non-asymptotic bounds that quantify estimation and generalisation errors with finite data, even under stochastic noise and non-i.i.d. conditions. These results form a first step toward rigorous learning guarantees for nonlinear dynamical models, including deep state-space models and recurrent neural networks (RNNs).
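To make the single-trajectory setting concrete, here is a minimal, hypothetical sketch (not the speaker's method): estimating the transition matrix of a fully observed stable linear system by least squares from one trajectory, then measuring the finite-sample estimation error that such non-asymptotic bounds would control.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stable 2-state linear system: x_{t+1} = A x_t + w_t
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
T = 5000  # length of the single observed trajectory

# Simulate one trajectory driven by Gaussian process noise w_t
x = np.zeros((T + 1, 2))
for t in range(T):
    x[t + 1] = A @ x[t] + 0.1 * rng.standard_normal(2)

# Least-squares estimate from the single trajectory:
# A_hat = argmin_M sum_t ||x_{t+1} - M x_t||^2
X, Y = x[:-1], x[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

# Finite-sample estimation error (what non-asymptotic bounds quantify)
err = np.linalg.norm(A_hat - A, 2)
print(f"spectral-norm estimation error: {err:.4f}")
```

Note that the data here are a single dependent (non-i.i.d.) sequence, which is exactly why classical i.i.d. concentration arguments do not apply directly and dedicated finite-sample analyses are needed.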

I will also discuss extensions and challenges for deep state-space models (e.g., Mamba SSMs) and neural differential equations. In these latter settings, data often come from multiple i.i.d. time series, yet the temporal structure of the models and data still presents significant analytical difficulties.