On some Possibilities and Limitations of Statistical Learning Theory in the Stationary Ergodic Framework

Thursday 13 January 2022, 11:30 to 12:30

Online talk

Azadeh KHALEGHI

Abstract: In this talk I will introduce some nonparametric statistical learning techniques for sequential data in the presence of long-range dependencies. The relevant literature on this topic typically relies on parametric structural assumptions, such as autoregressive, moving-average, or Markovian models. However, the theoretical guarantees obtained under such standard modelling assumptions do not hold when the assumptions are violated, as they are in the presence of long-range statistical dependencies. We opt for a more powerful paradigm based on ergodic theory, viewing the observations as sample paths of ergodic measure-preserving transformations. Ergodicity ensures the pointwise convergence of empirical measures to the corresponding true measures, which is the key ingredient of our methodology. At the same time, assumptions such as independence and finite memory can be relaxed, allowing for arbitrary statistical dependence structures. I will give an overview of some possibilities and limitations of the stationary ergodic framework in the context of classical statistical learning problems such as time-series clustering, discernibility, and (restless) multi-armed bandits.
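The convergence of empirical measures that the abstract attributes to ergodicity can be illustrated with a minimal, self-contained sketch (not part of the talk): a two-state Markov chain is a simple stationary ergodic process with statistical dependence, and the Birkhoff ergodic theorem guarantees that the empirical frequency of each state along a single sample path converges to its stationary probability. The transition probabilities below are arbitrary choices for the demonstration.

```python
import random

def simulate_markov_chain(p01, p10, n, seed=0):
    """Simulate a two-state {0, 1} Markov chain with transition
    probabilities P(0 -> 1) = p01 and P(1 -> 0) = p10."""
    rng = random.Random(seed)
    state = 0
    path = []
    for _ in range(n):
        path.append(state)
        if state == 0:
            state = 1 if rng.random() < p01 else 0
        else:
            state = 0 if rng.random() < p10 else 1
    return path

# Arbitrary transition probabilities for illustration.
p01, p10 = 0.2, 0.3

# Stationary probability of state 0: pi_0 = p10 / (p01 + p10).
pi0 = p10 / (p01 + p10)

# A single long sample path; the observations are dependent,
# yet the empirical frequency of state 0 converges to pi_0
# by the ergodic theorem.
path = simulate_markov_chain(p01, p10, 100_000)
empirical = path.count(0) / len(path)
print(f"empirical: {empirical:.4f}, stationary: {pi0:.4f}")
```

The point of the sketch is that no independence or finite-memory assumption is used: convergence comes from stationarity and ergodicity alone, which is the property the nonparametric methods in the talk build on.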