This paves the way for more data-dependent generalization guarantees in dependent-data settings.
02.05.2025 18:35
Technique highlights:
• Uses blocking methods
• Captures fast-decaying correlations
• Results in tight O(1/n) bounds when decorrelation is fast (a toy sketch of the blocking idea is below)
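Not the paper's construction, just a minimal Python sketch of the blocking idea under simplifying assumptions: split a bounded, dependent sequence into non-overlapping blocks roughly as long as its decorrelation time, average within each block, and apply the classical scalar i.i.d. empirical Bernstein deviation term to the block means as if they were independent. The block length, the [0, 1] boundedness, the AR(1) trajectory, and the bound itself are illustrative assumptions.

```python
import numpy as np

def blocked_empirical_bernstein(x, block_len, delta=0.05):
    """Illustrative blocking sketch (not the bound from the paper).

    Splits a [0, 1]-valued dependent sequence into non-overlapping blocks,
    averages within each block, and applies the classical i.i.d. empirical
    Bernstein deviation term to the block means, treating them as if they
    were independent (plausible when block_len exceeds the decorrelation
    time of the process).
    """
    x = np.asarray(x, dtype=float)
    m = len(x) // block_len                      # number of complete blocks
    block_means = x[: m * block_len].reshape(m, block_len).mean(axis=1)

    mean = block_means.mean()
    var = block_means.var(ddof=1)                # sample variance of block means
    dev = np.sqrt(2 * var * np.log(2 / delta) / m) + 7 * np.log(2 / delta) / (3 * (m - 1))
    return mean, dev

# Usage on a synthetic AR(1) trajectory (fast-decaying correlations).
rng = np.random.default_rng(0)
n, rho = 10_000, 0.8
z = np.zeros(n)
for t in range(1, n):
    z[t] = rho * z[t - 1] + np.sqrt(1 - rho**2) * rng.normal()
x = 1.0 / (1.0 + np.exp(-z))                     # squash the chain into [0, 1]

est, dev = blocked_empirical_bernstein(x, block_len=50)
print(f"blocked mean estimate: {est:.3f} +/- {dev:.3f}")
```

When correlations decay quickly, the blocks can stay short, so the effective sample size m remains close to n, which is where the O(1/n)-type behaviour mentioned above comes from.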
Applications:
• Covariance operator estimation
• Learning transfer operators for stochastic processes (a toy EDMD-style sketch is below)
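To make the second application concrete, here is a toy EDMD-style sketch under assumed choices (monomial features, ridge regularization, a synthetic AR(1) trajectory), not the paper's estimator: empirical covariance and cross-covariance matrices are built from consecutive pairs along the trajectory and combined by regularized least squares into an estimate of the transfer (Koopman) operator restricted to the feature span.

```python
import numpy as np

def features(x):
    """Toy feature dictionary (illustrative choice): monomials up to degree 3."""
    return np.stack([np.ones_like(x), x, x**2, x**3], axis=1)

def estimate_transfer_operator(traj, reg=1e-6):
    """EDMD-style transfer operator estimate on the span of the features.

    From consecutive pairs (x_t, x_{t+1}) along one trajectory:
        C   ~ E[phi(x_t) phi(x_t)^T]       (covariance matrix)
        Cxy ~ E[phi(x_t) phi(x_{t+1})^T]   (cross-covariance matrix)
        K   = (C + reg * I)^{-1} Cxy        (regularized least squares)
    """
    X, Y = features(traj[:-1]), features(traj[1:])
    n = len(X)
    C = X.T @ X / n
    Cxy = X.T @ Y / n
    K = np.linalg.solve(C + reg * np.eye(C.shape[0]), Cxy)
    return K

# Usage on a synthetic AR(1) trajectory.
rng = np.random.default_rng(0)
n, rho = 5_000, 0.9
traj = np.zeros(n)
for t in range(1, n):
    traj[t] = rho * traj[t - 1] + 0.1 * rng.normal()

K = estimate_transfer_operator(traj)
print("estimated operator eigenvalues:", np.round(np.linalg.eigvals(K), 3))
```

Controlling how well such empirical covariance and cross-covariance estimates concentrate around their population counterparts, from a single dependent trajectory, is exactly where dependent-data Bernstein inequalities enter.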
Our contribution:
We propose empirical Bernstein-type concentration bounds for Hilbert space-valued random variables arising from mixing processes.
Works for both stationary and non-stationary sequences (the classical scalar i.i.d. form is recalled below for context).
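For context only, here is the classical scalar i.i.d. empirical Bernstein inequality in the form usually attributed to Maurer and Pontil (2009); the result announced in this thread extends this kind of variance-adaptive bound to Hilbert space-valued and dependent (mixing) data, which the display below does not cover.

```latex
% Classical scalar i.i.d. empirical Bernstein bound (Maurer & Pontil, 2009),
% recalled as background only; it is NOT the Hilbert-space, dependent-data result
% announced above. X_1, ..., X_n are i.i.d. in [0, 1] and delta is in (0, 1).
\[
  \mathbb{E}[X] \;-\; \frac{1}{n}\sum_{i=1}^{n} X_i
  \;\le\;
  \sqrt{\frac{2\, V_n \ln(2/\delta)}{n}}
  \;+\;
  \frac{7 \ln(2/\delta)}{3(n-1)}
  \quad \text{with probability at least } 1-\delta,
\]
\[
  \text{where } V_n \;=\; \frac{1}{n(n-1)} \sum_{1 \le i < j \le n} \bigl(X_i - X_j\bigr)^2
  \text{ is the sample variance.}
\]
```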
Challenge:
Standard i.i.d. assumptions fail in many learning tasks, especially those involving trajectory data (e.g., molecular dynamics, climate models).
Temporal dependence and slow mixing make it hard to get sharp generalization bounds.
Poster at #AISTATS2025 tomorrow!
Poster Session 1, #125
We present a new empirical Bernstein inequality for Hilbert space-valued random processes, relevant for dependent, even non-stationary data.
w/ Andreas Maurer, @vladimir-slk.bsky.social & M. Pontil
Paper: openreview.net/forum?id=a0E...
1/ Over the past two years, our team CSML at IIT has made significant strides in the data-driven modeling of dynamical systems. Curious about how we use advanced operator-based techniques to tackle real-world challenges? Let's dive in!
15.01.2025 14:34
An inspiring dive into understanding dynamical processes through 'The Operator Way.' A fascinating approach made accessible for everyone; check it out!
15.01.2025 10:31
Excited to present
"Unlocking State-Tracking in Linear RNNs Through Negative Eigenvalues"
at the M3L workshop at #NeurIPS
https://buff.ly/3BlcD4y
If interested, you can attend the presentation on the 14th at 15:00, stop by the afternoon poster session, or DM me to discuss :)
In his book "The Nature of Statistical Learning Theory", V. Vapnik wrote:
"When solving a given problem, try to avoid a more general problem as an intermediate step."
Excited to share our lab's amazing contributions at NeurIPS this year! Check out our papers and stay inspired! #NeurIPS2024
10.12.2024 06:18
Could you add me to the list?
04.12.2024 22:29
Hi Gaspard. I wonder what you are currently working on with regard to sequence models and world models. I have similar interests, and in our lab we have worked on the intersection of these topics (bsky.app/profile/marc...).
27.11.2024 14:43
Hi! We're glad to be here on @bsky.app and looking forward to engaging with this community. But first, learn a little more about us...
#ELLISforEurope #AI #ML #CrossBorderCollab #PhD