

Poster

Robust System Identification: Finite-sample Guarantees and Connection to Regularization

Hank Park · Grani A. Hanasusanto · Yingying Li

Hall 3 + Hall 2B #348
Sat 26 Apr midnight PDT — 2:30 a.m. PDT

Abstract: We consider the problem of learning nonlinear dynamical systems from a single sample trajectory. While the least squares estimate (LSE) is commonly used for this task, it suffers from large identification errors when the sample size is small or the model fails to capture the system's true dynamics. To overcome these limitations, we propose a robust LSE framework, which incorporates robust optimization techniques, and prove that it is equivalent to regularizing LSE with general Schatten p-norms. We provide non-asymptotic performance guarantees for linear systems, achieving an error rate of $\tilde{O}(1/T)$, and show that the framework avoids the curse of dimensionality, unlike state-of-the-art Wasserstein robust optimization models. Empirical results demonstrate substantial improvements in real-world system identification and online control tasks, outperforming existing methods.
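As a rough illustration of the regularization connection described in the abstract, the sketch below is not the authors' implementation; it only shows the Schatten-2 (Frobenius) special case, and the system, noise level, and regularization weight `lam` are hypothetical choices. It fits a linear system from a single simulated trajectory with plain LSE and with a Frobenius-norm-regularized LSE.

```python
import numpy as np

# Minimal sketch: identify x_{t+1} = A* x_t + w_t from one trajectory.
# The regularized variant corresponds to Schatten p-norm regularization
# with p = 2 (Frobenius norm); all constants below are illustrative.

rng = np.random.default_rng(0)
n, T = 4, 50                                  # state dimension, trajectory length
A_true = 0.9 * np.eye(n) + 0.05 * rng.standard_normal((n, n))

# Roll out a single noisy trajectory.
X = np.zeros((T + 1, n))
for t in range(T):
    X[t + 1] = A_true @ X[t] + 0.1 * rng.standard_normal(n)

Phi, Y = X[:-1], X[1:]                        # regressors x_t and targets x_{t+1}

# Ordinary LSE: argmin_A ||Y - Phi A^T||_F^2
A_lse = np.linalg.lstsq(Phi, Y, rcond=None)[0].T

# Frobenius-regularized LSE: argmin_A ||Y - Phi A^T||_F^2 + lam * ||A||_F^2
lam = 1.0                                     # hypothetical regularization weight
A_reg = np.linalg.solve(Phi.T @ Phi + lam * np.eye(n), Phi.T @ Y).T

print("LSE error        :", np.linalg.norm(A_lse - A_true))
print("Regularized error:", np.linalg.norm(A_reg - A_true))
```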
