In this talk, I will establish high-probability non-asymptotic upper bounds for the population excess risk of minimum $\ell_q$-norm interpolating estimators in linear regression for all $q\geq 1$, and minimum $\ell_2$-norm interpolating classifiers in linear classification. We obtain sufficient conditions for their benign overfitting behavior. Building upon non-exact oracle inequalities, we further introduce a new notion, which we refer to as non-exact benign overfitting, and establish sufficient conditions under which it arises.
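As a minimal illustration of the objects in the abstract (not part of the talk's results), the minimum $\ell_2$-norm interpolating estimator in the overparameterized regime $d > n$ is the Moore-Penrose pseudoinverse solution $\hat\beta = X^+ y$; the dimensions and data below are arbitrary choices for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 100  # overparameterized regime: more features than samples
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Minimum l2-norm interpolator: among all beta with X @ beta = y,
# the pseudoinverse solution has the smallest Euclidean norm.
beta = np.linalg.pinv(X) @ y

# With d > n and Gaussian features, X has full row rank almost surely,
# so the estimator interpolates the training data exactly.
assert np.allclose(X @ beta, y)
```

For $q \neq 2$ the minimum $\ell_q$-norm interpolator has no closed form in general and is instead the solution of a linear-constrained convex program.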

Our results rely on a Features Space Decomposition (FSD) method, which highlights the self-regularization properties of minimum-norm interpolating estimators. Technically, we circumvent the convex min-max theorem and instead employ tools from the Geometric Aspects of Functional Analysis, including the Dvoretzky-Milman theorem, Gluskin's theorem, and lower bounds on Gaussian mean widths of random polytopes.
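For readers unfamiliar with the last of these tools, the Gaussian mean width of a bounded set $T \subset \mathbb{R}^d$ is the standard quantity

```latex
w(T) \;=\; \mathbb{E} \sup_{t \in T} \langle g, t \rangle,
\qquad g \sim \mathcal{N}(0, I_d),
```

which measures the effective size of $T$ as seen by a Gaussian direction; lower bounds on $w(T)$ for random polytopes (convex hulls of random points) are one ingredient of the analysis mentioned above.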

This motivates probabilistic generalizations of many theorems from geometric functional analysis beyond the Gaussian setting. We particularly emphasize that the FSD method may refine the uniform convergence approach, suggesting its promise as a new fundamental methodology in mathematical statistics.

This work is based on a collaboration with Guillaume Lecué.