Statistical Query Lower Bounds for Smoothed Agnostic Learning
We study the complexity of smoothed agnostic learning, recently introduced by~\cite{CKKMS24}, in which the learner competes with the best classifier in a target class under slight Gaussian perturbations of the inputs. Specifically, we focus on the prototypical task of agnostically learning halfspaces under subgaussian distributions in the smoothed model. The best known upper bound for this problem relies on $L_1$-polynomial regression and has complexity $d^{\tilde{O}(1/\sigma^2) \log(1/\epsilon)}$, where $\sigma$ is the smoothing parameter and $\epsilon$ is the excess error. […]