Conference paper, 2024

Derivatives of Stochastic Gradient Descent in parametric optimization

Abstract

We consider stochastic optimization problems in which the objective depends on a parameter, as is common in hyperparameter optimization, for instance. We investigate the behavior of the derivatives of the iterates of Stochastic Gradient Descent (SGD) with respect to that parameter and show that they are driven by an inexact SGD recursion on a different objective function, perturbed by the convergence of the original SGD. This enables us to establish that the derivatives of SGD converge to the derivative of the solution mapping in terms of mean squared error whenever the objective is strongly convex. Specifically, we demonstrate that with constant step-sizes, these derivatives stabilize within a noise ball centered at the solution derivative, and that with vanishing step-sizes they exhibit $O(\log(k)^2 / k)$ convergence rates. Additionally, we prove exponential convergence in the interpolation regime. Our theoretical findings are illustrated by numerical experiments on synthetic tasks.
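
To make the object of study concrete, the sketch below (not taken from the paper; the parametric least-squares objective, step size, and iteration count are illustrative assumptions) runs SGD on a strongly convex objective whose targets depend linearly on a scalar parameter theta, differentiates the final iterate with respect to theta by forward-mode automatic differentiation through the unrolled recursion, and compares the result to the derivative of the closed-form solution mapping. With a constant step-size, the gap is expected to settle in a noise ball rather than vanish, consistent with the abstract.

```python
# Minimal illustrative sketch (assumed setup, not the paper's code):
# differentiate SGD iterates x_k(theta) with respect to theta by unrolling.
import jax
import jax.numpy as jnp

d, n = 3, 50
A = jax.random.normal(jax.random.PRNGKey(0), (n, d))   # hypothetical design matrix
c = jax.random.normal(jax.random.PRNGKey(1), (n,))     # hypothetical targets

def sample_loss(x, theta, i):
    # per-sample strongly convex loss; the target depends linearly on theta
    return 0.5 * (A[i] @ x - c[i] * theta) ** 2

def sgd_last_iterate(theta, num_steps=300, step=0.05):
    # run SGD on the parametric objective and return the last iterate x_k(theta)
    x = jnp.zeros(d)
    for k in jax.random.split(jax.random.PRNGKey(2), num_steps):
        i = jax.random.randint(k, (), 0, n)             # sample one data point
        x = x - step * jax.grad(sample_loss)(x, theta, i)
    return x

theta0 = 1.0
# forward-mode derivative of the SGD iterate with respect to theta
dx_dtheta = jax.jacfwd(sgd_last_iterate)(theta0)

# derivative of the solution mapping x*(theta) = theta * (A^T A)^{-1} A^T c
dxstar_dtheta = jnp.linalg.solve(A.T @ A, A.T @ c)
print(jnp.linalg.norm(dx_dtheta - dxstar_dtheta))       # distance to the solution derivative
```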
Main file: stochastic-autodiff.pdf (1.95 MB). Origin: files produced by the author(s).

Dates and versions

hal-04582212 , version 1 (21-05-2024)
hal-04582212 , version 2 (20-11-2024)

Identifiers

  • HAL Id : hal-04582212 , version 2

Cite

Franck Iutzeler, Edouard Pauwels, S. Vaiter. Derivatives of Stochastic Gradient Descent in parametric optimization. Advances in Neural Information Processing Systems (NeurIPS), Dec 2024, Vancouver, Canada. 24 p. ⟨hal-04582212v2⟩
