Poster
Optimization with Access to Auxiliary Information
El Mahdi Chayti · Sai Karimireddy
Hall 3 + Hall 2B #380
Abstract:
We investigate the fundamental optimization question of minimizing a \emph{target} function $f(x)$, whose gradients are expensive to compute or have limited availability, given access to some \emph{auxiliary} side function $h(x)$ whose gradients are cheap or more available. This formulation captures many settings of practical relevance, such as i) re-using batches in SGD, ii) transfer learning, iii) federated learning, iv) training with compressed models/dropout, etc. We propose two new generic algorithms that apply in all these settings, and we prove that this framework yields a benefit under a Hessian similarity assumption between the target and the side information: the smaller the similarity measure, the larger the benefit. We also show a potential benefit from stochasticity when the auxiliary noise is correlated with that of the target function.
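The abstract does not spell out the two proposed algorithms, but the setting admits a natural baseline that illustrates the idea: run most updates on the cheap auxiliary gradients $\nabla h$, and periodically correct their bias with a single expensive target gradient $\nabla f$ taken at a snapshot point (an SVRG-style correction). The sketch below is purely illustrative and is not necessarily the paper's method; all function names and parameters are hypothetical.

```python
import numpy as np

# Illustrative sketch only, not the paper's algorithm: use cheap auxiliary
# gradients grad_h at every inner step, and refresh an additive bias
# correction with one expensive target gradient grad_f per round.

def auxiliary_gradient_descent(x0, grad_f, grad_h, lr=0.1,
                               n_rounds=20, inner_steps=10):
    """Minimize f using mostly gradients of the auxiliary function h.

    grad_f : callable, expensive gradient of the target f (one call per round)
    grad_h : callable, cheap gradient of the auxiliary function h
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_rounds):
        snapshot = x.copy()
        correction = grad_f(snapshot) - grad_h(snapshot)  # one expensive call
        for _ in range(inner_steps):
            # Cheap gradient plus fixed correction; the residual bias of this
            # estimator is controlled by the Hessian similarity of f and h.
            g = grad_h(x) + correction
            x = x - lr * g
    return x

# Toy usage: target f(x) = ||x - 1||^2, auxiliary h(x) = ||x - 1.1||^2.
# Here the Hessians coincide, so the corrected estimator is exact.
if __name__ == "__main__":
    grad_f = lambda x: 2.0 * (x - 1.0)
    grad_h = lambda x: 2.0 * (x - 1.1)
    x_star = auxiliary_gradient_descent(np.zeros(3), grad_f, grad_h)
    print(x_star)  # approaches [1, 1, 1]
```

Note how the correction term makes the update unbiased at the snapshot; between snapshots, the drift of the estimator is bounded by how much the Hessians of $f$ and $h$ differ, which matches the abstract's claim that a small Hessian similarity measure yields a benefit.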