

Poster in Workshop: Workshop on Distributed and Private Machine Learning

Privacy Amplification via Iteration for Shuffled and Online PNSGD

Matteo Sordello · Zhiqi Bu · Jinshuo Dong · Weijie J Su


Abstract:

We consider the framework of privacy amplification via iteration, originally proposed by Feldman et al. and later simplified by Asoodeh et al. through an analysis based on contraction coefficients. This line of work studies the privacy guarantees of the projected noisy stochastic gradient descent (PNSGD) algorithm when the intermediate updates are kept hidden. We first prove a privacy guarantee for shuffled PNSGD and study its asymptotic behavior when the noise scale is fixed for each individual but decreases as the sample size n grows. We then provide a faster-decaying schedule for the magnitude of the injected noise that still guarantees convergence of the privacy loss when new data arrive in an online fashion.
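
To make the setting concrete, below is a minimal NumPy sketch of one shuffled PNSGD epoch as the abstract describes it: a random pass over the data, a noisy gradient step per sample, and a projection back onto a bounded set, with only the final iterate released. All names and parameters (`shuffled_pnsgd`, `grad`, `eta`, `sigma`, `radius`) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def shuffled_pnsgd(samples, grad, w0, eta, sigma, radius, rng=None):
    """One epoch of shuffled PNSGD (a minimal sketch).

    Processes the samples in a uniformly random order; each step takes a
    gradient step on one sample, injects Gaussian noise of scale sigma,
    and projects the iterate back onto the l2 ball of the given radius.
    Only the final iterate is released; the hidden intermediate updates
    are what privacy amplification via iteration exploits.
    """
    rng = np.random.default_rng() if rng is None else rng
    w = np.asarray(w0, dtype=float)
    for i in rng.permutation(len(samples)):   # shuffle the update order
        g = grad(w, samples[i])               # per-sample gradient
        w = w - eta * g + rng.normal(0.0, sigma, size=w.shape)
        norm = np.linalg.norm(w)              # projection onto the ball
        if norm > radius:
            w *= radius / norm
    return w

# Illustrative use: per-sample least-squares gradient on pairs s = (x, y).
# In the regime the abstract studies, sigma is fixed per individual but
# can be taken smaller as the sample size n grows.
# grad = lambda w, s: (w @ s[0] - s[1]) * s[0]
```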
