

Poster
in
Workshop: Privacy Regulation and Protection in Machine Learning

Efficient Private Federated Non-Convex Optimization With Shuffled Model

Lingxiao Wang · Xingyu Zhou · Kumar Kshitij Patel · Lawrence Tang · Aadirupa Saha


Abstract:

This paper studies the problem of distributed non-convex optimization under privacy requirements. We develop a differentially private, communication-efficient algorithm and analyze its privacy and utility trade-offs. By incorporating the shuffled model into our algorithmic design, we achieve strong privacy and utility guarantees without relying on a trusted central server. We further show that the proposed method attains improved utility guarantees (faster convergence rates) compared to previous approaches. Additionally, we present preliminary experimental results that corroborate our theoretical findings.
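The shuffled model described in the abstract can be illustrated with a minimal sketch: each client clips and locally randomizes its gradient, an intermediary shuffler permutes the anonymized reports to break the client-to-message link, and the server aggregates. This is a generic shuffle-model template, not the paper's specific algorithm; the clipping norm, noise scale, and client count below are hypothetical parameters chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(grad, clip_norm, sigma):
    # Clip the local gradient to bound per-client sensitivity,
    # then add Gaussian noise on-device (the local randomizer).
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / norm) if norm > 0 else grad
    return clipped + rng.normal(0.0, sigma * clip_norm, size=grad.shape)

def shuffle_and_aggregate(reports):
    # The shuffler applies a uniformly random permutation to the
    # anonymized reports before the server sees them; this is the
    # source of privacy amplification in the shuffled model.
    order = rng.permutation(len(reports))
    shuffled = [reports[i] for i in order]
    return np.mean(shuffled, axis=0)

# Toy round with 5 clients and a 3-dimensional model (illustrative sizes).
grads = [rng.normal(size=3) for _ in range(5)]
reports = [local_update(g, clip_norm=1.0, sigma=0.5) for g in grads]
step = shuffle_and_aggregate(reports)
```

Note that the mean is permutation-invariant, so shuffling does not change the aggregate; the benefit is in the trust model, since the server never learns which client sent which report.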
