

Poster
in
Workshop: Workshop on Distributed and Private Machine Learning

Differentially Private Multi-Task Learning

Shengyuan Hu · Steven Wu · Virginia Smith


Abstract:

Many problems in machine learning rely on multi-task learning (MTL), in which the goal is to solve multiple related machine learning tasks simultaneously. MTL is particularly relevant for privacy-sensitive applications in areas such as healthcare, finance, and IoT computing, where sensitive data from multiple, varied sources are shared for the purpose of learning. In this work, we formalize notions of multi-task privacy via joint differential privacy (JDP), a relaxation of differential privacy (DP) developed for mechanism design and distributed optimization. We then propose a differentially private algorithm for the commonly used mean-regularized MTL objective. We analyze our objective and solver, providing certifiable guarantees on both privacy and utility. Our initial work provides groundwork for privacy-preserving multi-task learning and highlights several interesting directions of future study.
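To make the setting concrete, the following is a minimal, illustrative sketch of mean-regularized MTL with a privatized aggregate: each task alternates local gradient steps on its own least-squares loss (regularized toward a shared mean model) with a global step that clips, averages, and noises the task models via the Gaussian mechanism. This is a simplified sketch under assumed synthetic data, not the authors' algorithm or their privacy accounting; the function name, hyperparameters, and noise calibration are all hypothetical.

```python
import numpy as np

def dp_mean_regularized_mtl(tasks, lam=1.0, rounds=20, lr=0.1,
                            clip=1.0, noise_mult=1.0, seed=0):
    """Illustrative sketch (not the paper's method): alternate local
    updates on loss_t(w_t) + (lam/2) * ||w_t - w_bar||^2 with a noisy
    recomputation of the shared mean w_bar.

    tasks: list of (X, y) pairs, one per task, each X of shape (n_t, d).
    """
    rng = np.random.default_rng(seed)
    d = tasks[0][0].shape[1]
    W = np.zeros((len(tasks), d))   # one model per task
    w_bar = np.zeros(d)             # shared mean model
    for _ in range(rounds):
        # Local step: each task descends its own regularized objective.
        for t, (X, y) in enumerate(tasks):
            grad = X.T @ (X @ W[t] - y) / len(y) + lam * (W[t] - w_bar)
            W[t] = W[t] - lr * grad
        # Global step: clip each task model to norm <= clip, average,
        # and add Gaussian noise scaled to the clipped-mean sensitivity.
        clipped = np.array([w * min(1.0, clip / (np.linalg.norm(w) + 1e-12))
                            for w in W])
        sigma = noise_mult * clip / len(tasks)  # hypothetical calibration
        w_bar = clipped.mean(axis=0) + rng.normal(0.0, sigma, size=d)
    return W, w_bar
```

Only the shared statistic `w_bar` is privatized here; the per-task models `W[t]` stay local, which loosely mirrors how JDP relaxes DP by protecting each task's data in the outputs released to *other* tasks.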
