

In-Person Poster presentation / poster accept

Easy Differentially Private Linear Regression

Kareem Amin · Matthew Joseph · Mónica Ribero · Sergei Vassilvitskii

MH1-2-3-4 #137

Keywords: [ Linear Regression ] [ differential privacy ] [ Social Aspects of Machine Learning ]


Abstract: Linear regression is a fundamental tool for statistical analysis. This has motivated the development of linear regression methods that also satisfy differential privacy and thus guarantee that the learned model reveals little about any one data point used to construct it. However, existing differentially private solutions assume that the end user can easily specify good data bounds and hyperparameters. Both present significant practical obstacles. In this paper, we study an algorithm that uses the exponential mechanism to select a model with high Tukey depth from a collection of non-private regression models. Given $n$ samples of $d$-dimensional data used to train $m$ models, we construct an efficient analogue using an approximate Tukey depth that runs in time $O(d^2n + dm\log(m))$. We find that this algorithm obtains strong empirical performance in the data-rich setting with no data bounds or hyperparameter selection required.
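The pipeline the abstract describes can be sketched as follows. This is a simplified illustration, not the paper's implementation: it partitions the data into $m$ disjoint chunks, fits a non-private OLS model on each, scores each candidate by a coordinate-wise approximate Tukey depth among the $m$ models, and then samples one candidate via the exponential mechanism with depth as the utility. (The paper's algorithm samples from depth regions of the candidate set rather than restricting to the $m$ candidates themselves, and handles the privacy accounting more carefully; the function and parameter names below are hypothetical.)

```python
import numpy as np


def approx_tukey_depth(theta, models):
    # Coordinate-wise (approximate) Tukey depth: for each coordinate,
    # count candidate models on either side of theta, take the smaller
    # count, then take the minimum across coordinates.
    below = (models <= theta).sum(axis=0)
    above = (models >= theta).sum(axis=0)
    return int(np.minimum(below, above).min())


def tukey_em(X, y, m, eps, rng):
    """Hypothetical sketch of the depth-based exponential mechanism."""
    # 1) Partition the n samples into m disjoint chunks and fit a
    #    non-private OLS model on each chunk.
    models = np.stack([
        np.linalg.lstsq(X[i::m], y[i::m], rcond=None)[0] for i in range(m)
    ])
    # 2) Utility of each candidate model = its approximate Tukey depth
    #    within the collection of m models.
    depths = np.array([approx_tukey_depth(t, models) for t in models])
    # 3) Exponential mechanism: sample a candidate with probability
    #    proportional to exp(eps * depth / 2), computed stably.
    logits = eps * depths / 2.0
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return models[rng.choice(m, p=probs)]
```

A usage sketch on synthetic data: with, say, $n = 2000$, $d = 3$, and $m = 50$, `tukey_em(X, y, m=50, eps=1.0, rng=np.random.default_rng(0))` returns one $d$-dimensional coefficient vector. Note that no data bounds or per-coordinate clipping thresholds appear anywhere above, which is the practical point the abstract emphasizes.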
