User-level Differentially Private Stochastic Convex Optimization: Efficient Algorithms with Optimal Rates
Authors: Hilal Asi, Daogao Liu
We study differentially private stochastic convex optimization (DP-SCO) under user-level privacy, where each user may hold multiple data items. Existing algorithms for user-level DP-SCO either require super-polynomial runtime or a number of users that grows polynomially with the dimension of the problem. We develop new algorithms for user-level DP-SCO that obtain optimal rates, run in polynomial time, and require a number of users that grows only logarithmically in the dimension. Moreover, our algorithms are the first to obtain optimal rates for non-smooth functions in polynomial time. These algorithms are based on multiple-pass DP-SGD combined with a novel private mean estimation procedure for concentrated data, which applies an outlier-removal step before estimating the mean of the gradients.
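The core idea in the last sentence, estimating a mean only after discarding gradients that fall outside a concentration radius, can be illustrated with a minimal sketch. This is not the paper's algorithm: the function name, the use of the coordinate-wise median as a robust center, and the `radius`/`noise_scale` parameters are all illustrative assumptions standing in for the paper's private center-finding and noise calibration.

```python
import numpy as np

def private_mean_with_outlier_removal(user_grads, radius, noise_scale, rng=None):
    """Hedged sketch of outlier removal before a noisy mean release.

    user_grads:  (n, d) array, one averaged gradient per user.
    radius:      assumed concentration radius; gradients of all but a few
                 users are assumed to lie within it of the true mean.
    noise_scale: std of the Gaussian noise added for privacy (in the paper,
                 this would be calibrated to (eps, delta) and the reduced
                 sensitivity after outlier removal).
    """
    rng = np.random.default_rng(rng)
    # Crude robust center (illustrative stand-in for a private center-finding step).
    center = np.median(user_grads, axis=0)
    # Drop users whose gradient is far from the center: the outlier-removal step.
    dists = np.linalg.norm(user_grads - center, axis=1)
    inliers = user_grads[dists <= radius]
    if len(inliers) == 0:
        inliers = user_grads  # fall back: keep everything rather than fail
    mean = inliers.mean(axis=0)
    # Release the mean with Gaussian noise for privacy.
    return mean + rng.normal(0.0, noise_scale, size=mean.shape)
```

Because the outliers are removed before averaging, a single corrupted or heavy-tailed user gradient cannot drag the estimate far from the concentrated bulk, which is what lets the noise scale stay small.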
November 20, 2024 · Research areas: Methods and Algorithms, Privacy · Conference: NeurIPS