Faster Algorithms for User-Level Private Stochastic Convex Optimization



We study private stochastic convex optimization (SCO) under user-level differential privacy (DP) constraints. In this setting, there are n users, each possessing m data items, and we need to protect the privacy of each user’s entire collection of data items. Existing algorithms for user-level DP SCO are impractical in many large-scale machine learning scenarios because: (i) they make restrictive assumptions on the smoothness parameter of the loss function and require the number of users to grow polynomially with the dimension of the parameter space; or (ii) they are prohibitively slow, requiring at least (mn)^{3/2} gradient computations.
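To make the user-level setting concrete, here is a minimal sketch (not the paper’s algorithm) of a single noisy gradient step: each user contributes one vector, the average gradient over its m items, and that per-user contribution is clipped and noised so that an entire user’s collection has bounded influence on the output. The function and parameter names (user_level_dp_gradient, clip_norm, noise_multiplier) and the least-squares toy data are illustrative assumptions, and privacy accounting is omitted.

```python
import numpy as np

def user_level_dp_gradient(params, user_data, loss_grad, clip_norm, noise_multiplier, rng):
    """One noisy gradient estimate under user-level DP (illustrative sketch only).

    Each of the n users contributes a single vector: the average gradient over
    that user's m items, clipped to `clip_norm` so that adding or removing an
    entire user changes the sum by at most `clip_norm`. Gaussian noise scaled
    to that sensitivity is added before averaging over users.
    """
    summed = np.zeros_like(params)
    for items in user_data:
        per_item = np.stack([loss_grad(params, item) for item in items])
        g_user = per_item.mean(axis=0)                              # average over the user's m items
        norm = np.linalg.norm(g_user)
        g_user = g_user * min(1.0, clip_norm / (norm + 1e-12))      # clip the *user* contribution
        summed += g_user
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=params.shape)
    return (summed + noise) / len(user_data)                        # noisy mean over users


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, n, m = 5, 100, 20
    w_true = rng.normal(size=d)
    # Synthetic least-squares data: n users, each holding m (x, y) pairs.
    users = []
    for _ in range(n):
        X = rng.normal(size=(m, d))
        y = X @ w_true + 0.1 * rng.normal(size=m)
        users.append(list(zip(X, y)))

    def sq_grad(w, xy):
        x, y = xy
        return (x @ w - y) * x                                      # gradient of 0.5*(x.w - y)^2

    w = np.zeros(d)
    for _ in range(200):
        g = user_level_dp_gradient(w, users, sq_grad, clip_norm=1.0,
                                   noise_multiplier=0.5, rng=rng)
        w -= 0.1 * g
    print("distance to w_true:", np.linalg.norm(w - w_true))
```

The key distinction from item-level DP is where the clipping happens: bounding each user’s averaged gradient (rather than each individual item’s gradient) is what makes the noise calibrated to the removal of a whole user and their m items.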
