Private Stochastic Convex Optimization with Heavy Tails: Near-Optimality from Simple Reductions


We study the problem of differentially private stochastic convex optimization (DP-SCO) with heavy-tailed gradients, where we assume a $k^{\text{th}}$-moment bound on the Lipschitz constants of sample functions, rather than a uniform bound.
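To make the setting concrete, the standard workhorse for DP optimization with unbounded (heavy-tailed) gradients is clipped noisy SGD: each sampled gradient is clipped to a fixed norm so its sensitivity is bounded, then Gaussian noise calibrated to that clip threshold is added before the update. The sketch below is illustrative only, not the reduction studied in the paper; all names (`dp_sgd_heavy_tailed`, `grad_fn`, `clip`, `noise_mult`) are ours, and the noise scale is not calibrated to any specific $(\varepsilon, \delta)$ budget.

```python
import numpy as np

def dp_sgd_heavy_tailed(grad_fn, x0, data, steps, lr, clip, noise_mult, rng=None):
    """Illustrative clipped noisy SGD (hypothetical helper, not the paper's
    algorithm). Per-sample gradients are clipped to norm `clip`, bounding
    sensitivity even when raw gradients are heavy-tailed, then Gaussian
    noise with std `noise_mult * clip` is added before each step."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        z = data[rng.integers(len(data))]   # draw one sample
        g = np.asarray(grad_fn(x, z), dtype=float)
        norm = np.linalg.norm(g)
        if norm > clip:                     # clip to bound per-sample influence
            g = g * (clip / norm)
        g = g + rng.normal(0.0, noise_mult * clip, size=g.shape)
        x = x - lr * g
    return x

# Toy usage: private mean estimation with heavy-tailed (Student-t) samples,
# minimizing f(x) = E[0.5 * ||x - z||^2], whose sample gradient is x - z.
rng = np.random.default_rng(1)
samples = rng.standard_t(df=2, size=(500, 3))   # heavy tails: only low moments exist
x_hat = dp_sgd_heavy_tailed(lambda x, z: x - z, np.zeros(3), samples,
                            steps=400, lr=0.05, clip=1.0, noise_mult=0.5)
```

The key design point is that the clip threshold trades bias (aggressive clipping distorts the true gradient) against noise (the Gaussian scale is proportional to the threshold); a $k^{\text{th}}$-moment bound lets one set this threshold without assuming gradients are uniformly bounded.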
