Differentially private SGD with non-smooth losses
Publication type: Journal Article
Publication date: 2022-01-01
Scimago: Q1
WoS: Q1
SJR: 2.046
CiteScore: 6.4
Impact factor: 3.2
ISSN: 1063-5203, 1096-603X
Applied Mathematics
Abstract
In this paper, we are concerned with differentially private stochastic gradient descent (SGD) algorithms in the setting of stochastic convex optimization (SCO). Most of the existing work requires the loss to be Lipschitz continuous and strongly smooth, and the model parameter to be uniformly bounded. However, these assumptions are restrictive as many popular losses violate these conditions, including the hinge loss for SVM, the absolute loss in robust regression, and even the least squares loss in an unbounded domain. We significantly relax these restrictive assumptions and establish privacy and generalization (utility) guarantees for private SGD algorithms using output and gradient perturbations associated with non-smooth convex losses. Specifically, the loss function is relaxed to have an α-Hölder continuous gradient (referred to as α-Hölder smoothness), which instantiates Lipschitz continuity (α = 0) and strong smoothness (α = 1). We prove that noisy SGD with α-Hölder smooth losses using gradient perturbation can guarantee (ϵ, δ)-differential privacy (DP) and attain the optimal excess population risk O(√(d log(1/δ))/(nϵ) + 1/√n), up to logarithmic terms, with gradient complexity O(n^((2−α)/(1+α)) + n). This shows an important trade-off between the α-Hölder smoothness of the loss and the computational complexity of private SGD with statistically optimal performance. In particular, our results indicate that α-Hölder smoothness with α ≥ 1/2 is sufficient to guarantee (ϵ, δ)-DP of noisy SGD algorithms while achieving the optimal excess risk with a linear gradient complexity O(n).
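The abstract refers to noisy SGD with gradient perturbation: at each iteration a (sub)gradient of the loss on a sampled example is perturbed with Gaussian noise before the update. The Python sketch below is illustrative only and not the authors' algorithm verbatim; the step size eta, noise scale sigma, and iteration count n_iters are placeholders that in the paper are calibrated to (ϵ, δ), the Lipschitz constant, and the smoothness parameter α, and hinge_subgrad is a hypothetical helper for one of the non-smooth losses mentioned above.

import numpy as np

def noisy_sgd(X, y, loss_grad, n_iters, eta, sigma, seed=0):
    # Sketch of SGD with Gaussian gradient perturbation.
    # At each step: sample one example, compute a (sub)gradient of the
    # loss at the current iterate, add isotropic Gaussian noise, and
    # take a gradient step. A formal (eps, delta)-DP guarantee requires
    # sigma to be calibrated to the gradient sensitivity; here it is
    # simply an input parameter.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        i = rng.integers(n)                           # sample one example
        g = loss_grad(w, X[i], y[i])                  # (sub)gradient of the loss
        g_noisy = g + sigma * rng.standard_normal(d)  # Gaussian perturbation
        w -= eta * g_noisy                            # SGD update
    return w

def hinge_subgrad(w, x, y):
    # Subgradient of the non-smooth hinge loss max(0, 1 - y * <w, x>)
    # used for SVM; labels y are assumed to be +1 or -1.
    return -y * x if y * np.dot(w, x) < 1.0 else np.zeros_like(x)

Output perturbation, the other mechanism analyzed in the paper, would instead add noise once to the returned iterate (or to an average of iterates) rather than to every gradient.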
Top-30 Journals
Applied and Computational Harmonic Analysis: 2 publications, 18.18%
Journal of Mathematics: 1 publication, 9.09%
Scientific Programming: 1 publication, 9.09%
Neural Computation: 1 publication, 9.09%
Neurocomputing: 1 publication, 9.09%
Applied Mathematics: 1 publication, 9.09%
Acta Mathematicae Applicatae Sinica: 1 publication, 9.09%
IEEE Transactions on Information Forensics and Security: 1 publication, 9.09%

Publishers
Elsevier: 3 publications, 27.27%
Institute of Electrical and Electronics Engineers (IEEE): 3 publications, 27.27%
Hindawi Limited: 2 publications, 18.18%
Springer Nature: 2 publications, 18.18%
MIT Press: 1 publication, 9.09%
- We do not take into account publications without a DOI.
- Statistics recalculated weekly.
Metrics
Total citations: 11
Citations from 2024: 8 (72.73%)
Cite this
GOST
Wang P. et al. Differentially private SGD with non-smooth losses // Applied and Computational Harmonic Analysis. 2022. Vol. 56. pp. 306-336.
GOST all authors (up to 50)
Wang P., Lei Y., Ying Y., Zhang H. Differentially private SGD with non-smooth losses // Applied and Computational Harmonic Analysis. 2022. Vol. 56. pp. 306-336.
RIS
TY - JOUR
DO - 10.1016/j.acha.2021.09.001
UR - https://doi.org/10.1016/j.acha.2021.09.001
TI - Differentially private SGD with non-smooth losses
T2 - Applied and Computational Harmonic Analysis
AU - Wang, P
AU - Lei, Y
AU - Ying, Y
AU - Zhang, H
PY - 2022
DA - 2022/01/01
PB - Elsevier
SP - 306-336
VL - 56
SN - 1063-5203
SN - 1096-603X
ER -
BibTeX (up to 50 authors)
@article{2022_Wang,
author = {P Wang and Y Lei and Y Ying and H Zhang},
title = {Differentially private SGD with non-smooth losses},
journal = {Applied and Computational Harmonic Analysis},
year = {2022},
volume = {56},
publisher = {Elsevier},
month = {jan},
url = {https://doi.org/10.1016/j.acha.2021.09.001},
pages = {306--336},
doi = {10.1016/j.acha.2021.09.001}
}