Applied Mathematics, volume 39, issue 1, pages 1-23
Differentially private SGD with random features
Yi-Guang Wang ¹, Zheng-Chu Guo ²
Publication type: Journal Article
Publication date: 2024-03-09
Journal: Applied Mathematics
Scimago quartile: Q4
SJR: 0.228
CiteScore: 1.4
Impact factor: 1.2
ISSN: 1005-1031, 1993-0445, 2152-7385, 2152-7393, 2154-2821
Abstract
In the realm of large-scale machine learning, it is crucial to explore methods for reducing computational complexity and memory demands while maintaining generalization performance. Moreover, since the collected data may contain sensitive information, it is also of great significance to study privacy-preserving machine learning algorithms. This paper focuses on the performance of the differentially private stochastic gradient descent (SGD) algorithm based on random features. First, the algorithm maps the original data into a low-dimensional space, thereby avoiding the large-scale data storage requirement of traditional kernel methods. Next, the algorithm iteratively optimizes the parameters by stochastic gradient descent. Finally, the output perturbation mechanism is employed to inject random noise, ensuring algorithmic privacy. We prove that the proposed algorithm satisfies differential privacy while achieving fast convergence rates under mild conditions.
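To make the three steps described in the abstract concrete, the sketch below gives one possible instantiation in Python/NumPy: random Fourier features approximating a Gaussian kernel, a single-pass least-squares SGD, and output perturbation with Gaussian noise. The feature dimension D, step size eta, sensitivity bound B, and the Gaussian-mechanism noise scale are illustrative assumptions; the paper's analysis derives the actual constants from the step-size schedule and regularity conditions, which this sketch does not reproduce.

```python
import numpy as np

def dp_sgd_random_features(X, y, D=200, gamma=1.0, eta=0.1,
                           epsilon=1.0, delta=1e-5, B=1.0, seed=0):
    """One-pass SGD on random Fourier features followed by output perturbation.

    The random features approximate the Gaussian kernel
    k(x, x') = exp(-gamma * ||x - x'||^2). B is an assumed bound on the
    L2-sensitivity of the final iterate; the paper fixes this constant
    from the step sizes and properties of the loss.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape

    # Random feature map phi(x) = sqrt(2/D) * cos(x W + b)
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    phi = lambda Z: np.sqrt(2.0 / D) * np.cos(Z @ W + b)

    Phi = phi(X)
    w = np.zeros(D)
    for t in range(n):                        # single pass over the sample
        grad = (Phi[t] @ w - y[t]) * Phi[t]   # least-squares loss gradient
        w -= eta * grad                       # SGD update

    # Output perturbation: Gaussian noise calibrated to (epsilon, delta)-DP
    sigma = B * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    w_private = w + rng.normal(scale=sigma, size=D)

    predict = lambda Xnew: phi(Xnew) @ w_private
    return w_private, predict
```

Because the noise is added only once to the released parameter vector, the training loop itself runs exactly as non-private SGD on the low-dimensional features, which is the computational advantage of output perturbation over per-iteration noise injection.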