
Differentially private SGD with random features

Abstract: In the realm of large-scale machine learning, it is crucial to explore methods for reducing computational complexity and memory demands while maintaining generalization performance. Additionally, since the collected data may contain sensitive information, it is also of great significance to study privacy-preserving machine learning algorithms. This paper focuses on the performance of the differentially private stochastic gradient descent (SGD) algorithm based on random features. First, the algorithm maps the original data into a low-dimensional space, thereby avoiding the large-scale data storage requirement of traditional kernel methods. Next, the algorithm iteratively optimizes parameters using stochastic gradient descent. Finally, the output perturbation mechanism is employed to introduce random noise, ensuring algorithmic privacy. We prove that the proposed algorithm satisfies differential privacy while achieving fast convergence rates under some mild conditions.
Source: Applied Mathematics (A Journal of Chinese Universities), 高校应用数学学报(英文版)(B辑), SCIE, CSCD, 2024, Issue 1, pp. 1-23.
Funding: Supported by the Zhejiang Provincial Natural Science Foundation of China (LR20A010001) and the National Natural Science Foundation of China (12271473 and U21A20426).
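The abstract outlines a three-step pipeline: a random-feature map that replaces the kernel matrix, one-pass SGD in the resulting low-dimensional space, and output perturbation of the final iterate for differential privacy. The following is a minimal sketch of that pipeline in Python; the function names, the Gaussian-kernel random Fourier features, and the `noise_scale` parameter (standing in for the paper's privacy-calibrated noise level) are illustrative assumptions, not the authors' exact algorithm or constants.

```python
import numpy as np

def random_fourier_features(X, D=200, gamma=1.0, rng=None):
    """Map inputs to a D-dimensional random Fourier feature space
    approximating the Gaussian kernel exp(-gamma * ||x - x'||^2)."""
    rng = np.random.default_rng() if rng is None else rng
    d = X.shape[1]
    W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(d, D))  # spectral frequencies
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)                # random phases
    return np.sqrt(2.0 / D) * np.cos(X @ W + b), (W, b)

def dp_sgd_random_features(X, y, D=200, gamma=1.0, T=None, eta0=0.1,
                           lam=1e-3, noise_scale=0.1, rng=None):
    """One-pass SGD on the regularized squared loss in random-feature space,
    followed by output perturbation (Gaussian noise added to the final iterate).
    `noise_scale` is a placeholder for the noise level a privacy analysis
    would prescribe; it is not the paper's calibration."""
    rng = np.random.default_rng() if rng is None else rng
    Phi, feat = random_fourier_features(X, D, gamma, rng)
    n = Phi.shape[0]
    T = n if T is None else T
    w = np.zeros(D)
    for t in range(T):
        i = rng.integers(n)                       # sample one training example
        pred = Phi[i] @ w
        grad = (pred - y[i]) * Phi[i] + lam * w   # gradient of regularized squared loss
        w -= eta0 / np.sqrt(t + 1) * grad         # decaying step size
    w += rng.normal(0.0, noise_scale, size=D)     # output perturbation for privacy
    return w, feat

if __name__ == "__main__":
    # illustrative usage on synthetic regression data
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 5))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
    w, (W, b) = dp_sgd_random_features(X, y, rng=rng)
    Phi = np.sqrt(2.0 / W.shape[1]) * np.cos(X @ W + b)
    print("train MSE:", np.mean((Phi @ w - y) ** 2))
```

Because the noise is added only once to the released parameter vector, the training loop itself is unchanged from ordinary random-feature SGD; the privacy-utility trade-off enters entirely through how the noise level is calibrated to the algorithm's sensitivity, which is where the paper's analysis applies.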