Abstract
Federated learning is a distributed machine learning paradigm in which a central server trains an optimal global model by collaborating with numerous remote devices. Federated learning currently faces two key challenges: system heterogeneity and statistical heterogeneity. This paper addresses the slow, or even failed, convergence of the global model caused by heterogeneity and proposes a federated learning algorithm based on implicit stochastic gradient descent optimization. Unlike the traditional update scheme in federated learning, the proposed method uses the locally uploaded model parameters to approximate the average global gradient, avoiding the explicit computation of first-order derivatives, and updates the global model parameters via gradient descent, so that the global model achieves faster and more stable convergence with fewer communication rounds. In experiments simulating different levels of heterogeneity, the proposed algorithm converges considerably faster and more stably than FedAvg and FedProx. Under the same convergence target, the proposed method reduces the number of communication rounds by approximately 50% compared with FedProx on highly heterogeneous synthetic datasets, considerably improving the stability and robustness of federated learning.
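The idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each client solves a proximal local subproblem (here, a quadratic least-squares objective with a closed-form solution), so that the gradient at the local solution equals `(w_global - w_i) / eta` by the implicit (proximal) SGD optimality condition. The server averages these implicit gradients and takes a gradient-descent step, without ever evaluating first-order derivatives directly. All names (`local_prox_solve`, `server_update`, `eta`) and the synthetic quadratic clients are illustrative assumptions.

```python
import numpy as np

def local_prox_solve(w_global, A, b, eta):
    """Client step (hypothetical): solve the proximal subproblem
    min_w 0.5*||A w - b||^2 + 1/(2*eta)*||w - w_global||^2 in closed form."""
    d = w_global.shape[0]
    return np.linalg.solve(A.T @ A + np.eye(d) / eta, A.T @ b + w_global / eta)

def server_update(w_global, client_params, eta, server_lr=1.0):
    """Server step: the optimality condition of each proximal subproblem gives
    grad f_i(w_i) = (w_global - w_i) / eta, so averaging the uploaded parameters
    yields an implicit estimate of the average global gradient."""
    g = np.mean([(w_global - w_i) / eta for w_i in client_params], axis=0)
    return w_global - server_lr * g

# Simulate a few clients with heterogeneous quadratic objectives.
rng = np.random.default_rng(0)
d, n_clients, eta = 5, 4, 0.5
clients = [(rng.standard_normal((20, d)), rng.standard_normal(20))
           for _ in range(n_clients)]

def global_loss(w):
    return sum(0.5 * np.sum((A @ w - b) ** 2) for A, b in clients) / n_clients

w = np.zeros(d)
losses = [global_loss(w)]
for _ in range(30):  # communication rounds
    params = [local_prox_solve(w, A, b, eta) for A, b in clients]
    w = server_update(w, params, eta)
    losses.append(global_loss(w))
```

With `server_lr=1.0` the update reduces to averaging the clients' proximal solutions, which connects this implicit-gradient view to FedProx-style aggregation; a separate server learning rate is what allows the plain gradient-descent interpretation of the global update.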
Authors
DOU Yonggan; YUAN Xiaotong (School of Automation, Nanjing University of Information Science and Technology, Nanjing 210044, China; Jiangsu Key Laboratory of Big Data Analysis Technology, Nanjing 210044, China)
Source
CAAI Transactions on Intelligent Systems (《智能系统学报》)
Indexed in CSCD and the Peking University Core Journal list
2022, No. 3, pp. 488-495 (8 pages)
Funding
National Natural Science Foundation of China (61876090, 61936005)
Science and Technology Innovation 2030 "New Generation Artificial Intelligence" Major Project (2018AAA0100400)
Keywords
federated learning
distributed machine learning
central server
global model
implicit stochastic gradient descent
statistical heterogeneity
system heterogeneity
optimization algorithm
faster convergence