
Pseudo-label based semi-supervised learning in the distributed machine learning framework

Abstract: With the emergence of various intelligent applications, machine learning technologies face many challenges, including large-scale models, application-oriented real-time datasets, and the limited capabilities of nodes in practice. Therefore, distributed machine learning (DML) and semi-supervised learning methods, which help solve these problems, have received attention in both academia and industry. In this paper, the semi-supervised learning method and the data-parallel DML framework are combined. The pseudo-label based local loss function for each distributed node is studied, and the stochastic gradient descent (SGD) based distributed parameter update principle is derived. A demo that implements pseudo-label based semi-supervised learning in the DML framework is conducted, and the CIFAR-10 dataset for target classification is used to evaluate the performance. Experimental results confirm the convergence and accuracy of the model using pseudo-label based semi-supervised learning in the DML framework. Given that the proportion of the pseudo-label dataset is 20%, the accuracy of the model exceeds 90% when the number of local parameter update steps between two global aggregations is less than 5. Besides, fixing the global aggregation interval to 3, the model converges with acceptable performance degradation when the proportion of the pseudo-label dataset varies from 20% to 80%.
Authors: WANG Xiaoxi, WU Wenjun, YANG Feng, SI Pengbo, ZHANG Xuanyi, ZHANG Yanhua (Faculty of Information Technology, Beijing University of Technology, Beijing 100124, P.R. China; Beijing Capital International Airport Co., Ltd., Beijing 101317, P.R. China)
Source: 《High Technology Letters》 (EI, CAS), 2022, Issue 2, pp. 172-180 (9 pages)
Funding: Supported by the National Key R&D Program of China (No. 2020YFC1807904), the Natural Science Foundation of Beijing Municipality (No. L192002), and the National Natural Science Foundation of China (No. U1633115).
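The training scheme described in the abstract — a pseudo-label based local loss at each node, a few local SGD steps, then a global parameter aggregation — can be sketched as follows. This is a minimal illustration on synthetic logistic-regression data, not the authors' implementation: the model, the confidence threshold, the pseudo-label weight `alpha`, and the averaging-based aggregation are all assumptions for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def local_loss_grad(w, X_lab, y_lab, X_unlab, alpha=0.5, thresh=0.8):
    """Gradient of an assumed pseudo-label local loss: cross-entropy on the
    labeled set plus an alpha-weighted term on confident pseudo-labeled samples."""
    p_lab = sigmoid(X_lab @ w)
    g = X_lab.T @ (p_lab - y_lab) / len(y_lab)        # supervised gradient
    p_un = sigmoid(X_unlab @ w)
    conf = np.maximum(p_un, 1.0 - p_un) >= thresh      # keep confident samples only
    if conf.any():
        pseudo = (p_un[conf] >= 0.5).astype(float)     # hard pseudo-labels
        g += alpha * X_unlab[conf].T @ (p_un[conf] - pseudo) / conf.sum()
    return g

# Synthetic binary data split across 4 workers; 20% of each shard is labeled.
n_workers, tau, lr, d = 4, 3, 0.5, 5                   # tau = local steps per round
w_true = rng.normal(size=d)
workers = []
for _ in range(n_workers):
    X = rng.normal(size=(200, d))
    y = (X @ w_true > 0).astype(float)
    workers.append((X[:40], y[:40], X[40:]))           # (labeled X, labels, unlabeled X)

w_global = np.zeros(d)
for _ in range(30):                                    # 30 global aggregation rounds
    local_ws = []
    for X_lab, y_lab, X_unlab in workers:
        w = w_global.copy()
        for _ in range(tau):                           # tau local SGD steps
            w -= lr * local_loss_grad(w, X_lab, y_lab, X_unlab)
        local_ws.append(w)
    w_global = np.mean(local_ws, axis=0)               # global parameter averaging

# Sanity check: accuracy on the pooled labeled data.
X_all = np.vstack([wk[0] for wk in workers])
y_all = np.concatenate([wk[1] for wk in workers])
acc = ((sigmoid(X_all @ w_global) > 0.5) == y_all).mean()
print(f"accuracy on labeled data: {acc:.2f}")
```

Averaging local parameters every `tau` steps mirrors the paper's setting, where the interval between global aggregations (the value studied as 3 and 5 in the abstract) trades communication cost against model consistency.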