Abstract: For nonconvex nonsmooth minimization problems whose objective contains a coupling function H(x,y), a Linear Inertial Alternating Direction Method of Multipliers (LIADMM) is proposed. To simplify the solution of the subproblems, the coupling function H(x,y) in the objective is linearized, and an inertial effect is introduced into the x-subproblem. Under suitable assumptions, the global convergence of the algorithm is established; furthermore, by introducing an auxiliary function satisfying the Kurdyka-Łojasiewicz inequality, the strong convergence of the algorithm is verified. Two numerical experiments show that the algorithm with the inertial effect achieves better convergence performance than its counterpart without it.
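The abstract does not spell out the iteration, but a typical linearized inertial ADMM scheme for min_{x,y} f(x) + g(y) + H(x,y) subject to Ax + By = b can be sketched as below. The extrapolation parameter α, penalty β, proximal weights μ and ν, and the choice of linearizing H at the inertial point are illustrative assumptions, not necessarily the paper's exact scheme.

\begin{align*}
\hat{x}^{k} &= x^{k} + \alpha\,(x^{k} - x^{k-1}), \qquad \alpha \in [0,1), \\
x^{k+1} &= \operatorname*{arg\,min}_{x}\; f(x) + \big\langle \nabla_x H(\hat{x}^{k}, y^{k}),\, x \big\rangle + \frac{\beta}{2}\Big\| Ax + By^{k} - b + \tfrac{\lambda^{k}}{\beta} \Big\|^{2} + \frac{\mu}{2}\big\| x - \hat{x}^{k} \big\|^{2}, \\
y^{k+1} &= \operatorname*{arg\,min}_{y}\; g(y) + \big\langle \nabla_y H(x^{k+1}, y^{k}),\, y \big\rangle + \frac{\beta}{2}\Big\| Ax^{k+1} + By - b + \tfrac{\lambda^{k}}{\beta} \Big\|^{2} + \frac{\nu}{2}\big\| y - y^{k} \big\|^{2}, \\
\lambda^{k+1} &= \lambda^{k} + \beta\,(Ax^{k+1} + By^{k+1} - b).
\end{align*}

Linearizing H reduces each subproblem to a proximal step on f or g alone, which is what makes the subproblems easy to solve; the inertial point \hat{x}^{k} enters only the x-update, matching the abstract's statement that the inertial effect is introduced in the x-subproblem.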
Funding: Project supported by the National Key R&D Program of China (No. 2018YFB2101100) and the National Natural Science Foundation of China (Nos. 61806216 and 61702533).
Abstract: The era of big data in healthcare has arrived, and it stands to significantly improve medicine, oncology in particular. However, traditional machine learning algorithms must be adapted to such large-scale real-world problems, both because of the sheer volume of data to be analyzed and because of the difficulty of solving problems in nonconvex nonlinear settings. We aim to minimize the composite of a smooth nonlinear function and a block-separable nonconvex function over a large number of block variables subject to inequality constraints. We propose a novel parallel first-order optimization method, called asynchronous block coordinate descent with time perturbation (ATP), which adopts a time perturbation technique to escape saddle points and sub-optimal local points. We present the method in detail, together with analyses of its convergence and iteration complexity. Experiments on real-world machine learning problems validate the efficacy of the proposed method: the results show that time perturbation enables ATP to escape saddle points and sub-optimal points, offering a promising way to handle nonconvex optimization problems with inequality constraints via asynchronous block coordinate descent. An asynchronous parallel implementation on shared-memory multi-core platforms indicates that ATP has strong scalability.
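The abstract gives no pseudocode, so the following Python sketch illustrates the general pattern it describes: several threads share one iterate, each repeatedly picks a block, takes a gradient step on that block from a possibly stale snapshot, adds a perturbation whose scale decays with a global update count, and projects onto the constraint set. The objective, the x >= 0 constraint, the step size eta, and the sigma/sqrt(t) noise schedule are illustrative assumptions, not ATP's actual specification.

import threading
import numpy as np

# Sketch of asynchronous block coordinate descent with a time-decaying
# perturbation, in the spirit of ATP. Objective, step size, and noise
# schedule are illustrative assumptions, not the paper's specification.

setup_rng = np.random.default_rng(0)
n_blocks, block_size = 8, 16
n = n_blocks * block_size
m = 200
A = setup_rng.standard_normal((m, n))
b = setup_rng.standard_normal(m)
lam = 0.1

x = np.zeros(n)                  # shared iterate (shared-memory model)
counter = [0]                    # global update count, the "time"
counter_lock = threading.Lock()

def block_slice(j):
    return slice(j * block_size, (j + 1) * block_size)

def block_grad(x_snap, j):
    """Gradient of the smooth loss 0.5*||Ax - b||^2 plus a nonconvex
    separable penalty g(t) = t^2 / (1 + t^2), restricted to block j."""
    s = block_slice(j)
    residual = A @ x_snap - b
    smooth = A[:, s].T @ residual
    t = x_snap[s]
    penalty = 2.0 * t / (1.0 + t * t) ** 2
    return smooth + lam * penalty

def worker(seed, n_updates, eta=1e-3, sigma=0.05):
    rng = np.random.default_rng(seed)   # per-thread RNG: Generators are not thread-safe
    for _ in range(n_updates):
        j = rng.integers(n_blocks)
        x_snap = x.copy()               # possibly stale snapshot: this is the asynchrony
        g = block_grad(x_snap, j)
        with counter_lock:
            counter[0] += 1
            t = counter[0]
        # Time perturbation: isotropic noise whose scale decays with the
        # global update count, so early noise helps escape saddle points
        # while late iterations settle down.
        noise = (sigma / np.sqrt(t)) * rng.standard_normal(block_size)
        s = block_slice(j)
        # Projection onto x >= 0 enforces the inequality constraint.
        x[s] = np.maximum(0.0, x_snap[s] - eta * (g + noise))

threads = [threading.Thread(target=worker, args=(seed, 2000)) for seed in range(4)]
for th in threads:
    th.start()
for th in threads:
    th.join()

print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2
      + lam * np.sum(x**2 / (1 + x**2)))

Reading a stale snapshot instead of locking the whole vector is what makes the method asynchronous, and the decaying noise plays the role of the time perturbation that the abstract credits with escaping saddle points; a production implementation would need finer-grained synchronization guarantees than this sketch provides.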