Abstract: For minimization problems whose objective function is the sum of two functions without coupled variables and whose constraints are linear, the alternating direction method of multipliers (ADMM) has exhibited its efficiency, and its convergence is well understood. When either the number of separable functions involved exceeds two, or one of the functions is nonconvex, ADMM or its directly extended version may fail to converge. In this paper, we consider multi-block separable optimization problems with linear constraints in which the component functions involved need not be convex. Under the assumption that the associated function satisfies the Kurdyka-Lojasiewicz inequality, we prove that any cluster point of the iterative sequence generated by ADMM is a critical point, under the mild condition that the penalty parameter is sufficiently large. We also present sufficient conditions guaranteeing sublinear and linear rates of convergence of the algorithm.
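For orientation, the following is a generic sketch of the multi-block ADMM iteration described above, written for the model problem min Σ_i f_i(x_i) subject to Σ_i A_i x_i = b; the symbols f_i, A_i, b, the penalty parameter β, and the multiplier λ are illustrative notation and are not taken from the paper itself.

\[
\mathcal{L}_\beta(x_1,\dots,x_m,\lambda) = \sum_{i=1}^{m} f_i(x_i) - \Big\langle \lambda,\ \sum_{i=1}^{m} A_i x_i - b \Big\rangle + \frac{\beta}{2}\Big\|\sum_{i=1}^{m} A_i x_i - b\Big\|^2,
\]
\[
x_i^{k+1} \in \operatorname*{argmin}_{x_i}\ \mathcal{L}_\beta\big(x_1^{k+1},\dots,x_{i-1}^{k+1},\, x_i,\, x_{i+1}^{k},\dots,x_m^{k},\, \lambda^{k}\big), \qquad i = 1,\dots,m,
\]
\[
\lambda^{k+1} = \lambda^{k} - \beta\Big(\sum_{i=1}^{m} A_i x_i^{k+1} - b\Big).
\]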
Abstract: In this paper, we address a two-block linearly constrained nonseparable nonconvex optimization problem with coupling terms. The classical algorithm for such problems, the alternating direction method of multipliers (ADMM), still requires, to the best of current knowledge, a gradient Lipschitz continuity condition on the objective function to ensure global convergence. However, many practical applications do not satisfy this smoothness condition. In this study, we establish the convergence of a Bregman ADMM variant for the problem with coupling terms, thereby circumventing the requirement of global Lipschitz continuity of the gradient. We show that the iterative sequence generated by our approach converges to a critical point of the problem when the corresponding function satisfies the Kurdyka-Lojasiewicz inequality and certain additional assumptions hold. In addition, we analyze the convergence rate of the algorithm.
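As a hedged sketch only: a standard Bregman ADMM scheme for the two-block model problem min f(x) + g(y) + H(x,y) subject to Ax + By = b, where D_φ and D_ψ denote Bregman distances; the exact variant analyzed in the paper may differ in the choice of the distance-generating functions and the placement of the Bregman terms.

\[
\mathcal{L}_\beta(x,y,\lambda) = f(x) + g(y) + H(x,y) - \langle \lambda,\ Ax + By - b\rangle + \frac{\beta}{2}\|Ax + By - b\|^2,
\]
\[
x^{k+1} \in \operatorname*{argmin}_{x}\ \Big\{ \mathcal{L}_\beta(x, y^{k}, \lambda^{k}) + D_{\varphi}(x, x^{k}) \Big\},
\]
\[
y^{k+1} \in \operatorname*{argmin}_{y}\ \Big\{ \mathcal{L}_\beta(x^{k+1}, y, \lambda^{k}) + D_{\psi}(y, y^{k}) \Big\},
\]
\[
\lambda^{k+1} = \lambda^{k} - \beta\big(Ax^{k+1} + By^{k+1} - b\big).
\]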
Abstract: For nonconvex nonsmooth minimization problems whose objective function contains a coupling term H(x,y), a linearized inertial alternating direction method of multipliers (Linear Inertial Alternating Direction Method of Multipliers, LIADMM) is proposed. To simplify the solution of the subproblems, the coupling function H(x,y) in the objective is linearized, and an inertial effect is introduced into the x-subproblem. Under suitable assumptions, the global convergence of the algorithm is established; moreover, by introducing an auxiliary function satisfying the Kurdyka-Lojasiewicz inequality, the strong convergence of the algorithm is verified. Two numerical experiments show that the algorithm with the inertial effect exhibits better convergence behavior than the one without it.
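A hedged sketch of the linearization-plus-inertia idea described above, for the model problem min f(x) + g(y) + H(x,y) subject to Ax + By = b; the extrapolation parameter α, the proximal constant τ, and the exact placement of the linearized coupling term are illustrative assumptions rather than the paper's precise scheme.

\[
\hat{x}^{k} = x^{k} + \alpha\,(x^{k} - x^{k-1}) \quad (\text{inertial extrapolation in the } x\text{-subproblem}),
\]
\[
x^{k+1} \in \operatorname*{argmin}_{x}\ \Big\{ f(x) + \big\langle \nabla_x H(\hat{x}^{k}, y^{k}),\ x \big\rangle - \langle \lambda^{k}, Ax\rangle + \frac{\beta}{2}\|Ax + By^{k} - b\|^2 + \frac{\tau}{2}\|x - \hat{x}^{k}\|^2 \Big\},
\]
\[
y^{k+1} \in \operatorname*{argmin}_{y}\ \Big\{ g(y) + H(x^{k+1}, y) - \langle \lambda^{k}, By\rangle + \frac{\beta}{2}\|Ax^{k+1} + By - b\|^2 \Big\},
\]
\[
\lambda^{k+1} = \lambda^{k} - \beta\big(Ax^{k+1} + By^{k+1} - b\big).
\]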