Abstract
The primal-dual hybrid gradient method is a classic way to tackle saddle-point problems. However, its convergence is not guaranteed in general; restrictions on the step-size parameters, e.g., τσ ≤ 1/||A^T A||, are imposed to guarantee convergence. In this paper, a new convergent method with no restriction on the parameters is proposed, so the expensive computation of ||A^T A|| is avoided. Like other primal-dual methods, this method produces a predictor, but it does so in a parallel fashion, which has the potential to speed up the method. The new iterate is then updated by a simple correction to guarantee convergence. Moreover, the parameters are adjusted dynamically to enhance both the efficiency and the robustness of the method. The generated sequence converges monotonically to the solution set. A worst-case O(1/t) convergence rate in the ergodic sense is also established under mild assumptions. The numerical efficiency of the proposed method is verified by applications to the LASSO problem and the Steiner tree problem.
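For context, the classic primal-dual hybrid gradient scheme that the abstract refers to can be sketched for the LASSO problem min_x (1/2)||Ax − b||² + λ||x||₁. This is a minimal sketch of the standard (Chambolle–Pock style) iteration under the usual step-size restriction τσ||AᵀA|| ≤ 1, not the paper's new restriction-free prediction-correction method; all function and variable names here are illustrative.

```python
import numpy as np

def pdhg_lasso(A, b, lam, n_iter=2000):
    """Classic PDHG for min_x 0.5*||Ax-b||^2 + lam*||x||_1, written as the
    saddle-point problem  min_x max_y <Ax, y> - f*(y) + lam*||x||_1,
    where f*(y) = 0.5*||y||^2 + <b, y> is the conjugate of 0.5*||u - b||^2.
    Step sizes obey the classic restriction tau*sigma*||A^T A|| <= 1
    (the restriction the paper's new method removes)."""
    m, n = A.shape
    L = np.linalg.norm(A, 2)        # spectral norm ||A||, so ||A^T A|| = L^2
    tau = sigma = 0.99 / L          # ensures tau * sigma * ||A^T A|| < 1
    x = np.zeros(n)
    y = np.zeros(m)
    x_bar = x.copy()                # extrapolated primal point
    for _ in range(n_iter):
        # dual ascent step: prox of sigma * f*
        y = (y + sigma * (A @ x_bar - b)) / (1.0 + sigma)
        # primal descent step: soft-thresholding (prox of tau*lam*||.||_1)
        z = x - tau * (A.T @ y)
        x_new = np.sign(z) * np.maximum(np.abs(z) - tau * lam, 0.0)
        # extrapolation of the primal iterate
        x_bar = 2.0 * x_new - x
        x = x_new
    return x
```

The dual update uses the closed-form proximal map of f*(y) = 0.5||y||² + ⟨b, y⟩, and the primal update is the usual soft-thresholding step; the paper's contribution, per the abstract, is a parallel predictor plus a simple correction that makes such a scheme convergent without tuning τσ against ||AᵀA||.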
Funding
This research is supported by the National Natural Science Foundation of China (Nos. 71201080, 71571096), the Social Science Foundation of Jiangsu Province (No. 14GLC001), and the Fundamental Research Funds for the Central Universities (No. 020314380016).