
Min Li; Xiaoming Yuan
The augmented Lagrangian method with full Jacobian decomposition and logarithmic-quadratic proximal regularization for multiple-block separable convex programming
SMAI Journal of Computational Mathematics, 4 (2018), p. 81-120
Class. Math.: 90C25, 90C33, 65K05
Keywords: convex programming, splitting methods, augmented Lagrangian method, logarithmic-quadratic proximal, parallel computation, convergence rate

Abstract
We consider a separable convex minimization model whose variables are coupled by linear constraints and subject to positive orthant constraints, and whose objective function is the sum of $m$ functions without coupled variables. It is well recognized that when the augmented Lagrangian method (ALM) is applied to some concrete applications, the resulting subproblem at each iteration should be decomposed so as to generate solvable subproblems. When the Gauss-Seidel decomposition is implemented, this idea has inspired the alternating direction method of multipliers (for $m=2$) and its variants (for $m\ge 3$). When the Jacobian decomposition is considered, it has been shown that the ALM with Jacobian decomposition in its subproblem is not necessarily convergent even when $m=2$, and it was suggested to regularize the decomposed subproblems with quadratic proximal terms to ensure convergence. In this paper, we focus on the multiple-block case with $m\ge 3$. We implement the full Jacobian decomposition on the ALM's subproblems and use logarithmic-quadratic proximal (LQP) terms to regularize the decomposed subproblems. The resulting subproblems are all unconstrained minimization problems, because the positive orthant constraints are inactive, and they are fully eligible for parallel computation. Accordingly, the ALM with full Jacobian decomposition and LQP regularization is proposed. We also consider an inexact version that allows the subproblems to be solved inexactly.
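As a sketch of the setting the abstract describes (the notation here is my own and may differ from the paper's), the model and one LQP-regularized, fully Jacobian-decomposed ALM subproblem can be written as:

```latex
% Separable model: m blocks coupled only through a linear constraint
\min \Big\{ \sum_{i=1}^{m} \theta_i(x_i) \;\Big|\; \sum_{i=1}^{m} A_i x_i = b,\ x_i \ge 0 \Big\}
% Full Jacobian decomposition of the ALM subproblem: every block i is updated
% in parallel from the previous iterate (x^k, \lambda^k); the LQP proximal term
% D(\cdot,\cdot) acts as an interior barrier, so the positivity constraint is
% never active and the subproblem is effectively unconstrained
x_i^{k+1} = \arg\min_{x_i > 0}\ \theta_i(x_i) - (\lambda^k)^{\top} A_i x_i
  + \frac{\beta}{2}\Big\| A_i x_i + \sum_{j \ne i} A_j x_j^k - b \Big\|^2
  + r\, D(x_i, x_i^k)
```

Because each block's update depends only on the previous iterate $(x^k, \lambda^k)$, the $m$ subproblems are independent and can be solved in parallel, which is the point of the Jacobian (rather than Gauss-Seidel) decomposition.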
For both the exact and inexact versions, we comprehensively discuss their convergence, including global convergence, worst-case convergence rates measured by iteration complexity in both the ergodic and nonergodic senses, and linear convergence rates under additional assumptions. Some preliminary numerical results are reported to demonstrate the efficiency of the ALM with full Jacobian decomposition and LQP regularization.
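As a toy numeric illustration of the parallel update (a minimal sketch under my own simplifying assumptions — scalar blocks, $A_i = 1$, quadratic objectives, and parameter choices of my own; it is not the paper's algorithm verbatim), the scheme can look like this. With the classical LQP kernel $D(u,v) = v^2\,[\tfrac12(u/v-1)^2 + \mu(u/v - \log(u/v) - 1)]$, each regularized subproblem here reduces to a scalar quadratic equation whose positive root is available in closed form:

```python
import math

# Hypothetical example (my own notation): minimize sum_i 0.5*(x_i - c_i)^2
# subject to sum_i x_i = b and x_i > 0, via ALM with full Jacobian
# decomposition and LQP regularization. The LQP term's derivative
# (u - v) + mu*(v - v^2/u) contains a -v^2/u barrier, so every update
# stays strictly positive and no projection onto the orthant is needed.

def lqp_jacobian_alm(c, b, beta=1.0, r=None, mu=0.5, iters=1000):
    m = len(c)
    r = float(m) if r is None else r   # proximal weight; heuristic choice here
    x = [1.0] * m                       # strictly positive starting point
    lam = 0.0                           # multiplier of the constraint sum(x) = b
    for _ in range(iters):
        s = sum(x)
        x_new = []
        for i in range(m):              # updates are independent -> parallelizable
            v = x[i]
            # Stationarity of block i's regularized subproblem:
            #   (1 + beta + r)*u + q - (r*mu*v^2)/u = 0
            a = 1.0 + beta + r
            q = -c[i] - lam + beta * (s - b - v) - r * v + r * mu * v
            d = r * mu * v * v
            # positive root of a*u^2 + q*u - d = 0 (exists since d > 0)
            x_new.append((-q + math.sqrt(q * q + 4.0 * a * d)) / (2.0 * a))
        x = x_new
        lam -= beta * (sum(x) - b)      # dual update
    return x, lam
```

For instance, with `c = [1, 2, 3]` and `b = 6` the unconstrained minimizer already satisfies the constraint, so the iterates should approach `x = [1, 2, 3]` with multiplier near zero while staying strictly positive throughout; without the proximal term, a plain Jacobian splitting of the ALM subproblem need not converge at all, which is the failure mode the LQP regularization is designed to repair.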
