Abstract
In this paper, two new subspace minimization conjugate gradient methods based on p-regularization models are proposed, in which a special scaled norm in the p-regularization model is analyzed. Different choices of the scaled norm lead to different solutions of the p-regularized subproblem. Based on analyses of these solutions in a two-dimensional subspace, we derive new search directions satisfying the sufficient descent condition. With a modified nonmonotone line search, we establish the global convergence of the proposed methods under mild assumptions. R-linear convergence of the proposed methods is also analyzed. Numerical results on the CUTEr library show that the proposed methods are superior to four conjugate gradient methods proposed by Hager and Zhang (SIAM J. Optim. 16(1):170–192, 2005), Dai and Kou (SIAM J. Optim. 23(1):296–320, 2013), Liu and Liu (J. Optim. Theory Appl. 180(3):879–906, 2019), and Li et al. (Comput. Appl. Math. 38(1), 2019), respectively.
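For context, the framework referred to above solves, at each iteration, a p-regularized model restricted to a two-dimensional subspace. A minimal sketch of such a subproblem, using illustrative notation ($m_k$, $B_k$, $\sigma_k$, $\Omega_k$) rather than the authors' exact formulation, is

$$\min_{d \in \Omega_k} \; m_k(d) = g_k^{T} d + \frac{1}{2}\, d^{T} B_k d + \frac{\sigma_k}{p}\, \|d\|^{p}, \qquad \Omega_k = \operatorname{span}\{g_k,\, s_{k-1}\}, \quad p > 2,$$

where $g_k = \nabla f(x_k)$, $s_{k-1} = x_k - x_{k-1}$, $B_k$ is a Hessian approximation, $\sigma_k > 0$ is the regularization parameter, and $\|\cdot\|$ is the scaled norm whose choice is analyzed in the paper (cf. Hsia, Sheu, and Yuan 2017; Yuan and Stoer 1995).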
References
Andrei, N.: An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 65, 859–874 (2014)
Cristofari, A., Dehghan Niri, T., Lucidi, S.: On global minimizers of quadratic functions with cubic regularization. Optim. Lett. 13, 1269–1283 (2019)
Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)
Bellavia, S., Morini, B., Cartis, C., Gould, N.I.M., Toint, P.L.: Convergence of a regularized Euclidean residual algorithm for nonlinear least-squares. SIAM J. Numer. Anal. 48, 1–29 (2010)
Bellavia, S., Morini, B.: Strong local convergence properties of adaptive regularized methods for nonlinear least squares. IMA J. Numer. Anal. 35, 947–968 (2014)
Benson, H.Y., Shanno, D.F.: Interior-point methods for nonconvex nonlinear programming: cubic regularization. Comput. Optim. Appl. 58, 323–346 (2014)
Bianconcini, T., Liuzzi, G., Morini, B., Sciandrone, M.: On the use of iterative methods in cubic regularization for unconstrained optimization. Comput. Optim. Appl. 60, 35–57 (2015)
Bianconcini, T., Sciandrone, M.: A cubic regularization algorithm for unconstrained optimization using line search and nonmonotone techniques. Optim. Methods Softw. 31, 1008–1035 (2016)
Birgin, E.G., Gardenghi, J.L., Martínez, J.M., Santos, S.A., Toint, P.L.: Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models. Math. Program. 163, 359–368 (2017)
Cartis, C., Gould, N.I.M., Toint, P.L.: Adaptive cubic regularisation methods for unconstrained optimization. Part I: motivation, convergence and numerical results. Math. Program. 127, 245–295 (2011)
Cartis, C., Gould, N.I.M., Toint, P.L.: Adaptive cubic regularisation methods for unconstrained optimization. Part II: worst-case function- and derivative-evaluation complexity. Math. Program. 130, 295–319 (2011)
Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001)
Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177–182 (1999)
Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23, 296–320 (2013)
Dai, Y.H., Kou, C.X.: A Barzilai-Borwein conjugate gradient method. Sci. China Math. 59, 1511–1524 (2016)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
Dussault, J.P.: Simple unified convergence proofs for the trust-region and a new ARC variant. Tech. Rep., University of Sherbrooke, Sherbrooke, Canada (2015)
Fatemi, M.: A new efficient conjugate gradient method for unconstrained optimization. J. Comput. Appl. Math. 300, 207–216 (2016)
Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)
Gould, N.I.M., Orban, D.: CUTEr and SifDec: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29, 373–394 (2003)
Gould, N.I.M., Porcelli, M.: Updating the regularization parameter in the adaptive cubic regularization algorithm. Comput. Optim. Appl. 53, 1–22 (2012)
Gould, N.I.M., Robinson, D.P., Thorne, H.S.: On solving trust-region and other regularised subproblems in optimization. Math. Program. Comput. 2, 21–57 (2010)
Griewank, A.: The modification of Newton’s method for unconstrained optimization by bounding cubic terms. Technical Report NA/12, Department of Applied Mathematics and Theoretical Physics, University of Cambridge (1981)
Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)
Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2, 35–58 (2006)
Hager, W.W., Zhang, H.: The limited memory conjugate gradient method. SIAM J. Optim. 23, 2150–2168 (2013)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)
Li, L.B.: A new algorithm for solving large scale trust region subproblem. Oper. Res. Manag. Sci. 16, 48–52 (2007)
Li, M., Liu, H.W., Liu, Z.X.: A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization. Numer. Algorithms 79, 195–219 (2018)
Li, Y.F., Liu, Z.X., Liu, H.W.: A subspace minimization conjugate gradient method based on conic model for unconstrained optimization. Comput. Appl. Math. 38 (2019)
Liu, H.W., Liu, Z.X.: An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization. J. Optim. Theory Appl. 180, 879–906 (2019)
Liu, Z.X., Liu, H.W.: Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization. J. Comput. Appl. Math. 328, 400–413 (2018)
Liu, Z.X., Liu, H.W.: An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer. Algorithms 78, 21–39 (2018)
Liu, D.C., Nocedal, J.: On the limited memory BFGS method for large scale optimization. Math. Program. 45, 503–528 (1989)
Necoara, I., Nesterov, Yu., Glineur, F.: Linear convergence of first order methods for non-strongly convex optimization. Math. Program., Ser. A 175, 69–107 (2018)
Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton’s method and its global performance. Math. Program. 108, 177–205 (2006)
Collier, N., Kaw, A., Paul, J., Keteltas, M.: Newton-Raphson method: graphical simulation of the method. University of South Florida. http://numericalmethods.eng.usf.edu/mws (2003)
Nesterov, Y.: Accelerating the cubic regularization of Newton’s method on convex problems. Math. Program. 112, 159–181 (2008)
Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9, 94–112 (1969)
Pytlak, R.: Conjugate Gradient Algorithms in Nonconvex Optimization. Springer, Berlin (2009)
Rivaie, M., Mamat, M., Abashar, A.: A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches. Appl. Math. Comput. 268, 1152–1163 (2015)
Sun, W.Y.: On nonquadratic model optimization methods. Asia Pac. J. Oper. Res. 13, 43–63 (1996)
Weiser, M., Deuflhard, P., Erdmann, B.: Affine conjugate adaptive Newton methods for nonlinear elastomechanics. Optim. Methods Softw. 22, 413–431 (2007)
Yang, Y.T., Chen, Y.T., Lu, Y.L.: A subspace conjugate gradient algorithm for large-scale unconstrained optimization. Numer. Algorithms 76, 813–828 (2017)
Hsia, Y., Sheu, R.L., Yuan, Y.X.: Theory and application of p-regularized subproblems for p > 2. Optim. Methods Softw. 32, 1059–1077 (2017)
Yuan, Y.X., Sun, W.Y.: Optimization Theory and Methods. Science Press, Beijing (1997)
Yuan, Y.X., Stoer, J.: A subspace study on conjugate gradient algorithms. Z. Angew. Math. Mech. 75, 69–77 (1995)
Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11, 325–332 (1991)
Yuan, Y.X.: A review on subspace methods for nonlinear optimization. In: Proceedings of the International Congress of Mathematicians 2014, Seoul, Korea, pp. 807–827 (2014)
Yuan, Y.X.: Subspace methods for large scale nonlinear equations and nonlinear least squares. Optim. Eng. 10, 207–218 (2009)
Zhang, H., Hager, W.W.: A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 14, 1043–1056 (2004)
Zhou, B., Gao, L., Dai, Y.H.: Gradient methods with adaptive stepsizes. Comput. Optim. Appl. 35, 69–86 (2006)
Acknowledgments
We would like to thank the anonymous referees for their useful comments. We would also like to thank Professors W.W. Hager and H. Zhang for their CG_DESCENT (5.3) code, and Professor Y.H. Dai and Dr. C.X. Kou for their CGOPT code.
Funding
This research is supported by the National Natural Science Foundation of China (No. 11901561), the Guangxi Natural Science Foundation (No. 2018GXNSFBA281180), and the China Postdoctoral Science Foundation (No. 2019M660833).
Cite this article
Zhao, T., Liu, H. & Liu, Z. New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization. Numer Algor 87, 1501–1534 (2021). https://doi.org/10.1007/s11075-020-01017-1
Keywords
- Conjugate gradient method
- p-regularization model
- Subspace technique
- Nonmonotone line search
- Unconstrained optimization