Proximal Quasi-Newton Method for Composite Optimization over the Stiefel Manifold

Journal of Scientific Computing

Abstract

In this paper, we consider composite optimization problems over the Stiefel manifold. A successful method for this class of problems is the proximal gradient method proposed by Chen et al. (SIAM J Optim 30:210–239, 2020. https://doi.org/10.1137/18M122457X). Motivated by proximal Newton-type techniques in Euclidean space, we present a Riemannian proximal quasi-Newton method, named ManPQN, for solving composite optimization problems. The global convergence of the ManPQN method is proved, and its iteration complexity for obtaining an \(\epsilon \)-stationary point is analyzed. Under some mild conditions, we also establish a local linear convergence result for the ManPQN method. Numerical results are encouraging and show that the proximal quasi-Newton technique can be used to accelerate the proximal gradient method.
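
A typical instance of such a composite problem in this line of work is ℓ1-regularized sparse PCA: minimize \(-\tfrac{1}{2}\,\mathrm{tr}(X^\top A^\top A X) + \mu \Vert X\Vert _1\) subject to \(X^\top X = I\). The sketch below is a minimal NumPy illustration, not the authors' ManPQN implementation: it only shows the Riemannian ingredients such a method relies on (tangent-space projection on the Stiefel manifold, the polar retraction, and the ℓ1 proximal map), combined in a simplified "proximal gradient step, then retract" update. The function names and parameter values (prox_retract_step, mu, t) are assumptions made for the example; ManPG and ManPQN instead solve an ℓ1-regularized subproblem restricted to the tangent space, with ManPQN roughly replacing the scaled-identity term of that subproblem by a quasi-Newton approximation.

```python
import numpy as np

def soft_threshold(X, tau):
    # Elementwise soft-thresholding: the proximal operator of tau * ||X||_1.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def polar_retraction(X, V):
    # Polar retraction onto the Stiefel manifold: map X + V to the nearest
    # matrix with orthonormal columns via its thin SVD.
    U, _, Wt = np.linalg.svd(X + V, full_matrices=False)
    return U @ Wt

def tangent_projection(X, G):
    # Orthogonal projection of G onto the tangent space of St(n, p) at X:
    # P_X(G) = G - X * sym(X^T G).
    XtG = X.T @ G
    return G - X @ ((XtG + XtG.T) / 2.0)

def prox_retract_step(A, X, mu, t):
    # One simplified step for the sparse PCA model
    #   min_{X^T X = I}  -0.5 * tr(X^T A^T A X) + mu * ||X||_1.
    # NOTE: "prox the gradient step, then retract" is only an illustration;
    # ManPG/ManPQN instead solve an l1-regularized subproblem restricted to
    # the tangent space (ManPQN additionally uses a quasi-Newton model there).
    egrad = -(A.T @ (A @ X))               # Euclidean gradient of the smooth part
    rgrad = tangent_projection(X, egrad)   # Riemannian gradient
    Y = soft_threshold(X - t * rgrad, t * mu)
    return polar_retraction(X, Y - X)

# Tiny usage example on random data (all sizes and parameters are arbitrary).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
X = np.linalg.qr(rng.standard_normal((20, 4)))[0]   # a feasible starting point
for _ in range(200):
    X = prox_retract_step(A, X, mu=0.5, t=0.01)
print(np.linalg.norm(X.T @ X - np.eye(4)))          # orthonormality check, ~1e-15
```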

Data Availability

Enquiries about data availability should be directed to the authors.

Notes

  1. Our MATLAB code is available at https://github.com/QinsiWang2022/ManPQN.

References

  1. Absil, P.-A., Hosseini, S.: A collection of nonsmooth Riemannian optimization problems. Int. Ser. Numer. Math. 170, 1–15 (2019). https://doi.org/10.1007/978-3-030-11370-4_1

  2. Absil, P.-A., Mahony, R., Sepulchre, R.: Optimization Algorithms on Matrix Manifolds. Princeton University Press, Princeton (2008)

  3. Beck, A., Teboulle, M.: A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J. Imaging Sci. 2, 183–202 (2009). https://doi.org/10.1137/080716542

  4. Bento, G., Cruz Neto, J., Oliveira, P.: A new approach to the proximal point method: convergence on general Riemannian manifolds. J. Optim. Theory Appl. 168, 743–755 (2016). https://doi.org/10.1007/s10957-015-0861-2

  5. Boumal, N., Absil, P.-A., Cartis, C.: Global rates of convergence for nonconvex optimization on manifolds. IMA J. Numer. Anal. 39(1), 1–33 (2019). https://doi.org/10.1093/imanum/drx080

  6. Chen, S., Ma, S., So, A.M.-C., Zhang, T.: Proximal gradient method for nonsmooth optimization over the Stiefel manifold. SIAM J. Optim. 30(1), 210–239 (2020). https://doi.org/10.1137/18M122457X

  7. Dai, Y.-H.: A nonmonotone conjugate gradient algorithm for unconstrained optimization. J. Syst. Sci. Complex. 15(2), 139–145 (2002)

  8. Davis, T.A., Hu, Y.: The University of Florida sparse matrix collection. ACM Trans. Math. Softw. (TOMS) 38(1), 1–25 (2011)

  9. Ferreira, O., Oliveira, P.: Subgradient algorithm on Riemannian manifolds. J. Optim. Theory Appl. 97, 93–104 (1998). https://doi.org/10.1023/A:1022675100677

  10. Ferreira, O., Oliveira, P.: Proximal point algorithm on Riemannian manifolds. Optimization 51, 257–270 (2002). https://doi.org/10.1080/02331930290019413

  11. Fukushima, M., Qi, L.: A globally and superlinearly convergent algorithm for nonsmooth convex minimization. SIAM J. Optim. 6(4), 1106–1120 (1996). https://doi.org/10.1137/S1052623494278839

  12. Gao, B., Liu, X., Yuan, Y.-X.: Parallelizable algorithms for optimization problems with orthogonality constraints. SIAM J. Sci. Comput. 41(3), A1949–A1983 (2019). https://doi.org/10.1137/18M1221679

  13. Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM J. Numer. Anal. 23(4), 707–716 (1986). https://doi.org/10.1137/0723046

  14. Grohs, P., Hosseini, S.: \(\varepsilon \)-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Adv. Comput. Math. 42(2), 333–360 (2016). https://doi.org/10.1007/s10444-015-9426-z

  15. Hosseini, S., Grohs, P.: Nonsmooth trust region algorithms for locally Lipschitz functions on Riemannian manifolds. IMA J. Numer. Anal. 36(3), 1167–1192 (2016). https://doi.org/10.1093/imanum/drv043

  16. Hu, J., Milzarek, A., Wen, Z., Yuan, Y.-X.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM J. Matrix Anal. Appl. 39(3), 1181–1207 (2018). https://doi.org/10.1137/17M1142478

  17. Huang, W., Wei, K.: Riemannian proximal gradient methods. Math. Program. 194, 371–413 (2022). https://doi.org/10.1007/s10107-021-01632-3

  18. Huang, W., Wei, K.: An extension of fast iterative shrinkage-thresholding to Riemannian optimization for sparse principal component analysis. Numer. Linear Algebra Appl. 29, e2409 (2022). https://doi.org/10.1002/nla.2409

  19. Huang, Y., Liu, H.: On the rate of convergence of projected Barzilai–Borwein methods. Optim. Methods Softw. 30(4), 880–892 (2015). https://doi.org/10.1080/10556788.2015.1004064

  20. Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM J. Optim. 24(3), 1420–1443 (2014). https://doi.org/10.1137/130921428

  21. Li, D., Wang, X., Huang, J.: Diagonal BFGS updates and applications to the limited memory BFGS method. Comput. Optim. Appl. 81, 829–856 (2022). https://doi.org/10.1007/s10589-022-00353-3

  22. Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Math. Program. 178, 215–262 (2019). https://doi.org/10.1007/s10107-018-1285-1

  23. Mordukhovich, B., Yuan, X., Zeng, S., Zhang, J.: A globally convergent proximal Newton-type method in nonsmooth convex optimization. Math. Program. 198, 899–936 (2023). https://doi.org/10.1007/s10107-022-01797-5

  24. Moré, J.J., Sorensen, D.C.: Computing a trust region step. SIAM J. Sci. Stat. Comput. 4(3), 553–572 (1983). https://doi.org/10.1137/0904038

  25. Nakayama, S., Narushima, Y., Yabe, H.: Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions. Comput. Optim. Appl. 79, 127–154 (2021). https://doi.org/10.1007/s10589-021-00264-9

  26. Nesterov, Y.: Gradient methods for minimizing composite functions. Math. Program. Ser. B 140, 125–161 (2013). https://doi.org/10.1007/s10107-012-0629-5

  27. Nesterov, Y.: Lectures on Convex Optimization, 2nd edn. Springer, Cham (2018)

  28. Nocedal, J.: Updating quasi-Newton matrices with limited storage. Math. Comput. 35, 773–782 (1980). https://doi.org/10.2307/2006193

  29. Oviedo, H.: Proximal point algorithm on the Stiefel manifold. Preprint, Optimization Online. http://www.optimization-online.org/DB_FILE/2021/05/8401.pdf (2021)

  30. Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proc. Natl. Acad. Sci. USA 110(46), 18368–18373 (2013). https://doi.org/10.1073/pnas.1318679110

  31. Park, Y., Dhar, S., Boyd, S., Shah, M.: Variable metric proximal gradient method with diagonal Barzilai–Borwein stepsize. In: ICASSP 2020—2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 3597–3601 (2020). https://doi.org/10.1109/ICASSP40776.2020.9054193

  32. Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978). https://doi.org/10.1007/BF01588967

  33. Ring, W., Wirth, B.: Optimization methods on Riemannian manifolds and their application to shape space. SIAM J. Optim. 22(2), 596–627 (2012). https://doi.org/10.1137/11082885X

  34. Sato, H.: Riemannian Newton-type methods for joint diagonalization on the Stiefel manifold with application to independent component analysis. Optimization 66(12), 2211–2231 (2017). https://doi.org/10.1080/02331934.2017.1359592

  35. Sherman, J., Morrison, W.J.: Adjustment of an inverse matrix corresponding to a change in one element of a given matrix. Ann. Math. Stat. 21, 124–127 (1950)

  36. Shor, N.Z.: Application of generalized gradient descent in block programming. Cybernetics 3(3), 43–45 (1967). https://doi.org/10.1007/BF01120005

  37. Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM J. Optim. 27(2), 927–956 (2017). https://doi.org/10.1137/15M1053141

  38. Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Math. Program. 142, 397–434 (2013). https://doi.org/10.1007/s10107-012-0584-1

  39. Wright, S., Nowak, R., Figueiredo, M.: Sparse reconstruction by separable approximation. IEEE Trans. Signal Process. 57(7), 2479–2493 (2009). https://doi.org/10.1109/TSP.2009.2016892

  40. Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. J. Sci. Comput. 76, 364–389 (2018). https://doi.org/10.1007/s10915-017-0624-3

  41. Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pac. J. Optim. 10, 415–434 (2013)

  42. Zhang, C., Chen, X., Ma, S.: A Riemannian smoothing steepest descent method for non-Lipschitz optimization on submanifolds. arXiv preprint arXiv:2104.04199 (2021)

Acknowledgements

The authors are grateful to the associate editor and the two anonymous referees for their valuable comments and suggestions.

Funding

The work of Wei Hong Yang was supported by the National Natural Science Foundation of China Grant 11971118.

Author information

Corresponding author

Correspondence to Wei Hong Yang.

Ethics declarations

Conflict of interest

The authors have no relevant financial or non-financial interests to disclose.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Wang, Q., Yang, W.H. Proximal Quasi-Newton Method for Composite Optimization over the Stiefel Manifold. J Sci Comput 95, 39 (2023). https://doi.org/10.1007/s10915-023-02165-x
