Abstract
We extend the result of Birgin et al. (2000) on the spectral projected gradient method to a log-determinant semidefinite problem with linear constraints and propose a spectral projected gradient method for the dual problem. Our method is based on alternating projections onto the intersection of two convex sets: it first projects onto the box constraints and then onto a set defined by a linear matrix inequality. By exploiting the structure of these two projections, we show that the proposed method attains the same convergence properties as the method of Birgin et al., in which the exact orthogonal projection onto the intersection of the two convex sets is performed. Using these convergence properties, we prove that the proposed algorithm attains the optimal value or terminates in a finite number of iterations. The efficiency of the proposed method is illustrated with numerical results on randomly generated synthetic/deterministic data and gene expression data, in comparison with other methods, including the inexact primal–dual path-following interior-point method, the Newton-CG primal proximal-point algorithm, the adaptive spectral projected gradient method, and the adaptive Nesterov's smooth method. For the gene expression data, our results are also compared with those of the quadratic approximation for sparse inverse covariance estimation (QUIC) method. We show that our method outperforms the other methods by obtaining a better objective value more quickly.
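The building blocks named in the abstract, a Barzilai–Borwein (spectral) step length followed by projections onto a box and onto a positive-semidefinite set, can be illustrated with a generic, simplified sketch. This is not the authors' DSPG implementation: the function names, the simple box constraints, and the plain PSD-cone projection (rather than the paper's LMI-defined set) are illustrative assumptions.

```python
import numpy as np

def proj_box(y, lo, hi):
    # Euclidean projection onto the box [lo, hi] (componentwise clipping).
    return np.clip(y, lo, hi)

def proj_psd(S):
    # Euclidean projection of a symmetric matrix onto the PSD cone:
    # keep the eigenvectors, clip negative eigenvalues to zero.
    w, V = np.linalg.eigh(S)
    return (V * np.maximum(w, 0.0)) @ V.T

def spg_step(x, x_prev, g, g_prev, lo, hi,
             alpha_min=1e-10, alpha_max=1e10):
    # One spectral projected gradient step: a Barzilai-Borwein step size
    # alpha = s's / s'y (safeguarded), then projection onto the feasible box.
    s, y = x - x_prev, g - g_prev
    sy = float(s @ y)
    alpha = float(s @ s) / sy if sy > 0 else alpha_max
    alpha = min(max(alpha, alpha_min), alpha_max)
    return proj_box(x - alpha * g, lo, hi)
```

In the paper's setting the second projection (onto the LMI-defined set) replaces or follows the box projection; the sketch above only shows the generic mechanics of a spectral step combined with cheap projections.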



Notes
Since DSPG is always dual feasible, d_inf = 0; also p_inf = 0 if the problem is unconstrained.
References
Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8(1), 141–148 (1988)
Birgin, E.G., Martínez, J.M., Raydan, M.: Nonmonotone spectral projected gradient methods on convex sets. SIAM J. Optim. 10(4), 1196–1211 (2000)
Borwein, J., Lewis, A.S.: Convex Analysis and Nonlinear Optimization: Theory and Examples, 2nd edn. Springer, New York (2006)
Boyd, S., Vandenberghe, L.: Convex Optimization. Cambridge University Press, Cambridge (2004)
Chen, L., Sun, D.F., Toh, K.C.: An efficient inexact symmetric Gauss–Seidel based majorized ADMM for high-dimensional convex composite conic programming. Math. Program. 161(1–2), 237–270 (2017)
d’Aspremont, A., Banerjee, O., El Ghaoui, L.: First-order methods for sparse covariance selection. SIAM J. Matrix Anal. Appl. 30(1), 56–66 (2008)
Dempster, A.P.: Covariance selection. Biometrics 28, 157–175 (1972)
Duchi, J.C., Gould, S., Koller, D.: Projected subgradient methods for learning sparse Gaussians. In: Proceedings of the Twenty-Fourth Conference on Uncertainty in Artificial Intelligence (2008)
Hager, W., Zhang, H.: A new active set algorithm for box constrained optimization. SIAM J. Optim. 17(2), 526–557 (2006)
Hsieh, C.-J., Sustik, M.A., Dhillon, I.S., Ravikumar, P.: QUIC: quadratic approximation for sparse inverse covariance estimation. J. Mach. Learn. Res. 15, 2911–2947 (2014)
Kim, S., Kojima, M., Mevissen, M., Yamashita, M.: Exploiting sparsity in linear and nonlinear matrix inequalities via positive semidefinite matrix completion. Math. Program. Ser. B 129(1), 33–68 (2011)
Lauritzen, S.L.: Graphical Models. The Clarendon Press/Oxford University Press, Oxford (1996)
Li, L., Toh, K.-C.: An inexact interior point method for ℓ1-regularized sparse covariance selection. Math. Program. Comput. 2(3–4), 291–315 (2010)
Li, P., Xiao, Y.: An efficient algorithm for sparse inverse covariance matrix estimation based on dual formulation. Comput. Stat. Data Anal. 128, 292–307 (2018)
Li, X.D., Sun, D.F., Toh, K.C.: A Schur complement based semi-proximal ADMM for convex quadratic conic programming and extensions. Math. Program. 155(1–2), 333–373 (2016)
Li, X.D., Sun, D.F., Toh, K.C.: A block symmetric Gauss–Seidel decomposition theorem for convex composite quadratic programming and its applications. Math. Program. 175(1–2), 395–418 (2019)
Lu, Z.: Adaptive first-order methods for general sparse inverse covariance selection. SIAM J. Matrix Anal. Appl. 31(4), 2000–2016 (2010)
Tavakoli, R., Zhang, H.: A nonmonotone spectral projected gradient method for large-scale topology optimization problems. Numer. Algebra Control Optim. 2(2), 395–412 (2012)
Ueno, G., Tsuchiya, T.: Covariance regularization in inverse space. Q. J. R. Meteorol. Soc. 135, 1133–1156 (2009)
Wang, C.: On how to solve large-scale log-determinant optimization problems. Comput. Optim. Appl. 64, 489–511 (2016)
Wang, C., Sun, D., Toh, K.-C.: Solving log-determinant optimization problems by a Newton-CG primal proximal point algorithm. SIAM J. Optim. 20(6), 2994–3013 (2010)
Yang, J., Sun, D., Toh, K.-C.: A proximal point algorithm for log-determinant optimization with group lasso regularization. SIAM J. Optim. 23(2), 857–893 (2013)
Yuan, X.: Alternating direction method for sparse covariance models. J. Sci. Comput. 51, 261–273 (2012)
Zhang, R.Y., Fattahi, S., Sojoudi, S.: Large-scale sparse inverse covariance estimation via thresholding and max-det matrix completion. Proc. Mach. Learn. Res. 80, 5766–5775 (2018)
Additional information
M. F. was partially supported by JSPS KAKENHI (Grant No. 26330024) and by the Research Institute for Mathematical Sciences, a Joint Usage/Research Center located in Kyoto University; S. K. was supported by NRF 2017-R1A2B2005119; M. Y. was partially supported by JSPS KAKENHI (Grant No. 18K11176).
Cite this article
Nakagaki, T., Fukuda, M., Kim, S. et al. A dual spectral projected gradient method for log-determinant semidefinite problems. Comput Optim Appl 76, 33–68 (2020). https://doi.org/10.1007/s10589-020-00166-2
Keywords
- Dual spectral projected gradient methods
- Log-determinant semidefinite programs with linear constraints
- Dual problem
- Theoretical convergence results
- Computational efficiency