
COAP 2020 best paper prize

Computational Optimization and Applications




Cite this article

COAP 2020 best paper prize. Comput Optim Appl 80, 681–685 (2021). https://doi.org/10.1007/s10589-021-00327-x
