Abstract
Optimization algorithms are widely applied to improve the learning speed of machine learning models. During backpropagation, the weights of a neural network are updated in each epoch so that the difference between the actual output and the predicted one falls below a predefined threshold epsilon. In essence, backpropagation only borrows the idea of an optimization algorithm to formulate a weight-update rule, rather than solving a complete optimization problem. Improving learning speed is increasingly meaningful as applications of artificial intelligence grow. The cleft-overstep optimization algorithm, introduced in the 1990s and applied in optimal control, has not been popularized or evaluated in other fields. This paper applies the idea of the cleft-overstep method to linear regression analysis, a simple machine learning algorithm, and evaluates it experimentally. Results were compared with the classic gradient descent method on a known benchmark problem and showed a significant improvement in learning speed.
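To make the comparison concrete, the sketch below implements linear regression by classic fixed-step gradient descent with an epsilon stopping rule, alongside a hypothetical line search in the spirit of the cleft-overstep principle: the trial step is grown until the loss starts rising again, so the iterate lands just past the floor of the loss valley ("cleft"). The synthetic data, the step-doubling strategy, and all constants are illustrative assumptions, not the authors' exact procedure or benchmark.

```python
import numpy as np

# Synthetic 1-D regression data: y = 2x + 3 + noise (illustrative only;
# the paper's actual benchmark problem is not reproduced here).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 10.0, size=100)
y = 2.0 * X + 3.0 + rng.normal(0.0, 0.5, size=100)
A = np.column_stack([X, np.ones_like(X)])  # design matrix with bias column

def mse(w):
    r = A @ w - y
    return float(r @ r) / len(y)

def grad(w):
    return 2.0 * A.T @ (A @ w - y) / len(y)

def gradient_descent(w, lr=0.01, eps=1e-10, max_epochs=100_000):
    """Classic fixed-step gradient descent, stopping once the loss
    improvement per epoch falls below epsilon."""
    for epoch in range(max_epochs):
        w_next = w - lr * grad(w)
        if abs(mse(w) - mse(w_next)) < eps:
            return w_next, epoch
        w = w_next
    return w, max_epochs

def cleft_overstep_step(w, d, alpha0=1e-3, growth=2.0):
    """Hypothetical 'overstep' rule: grow the trial step while the loss
    decreases along direction d, then return the first step at which it
    rises again, i.e. a point just past the valley floor. Constants are
    assumptions, not the published method."""
    f_start = mse(w)
    alpha, f_prev = alpha0, mse(w + alpha0 * d)
    while True:
        trial = alpha * growth
        f_trial = mse(w + trial * d)
        if f_trial >= f_prev:                 # stepped over the minimum
            return trial if f_trial < f_start else alpha
        alpha, f_prev = trial, f_trial

def cleft_overstep_descent(w, eps=1e-10, max_epochs=100_000):
    """Gradient descent whose step length comes from the overstep rule."""
    for epoch in range(max_epochs):
        d = -grad(w)
        w_next = w + cleft_overstep_step(w, d) * d
        if abs(mse(w) - mse(w_next)) < eps:
            return w_next, epoch
        w = w_next
    return w, max_epochs

w_gd, n_gd = gradient_descent(np.zeros(2))
w_co, n_co = cleft_overstep_descent(np.zeros(2))
print(f"gradient descent : {n_gd} epochs, w = {w_gd}")  # w should approach [2, 3]
print(f"cleft-overstep   : {n_co} epochs, w = {w_co}")
```

On a quadratic loss such as MSE, this overstep search acts as an adaptive step length chosen per epoch, which is one plausible reading of why the method can need far fewer epochs than fixed-step gradient descent.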