Abstract
The least squares approach is one of the best-known methods in statistics and in related disciplines such as optimization, artificial intelligence, and data mining. The core of the traditional least squares approach is inverting the product of the design matrix and its transpose, so it requires storing at least two matrices: the design matrix and the inverse of the product. In some applications, for example high-frequency financial data in the capital market and transactional data in the credit card market, the design matrix is huge and online updating is desirable. Such cases pose difficulties for the traditional matrix version of the least squares approach, for two reasons: (1) manipulating a huge matrix remains a cumbersome task; (2) it is difficult to incorporate the latest information and update the estimates on the fly. A new method is therefore needed. In this paper, the authors apply the idea of CIO (component-wise iterative optimization) and propose an algorithm that computes a least squares estimate without matrix manipulation, i.e. it requires no storage for the design matrix or for the inverse of the product, and it can update the estimates on the fly. It is also shown rigorously that the solution obtained by the algorithm is truly a least squares estimate.
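The paper's CIO algorithm itself is not reproduced on this page. As a hypothetical illustration of the component-wise idea only, the sketch below fits a least squares model by cyclic coordinate descent: each coefficient is updated in turn by solving the one-dimensional least squares problem exactly while the others are held fixed, so no matrix product is formed and no matrix is inverted. (Unlike the streaming variant the abstract describes, this sketch does keep the data in memory; the function name and loop counts are illustrative assumptions, not the authors' notation.)

```python
def coordinate_descent_ls(X, y, n_sweeps=200):
    """Least squares fit by component-wise (coordinate) descent.

    Minimizes sum_i (y_i - x_i . b)^2 by cycling through the
    coefficients b_j; each update is the exact one-dimensional
    minimizer with the other coefficients held fixed.  No matrix
    product X'X is formed and nothing is inverted.
    """
    n, p = len(X), len(X[0])
    b = [0.0] * p
    # Residuals r_i = y_i - x_i . b; b starts at zero, so r = y.
    r = list(y)
    for _ in range(n_sweeps):
        for j in range(p):
            # Exact 1-D minimizer in b_j: delta = <r, x_j> / <x_j, x_j>
            num = sum(r[i] * X[i][j] for i in range(n))
            den = sum(X[i][j] ** 2 for i in range(n))
            if den == 0.0:
                continue  # constant-zero column: leave b_j unchanged
            delta = num / den
            b[j] += delta
            # Keep residuals consistent with the updated coefficient.
            for i in range(n):
                r[i] -= delta * X[i][j]
    return b
```

For a strictly convex quadratic objective such as this, cyclic coordinate descent converges to the least squares solution; on the collinear toy data `X = [[1,1],[1,2],[1,3]]`, `y = [2,3,4]` it recovers intercept 1 and slope 1.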
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Lin, Y., Chen, C. (2004). Computation of Least Square Estimates Without Matrix Manipulation. In: Shi, Y., Xu, W., Chen, Z. (eds) Data Mining and Knowledge Management. CASDMKM 2004. Lecture Notes in Computer Science, vol 3327. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30537-8_9
DOI: https://doi.org/10.1007/978-3-540-30537-8_9
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-23987-1
Online ISBN: 978-3-540-30537-8
eBook Packages: Computer Science (R0)