Abstract
Predictive complexity is a generalization of Kolmogorov complexity. It corresponds to an "optimal" prediction strategy, which gives a lower bound on the ability of any algorithm to predict the elements of a sequence of outcomes. The variety of loss functions makes it interesting to study the relations between the corresponding predictive complexities. The main subject of this paper is non-linear inequalities (with variable coefficients) between the predictive complexity KG(x) of non-logarithmic type and the Kolmogorov complexity K(x) (which is close to the predictive complexity for the logarithmic loss function). From these inequalities we deduce the asymptotic relation sup_{x: l(x)=n} K(x)/KG(x) ∼ (1/a) log n as n → ∞, where a is a constant and l(x) is the length of the sequence x. An analogous asymptotic result holds for the relative complexities K(x)/l(x) and KG(x)/l(x). To obtain these inequalities we present estimates of the cardinality of the set of all sequences of a given predictive complexity.
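For readability, here is a minimal LaTeX rendering of the main asymptotic relation stated in the abstract; the supremum ranges over all sequences x of length n, and a is the constant mentioned above (nothing beyond the abstract's wording is assumed):

% Worst-case ratio of Kolmogorov complexity to predictive complexity
% over sequences of length n, as stated in the abstract.
\[
  \sup_{x \colon l(x) = n} \frac{K(x)}{KG(x)} \;\sim\; \frac{1}{a}\,\log n,
  \qquad n \to \infty .
\]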
Copyright information
© 2001 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Vyugin, M.V., V'yugin, V.V. (2001). Non-linear Inequalities between Predictive and Kolmogorov Complexities. In: Abe, N., Khardon, R., Zeugmann, T. (eds) Algorithmic Learning Theory. ALT 2001. Lecture Notes in Computer Science, vol 2225. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45583-3_16
DOI: https://doi.org/10.1007/3-540-45583-3_16
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-42875-6
Online ISBN: 978-3-540-45583-7