
Kolmogorov structure function

For an entirely different structure function also introduced by Kolmogorov, see Kolmogorov's theory of turbulence.

In 1973, Andrey Kolmogorov proposed a non-probabilistic approach to statistics and model selection. Let each datum be a finite binary string and a model be a finite set of binary strings. Consider model classes consisting of models of given maximal Kolmogorov complexity. The Kolmogorov structure function of an individual data string expresses the relation between the complexity level constraint on a model class and the least log-cardinality of a model in the class containing the data. The structure function determines all stochastic properties of the individual data string: for every constrained model class it determines the individual best-fitting model in the class, irrespective of whether the true model is in the model class considered or not. In the classical case we talk about a set of data with a probability distribution, and the properties are those of the expectations. In contrast, here we deal with individual data strings and the properties of the individual string under consideration. In this setting, a property holds with certainty rather than with high probability as in the classical case. The Kolmogorov structure function precisely quantifies the goodness-of-fit of an individual model with respect to individual data.

The Kolmogorov structure function is used in algorithmic information theory, also known as the theory of Kolmogorov complexity, to describe the structure of a string by means of models of increasing complexity.

Kolmogorov's definition

 
Kolmogorov (left) talks on the structure function (see drawing on the blackboard) in Tallinn, 1973.

The structure function was originally proposed by Kolmogorov in 1973 at a Soviet information theory symposium in Tallinn, but these results were not published ([1], p. 182). The results were, however, announced in 1974,[2] the only written record by Kolmogorov himself. One of his last scientific statements is (translated from the original Russian by L.A. Levin):

To each constructive object corresponds a function $\Phi_x(k)$ of a natural number $k$, the log of minimal cardinality of $x$-containing sets that allow definitions of complexity at most $k$. If the element $x$ itself allows a simple definition, then the function $\Phi$ drops to $0$ even for small $k$. Lacking such definition, the element is "random" in a negative sense. But it is positively "probabilistically random" only when function $\Phi$ having taken the value $\Phi_0$ at a relatively small $k = k_0$, then changes approximately as $\Phi(k) = \Phi_0 - (k - k_0)$.

— Kolmogorov, announcement cited above

Contemporary definition

It is discussed in Cover and Thomas.[1] It is extensively studied in Vereshchagin and Vitányi,[3] where the main properties are also resolved. The Kolmogorov structure function can be written as

$$h_x(\alpha) = \min_S \{\log |S| : x \in S,\ K(S) \leq \alpha\}$$

where $x$ is a binary string of length $n$ with $x \in S$, where $S$ is a contemplated model (a set of $n$-length strings) for $x$, $K(S)$ is the Kolmogorov complexity of $S$, and $\alpha$ is a nonnegative integer value bounding the complexity of the contemplated $S$'s. Clearly, this function is nonincreasing and reaches $\log|\{x\}| = 0$ for $\alpha = K(x) + c$, where $c$ is the number of bits required to change $x$ into $\{x\}$ and $K(x)$ is the Kolmogorov complexity of $x$.
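Kolmogorov complexity is uncomputable, so $h_x(\alpha)$ cannot be computed exactly, but its shape can be illustrated on a toy model class. The sketch below is a hypothetical construction, not the actual definition: it stands in for $K(S)$ with the zlib-compressed length of a hand-written description of each model $S$, then minimizes $\log|S|$ over the models containing $x$ whose proxy complexity is at most $\alpha$.

```python
import math
import zlib

def approx_K(description: bytes) -> int:
    """Crude stand-in for Kolmogorov complexity: zlib-compressed length in bits."""
    return 8 * len(zlib.compress(description, 9))

def structure_function(x, models, alphas):
    """h_x(alpha) = min log2|S| over models S with x in S and approx_K(S) <= alpha."""
    h = {}
    for a in alphas:
        fits = [math.log2(len(S)) for desc, S in models
                if x in S and approx_K(desc) <= a]
        h[a] = min(fits) if fits else None  # None: no admissible model at this level
    return h

n = 8
x = "10101010"  # a highly regular string
all_strings = [format(i, f"0{n}b") for i in range(2 ** n)]
# Hypothetical model class: (description, set of n-bit strings) pairs.
models = [
    (b"all 8-bit strings", set(all_strings)),
    (b"8-bit strings of even weight", {s for s in all_strings if s.count("1") % 2 == 0}),
    (b"alternating 8-bit strings", {"10101010", "01010101"}),
    (b"the string 10101010", {"10101010"}),
]

h = structure_function(x, models, alphas=range(0, 400, 50))
for a, v in h.items():
    print(a, v)
```

As $\alpha$ grows, more models become admissible and the minimum can only decrease, reproducing the nonincreasing shape of the structure function.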

The algorithmic sufficient statistic

We define a set $S$ containing $x$ such that

$$K(S) + K(x \mid S) = K(x) + O(1).$$

The function $h_x(\alpha)$ never decreases more than a fixed independent constant below the diagonal called the sufficiency line $L$, defined by

$$L(\alpha) + \alpha = K(x).$$

It is approached to within a constant distance by the graph of $h_x$ for certain arguments (for instance, for $\alpha = K(x) + c$). For these $\alpha$'s we have $\alpha + h_x(\alpha) = K(x) + O(1)$, and the associated model $S$ (a witness for $h_x(\alpha)$) is called an optimal set for $x$; its description of $K(S) \leq \alpha$ bits is therefore an algorithmic sufficient statistic. We write 'algorithmic' for 'Kolmogorov complexity' by convention. The main properties of an algorithmic sufficient statistic are the following: if $S$ is an algorithmic sufficient statistic for $x$, then

$$K(S) + \log|S| = K(x) + O(1).$$

That is, the two-part description of $x$, using the model $S$ and as data-to-model code the index of $x$ in the enumeration of $S$ in $\log|S|$ bits, is as concise as the shortest one-part code of $x$ in $K(x)$ bits. This can easily be seen as follows:

$$K(x) \leq K(x,S) + O(1) \leq K(S) + K(x \mid S) + O(1) \leq K(S) + \log|S| + O(1) \leq K(x) + O(1),$$

using straightforward inequalities and the sufficiency property, we find that $K(x \mid S) = \log|S| + O(1)$. (For example, given $S \ni x$, we can describe $x$ self-delimitingly (you can determine its end) in $\log|S| + O(1)$ bits.) Therefore, the randomness deficiency $\log|S| - K(x \mid S)$ of $x$ in $S$ is a constant, which means that $x$ is a typical (random) element of $S$. However, there can be models $S$ containing $x$ that are not sufficient statistics. An algorithmic sufficient statistic $S$ for $x$ has the additional property, apart from being a model of best fit, that $K(x,S) = K(x) + O(1)$, and therefore, by the Kolmogorov complexity symmetry of information (the information about $x$ in $S$ is about the same as the information about $S$ in $x$), we have $K(S \mid x^*) = O(1)$: the algorithmic sufficient statistic $S$ is a model of best fit that is almost completely determined by $x$. ($x^*$ is a shortest program for $x$.) The algorithmic sufficient statistic associated with the least such $\alpha$ is called the algorithmic minimal sufficient statistic.

Structure functions $h_x(\alpha)$, $\beta_x(\alpha)$, $\lambda_x(\alpha)$ and minimal sufficient statistic.

With respect to the picture: The MDL structure function $\lambda_x(\alpha)$ is explained below. The goodness-of-fit structure function $\beta_x(\alpha)$ is the least randomness deficiency (see above) of any model $S \ni x$ for $x$ such that $K(S) \leq \alpha$. This structure function gives the goodness of fit of a model $S$ (containing $x$) for the string $x$. When it is low the model fits well, and when it is high the model doesn't fit well. If $\beta_x(\alpha) = 0$ for some $\alpha$, then there is a typical model $S \ni x$ for $x$ such that $K(S) \leq \alpha$ and $x$ is typical (random) for $S$. That is, $S$ is the best-fitting model for $x$. For more details see [1] and especially [3] and [4].

Selection of properties

Within the constraints that the graph goes down at an angle of at least 45 degrees, that it starts at $n$ and ends approximately at $K(x)$, every graph (up to an $O(\log n)$ additive term in argument and value) is realized by the structure function of some data $x$ and vice versa. Where the graph hits the diagonal first, the argument (complexity) is that of the minimal sufficient statistic. Determining this place is incomputable. See [3].

Main property

It is proved that at each level $\alpha$ of complexity the structure function allows us to select the best model $S$ for the individual string $x$ within a strip of $O(\log n)$, with certainty, not merely with high probability.[3]

The MDL variant

The minimum description length (MDL) function: the length of the minimal two-part code for $x$, consisting of the model cost $K(S)$ and the length of the index of $x$ in $S$, in the model class of sets of given maximal Kolmogorov complexity $\alpha$ (the complexity of $S$ upper bounded by $\alpha$), is given by the MDL function or constrained MDL estimator:

$$\lambda_x(\alpha) = \min_S \{\Lambda(S) : S \ni x,\ K(S) \leq \alpha\},$$

where $\Lambda(S) = \log|S| + K(S) \geq K(x) - O(1)$ is the total length of the two-part code of $x$ with help of model $S$.
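Under the same compression-based stand-in for Kolmogorov complexity (a heuristic, since $K$ is uncomputable; the model descriptions and `approx_K` below are hypothetical), the constrained MDL estimator can be sketched as a minimization of the two-part code length $\log|S| + K(S)$:

```python
import math
import zlib

def approx_K(description: bytes) -> int:
    # Stand-in for K(S): zlib-compressed length of S's description, in bits.
    return 8 * len(zlib.compress(description, 9))

def mdl(x, models, alpha):
    """lambda_x(alpha): minimal two-part code length log2|S| + K(S)
    over models S containing x with K(S) <= alpha."""
    costs = [math.log2(len(S)) + approx_K(desc)
             for desc, S in models
             if x in S and approx_K(desc) <= alpha]
    return min(costs) if costs else None  # None: no admissible model at this level

n = 8
all_strings = {format(i, f"0{n}b") for i in range(2 ** n)}
models = [  # hypothetical model class
    (b"all 8-bit strings", all_strings),
    (b"alternating 8-bit strings", {"10101010", "01010101"}),
]
print(mdl("10101010", models, alpha=1000))
```

The trade-off is visible in the two terms: a richer model is cheap to describe but costs many index bits, while a tailored model costs more to describe but pins $x$ down with few index bits.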

Main property

It is proved that at each level $\alpha$ of complexity the structure function allows us to select the best model $S$ for the individual string $x$ within a strip of $O(\log n)$, with certainty, not merely with high probability.[3]

Application in statistics

The mathematics developed above was taken as the foundation of MDL by its inventor, Jorma Rissanen.[5]

Probability models

For every computable probability distribution $P$ it can be proved[6] that

$$-\log P(x) = \log|S| + O(\log n).$$

For example, if $P$ is some computable distribution on the set $S$ of strings of length $n$, then each $x \in S$ has probability $P(x) = \exp(O(\log n))/|S| = n^{O(1)}/|S|$. Kolmogorov's structure function becomes

$$h'_x(\alpha) = \min_P \{-\log P(x) : P(x) > 0,\ K(P) \leq \alpha\},$$

where $x$ is a binary string of length $n$ with $P(x) > 0$, where $P$ is a contemplated model (a computable probability distribution on the $n$-length strings) for $x$, $K(P)$ is the Kolmogorov complexity of $P$, and $\alpha$ is an integer value bounding the complexity of the contemplated $P$'s. Clearly, this function is non-increasing and reaches $-\log P(x) = 0$ for $\alpha = K(x) + c$, where $c$ is the number of bits required to change $x$ into the distribution concentrated on $x$, and $K(x)$ is the Kolmogorov complexity of $x$. Then $h'_x(\alpha) = h_x(\alpha) + O(\log n)$. For every complexity level $\alpha$, the function $h'_x(\alpha)$ is the Kolmogorov complexity version of maximum likelihood (ML).
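The same sketch carries over to probability models: instead of $\log|S|$ one minimizes $-\log_2 P(x)$, the code length of $x$ under $P$. The model class below (uniform distributions and a biased-coin product distribution) and the `approx_K` proxy are illustrative assumptions, not part of the theory:

```python
import math
import zlib

def approx_K(description: bytes) -> int:
    # Stand-in for K(P): zlib-compressed length of P's description, in bits.
    return 8 * len(zlib.compress(description, 9))

n = 4
strings = [format(i, f"0{n}b") for i in range(2 ** n)]

def uniform(S):
    # Uniform distribution on the set S, as a dict string -> probability.
    return {s: 1 / len(S) for s in S}

def coin(p):
    # i.i.d. biased-coin distribution on n-bit strings.
    return {s: (p ** s.count("1")) * ((1 - p) ** s.count("0")) for s in strings}

models = [  # hypothetical model class: (description, distribution) pairs
    (b"uniform over all 4-bit strings", uniform(strings)),
    (b"uniform over {0000, 1111}", uniform(["0000", "1111"])),
    (b"iid coin with p = 0.9", coin(0.9)),
]

def h_prime(x, alpha):
    """h'_x(alpha) = min -log2 P(x) over models P with P(x) > 0 and K(P) <= alpha."""
    fits = [-math.log2(P[x]) for desc, P in models
            if P.get(x, 0) > 0 and approx_K(desc) <= alpha]
    return min(fits) if fits else None

print(h_prime("1111", alpha=1000))
```

For the all-ones string, the biased coin assigns a likelihood above that of the uniform models, so the constrained maximum-likelihood value drops once that model is admissible.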

Main property

It is proved that at each level $\alpha$ of complexity the structure function allows us to select the best model $S$ for the individual string $x$ within a strip of $O(\log n)$, with certainty, not merely with high probability.[3]

The MDL variant and probability models

The MDL function: the length of the minimal two-part code for $x$, consisting of the model cost $K(P)$ and the length of $-\log P(x)$, in the model class of computable probability mass functions of given maximal Kolmogorov complexity $\alpha$ (the complexity of $P$ upper bounded by $\alpha$), is given by the MDL function or constrained MDL estimator:

$$\lambda'_x(\alpha) = \min_P \{\Lambda(P) : P(x) > 0,\ K(P) \leq \alpha\},$$

where $\Lambda(P) = -\log P(x) + K(P) \geq K(x) - O(1)$ is the total length of the two-part code of $x$ with help of model $P$.

Main property

It is proved that at each level $\alpha$ of complexity the MDL function allows us to select the best model $P$ for the individual string $x$ within a strip of $O(\log n)$, with certainty, not merely with high probability.[3]

Extension to rate distortion and denoising

It turns out that the approach can be extended to a theory of rate distortion of individual finite sequences and denoising of individual finite sequences[7] using Kolmogorov complexity. Experiments using real compressor programs have been carried out with success.[8] Here the assumption is that for natural data the Kolmogorov complexity is not far from the length of a compressed version using a good compressor.
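This assumption can be illustrated directly: a real compressor yields a computable upper bound on the Kolmogorov complexity of a string. In the sketch below, zlib stands in for the "good compressor" (any choice of compressor is an assumption); a highly regular string compresses far below its length, while a pseudorandom string does not compress at all.

```python
import random
import zlib

def compressed_len_bits(data: bytes) -> int:
    # Upper bound on K(data), up to compressor overhead: zlib output length in bits.
    return 8 * len(zlib.compress(data, 9))

regular = b"01" * 5000  # highly structured, 10 000 bytes
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(10_000))  # incompressible-looking

print(compressed_len_bits(regular), compressed_len_bits(noisy))
```

The gap between the two compressed lengths is exactly the kind of structure the rate-distortion and denoising experiments exploit.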

References

  1. ^ a b c Cover, Thomas M.; Thomas, Joy A. (1991). Elements of information theory. New York: Wiley. pp. 175–178. ISBN 978-0471062592.
  2. ^ Abstract of a talk for the Moscow Mathematical Society in Uspekhi Mat. Nauk Volume 29, Issue 4(178) in the Communications of the Moscow Mathematical Society page 155 (in the Russian edition, not translated into English)
  3. ^ a b c d e f g Vereshchagin, N.K.; Vitanyi, P.M.B. (1 December 2004). "Kolmogorov's Structure Functions and Model Selection". IEEE Transactions on Information Theory. 50 (12): 3265–3290. arXiv:cs/0204037. doi:10.1109/TIT.2004.838346.
  4. ^ Gacs, P.; Tromp, J.T.; Vitanyi, P.M.B. (2001). "Algorithmic statistics". IEEE Transactions on Information Theory. 47 (6): 2443–2463. arXiv:math/0006233. doi:10.1109/18.945257.
  5. ^ Rissanen, Jorma (2007). Information and complexity in statistical modeling (Online-Ausg. ed.). New York: Springer. ISBN 978-0-387-36610-4.
  6. ^ A.Kh. Shen, "The concept of (α, β)-stochasticity in the Kolmogorov sense, and its properties", Soviet Math. Dokl., 28:1 (1983), 295–299.
  7. ^ Vereshchagin, Nikolai K.; Vitanyi, Paul M.B. (1 July 2010). "Rate Distortion and Denoising of Individual Data Using Kolmogorov Complexity". IEEE Transactions on Information Theory. 56 (7): 3438–3454. arXiv:cs/0411014. doi:10.1109/TIT.2010.2048491.
  8. ^ de Rooij, Steven; Vitanyi, Paul (1 March 2012). "Approximating Rate-Distortion Graphs of Individual Data: Experiments in Lossy Compression and Denoising". IEEE Transactions on Computers. 61 (3): 395–407. arXiv:cs/0609121. doi:10.1109/TC.2011.25.

Literature

  • Cover, T.M.; P. Gacs; R.M. Gray (1989). "Kolmogorov's contributions to Information Theory and Algorithmic Complexity". Annals of Probability. 17 (3): 840–865. doi:10.1214/aop/1176991250. JSTOR 2244387.
  • Kolmogorov, A. N.; Uspenskii, V. A. (1 January 1987). "Algorithms and Randomness". Theory of Probability and Its Applications. 32 (3): 389–412. doi:10.1137/1132060.
  • Li, M.; Vitányi, P.M.B. (2008). An introduction to Kolmogorov complexity and its applications (3rd ed.). New York: Springer. ISBN 978-0387339986. Especially pp. 401–431 about the Kolmogorov structure function, and pp. 613–629 about rate distortion and denoising of individual sequences.
  • Shen, A. (1 April 1999). "Discussion on Kolmogorov Complexity and Statistical Analysis". The Computer Journal. 42 (4): 340–342. doi:10.1093/comjnl/42.4.340.
  • V'yugin, V.V. (1987). "On Randomness Defect of a Finite Object Relative to Measures with Given Complexity Bounds". Theory of Probability and Its Applications. 32 (3): 508–512. doi:10.1137/1132071.
  • V'yugin, V. V. (1 April 1999). "Algorithmic Complexity and Stochastic Properties of Finite Binary Sequences". The Computer Journal. 42 (4): 294–317. doi:10.1093/comjnl/42.4.294.
