**Introduction**

In a previous article, I introduced a definition of a random variable that says, in simple terms, a source is **random**, and observing that source will in turn generate a ***random variable***, if the source generates a sequence of observations that is eventually, approximately, Kolmogorov-random.

Note that my definition of a random variable is close enough to Per Martin-Löf's that it's possible he, or someone else, already proved variations of these results. I noticed these results in the context of my work on A.I. and physics, which are my primary areas of interest.

**The Lemmas**

For simplicity, let’s assume that observing the source generates truly Kolmogorov-random strings, since there is some nuance to the definition I offered in the previous article that adds only confusion, with no meaningful upside.

Recall that if a binary string $x$ is Kolmogorov-random, then the Kolmogorov complexity of $x$, denoted $K(x)$, which is the length of the shortest input $y$ to a UTM, $U$, that will generate $x$, is such that the length of $y$, denoted $|y|$, is equal to $|x| + c$, for some constant $c$ that does not depend upon $x$. For example, if $c$ is the length of the "print" function, then because any string can be generated on a UTM by providing the string in question as an argument to the "print" function, it follows that for all strings $x$, $K(x) \leq |x| + c$.
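To make the upper bound concrete, here is a minimal Python sketch of my own (not from the article): a toy "description language" in which a program is a one-byte opcode plus a payload, and the opcode `P` plays the role of the "print" function. The names `print_program` and `run` are hypothetical illustrations, not standard definitions.

```python
# Toy description language: a "program" is a one-byte opcode plus a payload.
# Opcode b"P" means "print the payload verbatim" -- the analogue of the
# article's "print" function, so the constant c here is 1 (the opcode length).

def print_program(x: bytes) -> bytes:
    """Return a program that outputs x when executed by run()."""
    return b"P" + x

def run(program: bytes) -> bytes:
    """A trivial 'UTM' that only understands the print opcode."""
    if program[:1] == b"P":
        return program[1:]
    raise ValueError("unknown opcode")

x = b"0110100110010110"          # any string at all
p = print_program(x)
assert run(p) == x               # the program reproduces x exactly
assert len(p) == len(x) + 1      # description length is |x| + c, with c = 1
```

Of course, this toy machine only witnesses the upper bound $K(x) \leq |x| + c$; a real UTM also admits much shorter programs for compressible strings, which is exactly what a Kolmogorov-random string rules out.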

Given a binary string $x$, let $x_n$ denote the first $n$ entries of $x$, and let $U(y)$ denote the output of a UTM when given $y$ as input.

*Lemma 1*. If a binary string $x$ is such that $x_n$ is Kolmogorov-random for all $n$, then there is no computable function $f$ that can generate the next $m$ entries of $x$, given the first $n$ entries of $x$, for all $n$ and $m$.

Proof. Assume such a function $f$ exists. It follows that, given the first $n$ entries of $x$, namely, $x_n$, we can generate the next $m$ entries on a UTM, using $f$, specifying $m$. Because $x_{n+m}$ is Kolmogorov-random, and because we can specify any integer $m$ with at most $\log(m)$ bits, it follows that $K(x_{n+m}) \leq K(x_n) + \log(m) + c$, which implies that $n + m \leq n + \log(m) + c$. Note that the difference $m - \log(m)$ is therefore bounded by a constant, and because we can, for any fixed value of $c$, choose an arbitrarily large value of $m$, there is, therefore, no such function $f$ that exists for all $n$ and $m$, which completes the proof by contradiction.
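The compression argument in the proof can be sketched in a few lines of Python. This is my own illustration, not the article's: `f` here is a stand-in predictor that happens to extend a trivially predictable (alternating) string, which is exactly the kind of string the lemma says cannot be Kolmogorov-random, since the pair $(x_n, m)$ describes $x_{n+m}$ in far fewer than $n + m$ bits.

```python
import math

# Hypothetical predictor for illustration: suppose f computably extends
# any prefix of the alternating string "0101...".
def f(prefix: str, m: int) -> str:
    return "".join("01"[(len(prefix) + i) % 2] for i in range(m))

def encode(x: str, n: int) -> tuple[str, int]:
    """Describe x by its first n bits plus the integer m = |x| - n."""
    return x[:n], len(x) - n

def decode(prefix: str, m: int) -> str:
    """Recover the full string by running the predictor forward."""
    return prefix + f(prefix, m)

x = "01" * 50                        # 100 bits, perfectly predictable by f
prefix, m = encode(x, 10)
assert decode(prefix, m) == x

# Description cost: n bits for the prefix plus ~log2(m) bits for m,
# far below n + m -- so x cannot be Kolmogorov-random once m - log2(m)
# exceeds the constant c from the proof.
cost = len(prefix) + math.ceil(math.log2(m + 1))
assert cost < len(x)
```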

*Lemma 2*. If a string $x$ is Kolmogorov-random for all $n$, and there is some computable function $f$ that can generate an unbounded number of entries of $x$, then $f$ is subject to the inequality below.

Proof. Assume such a function $f$ exists. It follows that $f$ can generate some unbounded number of entries of $x$, namely, $x_{i_1}, x_{i_2}, \ldots$, where $i_j$ is the index of some entry within $x$, each listed in order. Let $k$ be given. Because $f$ can generate an unbounded number of entries of $x$, we can provide an argument $k$, such that $f$ will generate exactly $k$ entries of $x$, in order. It follows that we can construct a substring of $x$ on a UTM, that contains those $k$ entries of $x$, in order, with any missing entries represented by some special character. Call the resultant string $s$, and assume that $|s| = n$. Note that $s$ can be generated on a UTM given $f$ and the argument $k$. This implies that $K(s) \leq \log(k) + c$.

Note that we can generate $x_n$ given $s$, the list of indexes that correspond to the missing entries within $s$, and a string generated by concatenating the missing entries of $x$ that are within $s$ into a single string, $\bar{s}$. Because the maximum index of any entry within $s$ is $n$, it follows that we can specify any such individual index with at most $\log(n)$ bits. Since there are $n - k$ entries in $\bar{s}$, it follows that we can specify all of the indexes within the list with at most $(n - k)\log(n)$ bits.
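The reconstruction step described above is easy to make concrete. Here is a short Python sketch of my own devising: `s` uses `"?"` as the special character for missing entries, and `reconstruct` rebuilds $x_n$ from $s$, the list of missing indexes, and the concatenated missing bits $\bar{s}$.

```python
# Rebuild x_n from the partially known string s (with "?" marking missing
# entries), the list of missing indexes, and the concatenation s_bar of
# the missing bits -- the encoding used in the lemma's counting argument.

def reconstruct(s: str, missing_indexes: list[int], s_bar: str) -> str:
    out = list(s)
    for idx, bit in zip(missing_indexes, s_bar):
        out[idx] = bit
    return "".join(out)

x_n = "1101001110"
known = {0, 2, 3, 6, 9}                          # k = 5 entries produced by f
s = "".join(b if i in known else "?" for i, b in enumerate(x_n))
missing = [i for i in range(len(x_n)) if i not in known]
s_bar = "".join(x_n[i] for i in missing)

assert s == "1?01??1??0"
assert reconstruct(s, missing, s_bar) == x_n
# Each missing index fits in ceil(log2(n)) bits, so the whole index list
# costs at most (n - k) * log2(n) bits, and s_bar costs n - k bits.
```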

This implies that,

$$K(x_n) \leq K(s) + (n - k)\log(n) + (n - k) + c.$$

This in turn implies that,

$$n \leq \log(k) + (n - k)\log(n) + (n - k) + c,$$

and so,

$$k \leq n - \frac{n - \log(k) - c}{\log(n) + 1}.$$
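On my reading of the derivation, combining $K(s) \leq \log(k) + c$ with the index-list encoding yields the bound $k \leq n - (n - \log(k) - c)/(\log(n) + 1)$, with constants absorbed into $c$. A quick numeric check (my own sketch; the choice of $n$ and $c$ is arbitrary) shows what the bound does and does not permit:

```python
import math

def k_bound(n: int, k: int, c: float) -> float:
    """Right-hand side of k <= n - (n - log2(k) - c) / (log2(n) + 1)."""
    return n - (n - math.log2(k) - c) / (math.log2(n) + 1)

n, c = 1024, 100
# Predicting every entry (k = n) violates the inequality for this n and c...
assert n > k_bound(n, n, c)
# ...while a somewhat smaller k still satisfies it, so the lemma permits
# an unbounded but constrained number of predicted entries.
assert 900 <= k_bound(n, 900, c)
```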

If such a function $f$ exists, then this is a surprising result, since it implies that you can predict an unbounded number of non-sequential observations of a random variable, albeit subject to constraints. The question is: is there some other reason why such a function cannot exist?