At 10:14 AM +0000 2/12/08, Barker, Sean (UK) wrote:

Pat's claim "The definition of a random sequence is that no matter how much of it you have, there is no way even in principle to compute any information about the next item." is true only if you exclude probabilistic estimates (which you might do, depending on how you interpret "information"). For example, if you encode the tosses of a coin as a bit stream, then as you continue to observe the stream you will be able to make increasingly accurate estimates of the probability that the next bit will be a 1. Given the additional knowledge that this is the encoding of coin flips, you will also be able to estimate the probability that it is a fair coin.
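[As an aside, the estimation procedure described in the quoted passage can be sketched in a few lines of Python. This is not from the original thread; the simulated bias of 0.7 and the Laplace-smoothed estimator are illustrative choices, not anything the posters specified.]

```python
import random

def estimate_next_bit_prob(bits):
    """Laplace-smoothed estimate of P(next bit = 1) from observed bits.
    With no observations this returns 0.5, and it converges to the
    coin's true bias as the stream grows."""
    return (sum(bits) + 1) / (len(bits) + 2)

random.seed(42)
# Simulate a biased coin with P(heads) = 0.7 (an assumed bias for illustration)
stream = [1 if random.random() < 0.7 else 0 for _ in range(10_000)]

# The estimate sharpens as more of the stream is observed
for n in (10, 100, 10_000):
    print(n, round(estimate_next_bit_prob(stream[:n]), 3))
```

Note that even a perfect estimate of the bias only tells you the odds on the next bit, not its value.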

No, wait. A series of tosses of an *unfair* coin is not a random sequence. One gets randomness just when the actual probability of each toss coming up heads is exactly 0.5 and the tosses are independent of one another.

What you say above is correct, of course, but it can be translated as: if a series of bits is not random, this can be detected with increasing accuracy as the series gets longer. Conversely, if the series is random, that too can be detected (if it was previously unknown); but detecting randomness does not mean that any particular toss can be predicted.
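[The "detected with increasing accuracy" point can be made concrete with a standard z-score test, sketched below in Python. This is my illustration, not something from the thread; the 0.55 bias and sample sizes are assumed values.]

```python
import math
import random

def bias_z_score(bits):
    """How many standard deviations the observed count of 1s lies from
    the fair-coin expectation n/2. For a biased coin this grows like
    sqrt(n); for a fair coin it stays bounded in probability."""
    n = len(bits)
    return (sum(bits) - n / 2) / math.sqrt(n / 4)

random.seed(0)
fair   = [random.getrandbits(1) for _ in range(10_000)]
biased = [1 if random.random() < 0.55 else 0 for _ in range(10_000)]

print(round(bias_z_score(fair), 1))    # small: consistent with fairness
print(round(bias_z_score(biased), 1))  # large: bias exposed by n = 10_000
```

The test exposes an unfair coin ever more decisively as the series lengthens, yet says nothing about the outcome of any individual toss of a fair one, which is exactly the distinction drawn above.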