Agafonov showed an early connection between finite-state machines and normal sequences: every infinite subsequence selected from a normal sequence by a regular language is also normal. In other words, if one runs a finite-state machine on a normal sequence, where each of the finite-state machine's states is labeled either "output" or "no output", and the machine outputs the digit it reads next after entering an "output" state but does not output the next digit after entering a "no output" state, then the sequence it outputs will be normal.[19]
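This selection rule can be made concrete with a short sketch. The Python fragment below is an illustrative example rather than anything taken from the source: it runs a hypothetical finite-state selector over a digit string and keeps exactly those digits that are read immediately after the machine has entered an "output" state.

```python
def select_subsequence(digits, transitions, output_states, start_state):
    """Select the subsequence of `digits` chosen by a finite-state selector.

    `transitions` maps (state, digit) -> next state.  A digit is kept when
    the machine is currently in an "output" state, i.e. that state was
    entered just before the digit is read (Agafonov's selection rule).
    """
    state = start_state
    selected = []
    for d in digits:
        if state in output_states:
            selected.append(d)            # digit read right after an output state
        state = transitions[(state, d)]   # transition happens either way
    return selected


# Hypothetical example: keep each binary digit that immediately follows a 0
# (selection by the regular language of prefixes ending in 0).
# States: "after0" (output) and "after1" (no output); start in "after1".
transitions = {
    ("after0", "0"): "after0", ("after0", "1"): "after1",
    ("after1", "0"): "after0", ("after1", "1"): "after1",
}
print(select_subsequence("0110100010", transitions, {"after0"}, "after1"))
# -> ['1', '1', '0', '0', '1']
```

Agafonov's theorem asserts that when the input sequence is normal, the selected subsequence produced this way is again normal.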
A deeper connection exists with finite-state gamblers (FSGs) and information lossless finite-state compressors (ILFSCs).
Schnorr and Stimm showed that no FSG can succeed on any normal sequence, and Bourke, Hitchcock and Vinodchandran showed the converse. Therefore:

A sequence is normal if and only if there is no finite-state gambler that succeeds on it.
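As a rough illustration of the gambling model (a minimal sketch assuming the standard FSG definition, in which each state wagers a fraction of the current capital on each possible next digit and a correct wager is paid at fair odds, i.e. multiplied by the alphabet size), the hypothetical Python fragment below tracks a gambler's capital along a binary sequence; the gambler succeeds if that capital is unbounded.

```python
def run_gambler(bits, bet_on_zero, transitions, start_state, capital=1.0):
    """Track a finite-state gambler's capital on a binary sequence.

    `bet_on_zero[state]` is the fraction of capital wagered on the next bit
    being 0 (the rest is wagered on 1); the winning stake is doubled
    (fair odds for a binary alphabet) and the losing stake is lost.
    """
    state = start_state
    for b in bits:
        stake0 = capital * bet_on_zero[state]
        stake1 = capital - stake0
        capital = 2 * (stake0 if b == "0" else stake1)
        state = transitions[(state, b)]
    return capital


# Hypothetical one-state gambler that always bets 90% of its capital on 0:
# it succeeds on the non-normal sequence 000..., since its capital grows
# by a factor 1.8 per digit, but it goes broke on a sequence with equal
# frequencies of 0 and 1 (factor 1.8 * 0.2 = 0.36 per pair of digits).
always = {("q", "0"): "q", ("q", "1"): "q"}
print(run_gambler("0" * 20, {"q": 0.9}, always, "q"))
```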
Ziv and Lempel showed:

A sequence is normal if and only if it is incompressible by any information lossless finite-state compressor
(they actually showed that the sequence's optimal compression ratio over all ILFSCs is exactly its entropy rate, a quantitative measure of its deviation from normality, which is 1 exactly when the sequence is normal). Since the LZ compression algorithm compresses asymptotically as well as any ILFSC, this means that the LZ compression algorithm can compress any non-normal sequence.[20]
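To give a sense of the compression statement, the sketch below (an illustration only, not the construction used in the proofs) performs an LZ78-style parse of a binary string and uses the standard estimate c(log2 c + 1)/n for the compression ratio, where c is the number of phrases in the parse. Highly non-normal inputs, such as periodic sequences, yield ratios far below 1, while for normal sequences the ratio tends to 1.

```python
from math import log2

def lz78_ratio(s):
    """Estimate the LZ78 compression ratio of a binary string.

    The string is parsed into c distinct phrases, each a previously seen
    phrase extended by one symbol; encoding a phrase takes roughly
    log2(c) + 1 bits, giving a ratio of about c * (log2(c) + 1) / len(s).
    """
    phrases, current = set(), ""
    for ch in s:
        current += ch
        if current not in phrases:
            phrases.add(current)
            current = ""
    c = len(phrases) + (1 if current else 0)
    return c * (log2(c) + 1) / len(s)


# A periodic (hence non-normal) sequence gives a much smaller ratio than
# the binary Champernowne sequence 1 10 11 100 101 ..., which is normal;
# the latter's ratio approaches 1 as the length grows.
periodic = "01" * 50000
champernowne = "".join(format(i, "b") for i in range(1, 20000))
print(lz78_ratio(periodic), lz78_ratio(champernowne))
```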
These characterizations of normal sequences can be interpreted to mean that "normal" = "finite-state random"; i.e., the normal sequences are precisely those that appear random to any finite-state machine. Compare this with the algorithmically random sequences, which are those infinite sequences that appear random to any algorithm (and in fact have similar gambling and compression characterizations with Turing machines replacing finite-state machines).