Every non-zero real number is the product of two normal numbers. This follows from the general fact that every non-zero number is the product of two numbers from a set X if the complement of X has measure 0.

If x is normal in base b and a ≠ 0 is a rational number, then x · a is also normal in base b.[17]

If A = {a₁, a₂, a₃, …} is dense (for every α < 1 and for all sufficiently large n, |A ∩ {1, …, n}| ≥ n^α) and the base-b expansions of the elements of A are concatenated after the radix point, then the resulting number, formed by concatenating the elements of A, is normal in base b (Copeland and Erdős 1946). From this it follows that Champernowne's number is normal in base 10 (since the set of all positive integers is obviously dense) and that the Copeland–Erdős constant is normal in base 10 (since the prime number theorem implies that the set of primes is dense).
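
As a rough illustration (not a proof, since no finite prefix can establish normality), the following Python sketch builds finite prefixes of Champernowne's number and of the Copeland–Erdős constant and tallies single-digit frequencies; the helper names are hypothetical.

```python
from collections import Counter

def concat_digits(numbers):
    """Concatenate the base-10 expansions of the given integers."""
    return "".join(str(n) for n in numbers)

def primes_up_to(limit):
    """Simple sieve of Eratosthenes."""
    sieve = [True] * (limit + 1)
    sieve[:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [i for i, is_prime in enumerate(sieve) if is_prime]

# Prefix of Champernowne's number: 0.123456789101112...
champernowne = concat_digits(range(1, 100_000))
# Prefix of the Copeland-Erdos constant: 0.23571113171923...
copeland_erdos = concat_digits(primes_up_to(1_000_000))

for name, digits in [("Champernowne", champernowne), ("Copeland-Erdos", copeland_erdos)]:
    freq = Counter(digits)
    total = len(digits)
    # Each digit 0-9 appears with frequency near 1/10 on these prefixes.
    print(name, {d: round(freq[d] / total, 3) for d in "0123456789"})
```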

A sequence is normal if and only if every block of equal length appears with equal frequency. (A block of length k is a substring of length k appearing at a position in the sequence that is a multiple of k: e.g. the first length-k block in S is S[1..k], the second length-k block is S[k+1..2k], etc.) This was implicit in the work of Ziv and Lempel (1978) and made explicit in the work of Bourke, Hitchcock, and Vinodchandran (2005).
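
The block counting in this characterization is over aligned (non-overlapping) positions. A minimal Python sketch of such a count, using a hypothetical helper name, might look like:

```python
from collections import Counter

def aligned_block_frequencies(digits: str, k: int):
    """Relative frequencies of length-k blocks starting at positions
    0, k, 2k, ... (i.e. non-overlapping, aligned blocks)."""
    blocks = [digits[i : i + k] for i in range(0, len(digits) - k + 1, k)]
    counts = Counter(blocks)
    total = len(blocks)
    return {block: count / total for block, count in counts.items()}

# Example: frequencies of aligned length-2 blocks in a short binary prefix.
print(aligned_block_frequencies("0110100110010110", 2))
```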

A number is normal in base b if and only if it is simply normal in base b^k for every integer k ≥ 1. This follows from the previous block characterization of normality: since the nth block of length k in its base-b expansion corresponds to the nth digit in its base-b^k expansion, a number is simply normal in base b^k if and only if blocks of length k appear in its base-b expansion with equal frequency.
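
A small Python sketch of the correspondence used in this argument: grouping the base-b digits of a finite prefix into aligned length-k blocks yields exactly the base-b^k digits. The helper names are illustrative, and the example assumes the digit count is a multiple of k.

```python
def base_b_digits(n: int, b: int):
    """Base-b digits of a non-negative integer, most significant first."""
    digits = []
    while n:
        n, r = divmod(n, b)
        digits.append(r)
    return digits[::-1] or [0]

def group_blocks(digits, b, k):
    """Interpret consecutive aligned length-k blocks of base-b digits
    as single digits in base b**k."""
    return [
        sum(d * b ** (k - 1 - j) for j, d in enumerate(digits[i : i + k]))
        for i in range(0, len(digits) - k + 1, k)
    ]

# The nth length-2 block of the base-2 digits of 45 (101101)
# is the nth base-4 digit of 45 (231).
bits = base_b_digits(45, 2)      # [1, 0, 1, 1, 0, 1]
print(group_blocks(bits, 2, 2))  # [2, 3, 1]
print(base_b_digits(45, 4))      # [2, 3, 1]
```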

A number is normal if and only if it is simply normal in every base. This follows from the previous characterization of base b normality.

A number is b-normal if and only if there exists an infinite set of positive integers m₁ < m₂ < m₃ < ⋯ such that the number is simply normal in base b^m for every m in that set.[18] No finite set suffices to show that the number is b-normal.

The set of normal sequences is closed under finite variations: adding, removing, or changing a finite number of digits in any normal sequence leaves it normal. Similarly, if a finite number of digits are added to, removed from, or changed in any simply normal sequence, the new sequence is still simply normal.

A finite-state gambler (a.k.a. finite-state martingale) is a finite-state machine over a finite alphabet Σ, each of whose states is labelled with percentages of money to bet on each digit in Σ. For instance, for an FSG over the binary alphabet Σ = {0, 1}, the current state q bets some fraction q₀ ∈ [0, 1] of the gambler's money on the bit 0, and the remaining 1 − q₀ fraction of the gambler's money on the bit 1. The money bet on the digit that comes next in the input (total money times percent bet) is multiplied by |Σ|, and the rest of the money is lost. After the bit is read, the FSG transitions to the next state according to the input it received. An FSG d succeeds on an infinite sequence S if, starting from $1, it makes unbounded money betting on the sequence; i.e., if

lim sup_{n→∞} d(S[1..n]) = ∞,

where d(S[1..n]) is the amount of money the gambler d has after reading the first n digits of S (see limit superior).
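
A minimal Python simulation of such a gambler over the binary alphabet, assuming a hand-rolled representation (bet fractions and a transition table per state; none of these names come from the literature):

```python
def run_fsg(bets, transitions, start, sequence, capital=1.0):
    """Simulate a finite-state gambler over the binary alphabet {0, 1}.

    bets[q]        -- fraction of the current capital that state q bets on bit 0
                      (the rest is bet on bit 1)
    transitions[q] -- (next state on reading 0, next state on reading 1)
    """
    state = start
    history = []
    for bit in sequence:
        stake = bets[state] if bit == 0 else 1.0 - bets[state]
        capital *= 2 * stake          # the winning stake is multiplied by |alphabet| = 2
        state = transitions[state][bit]
        history.append(capital)
    return history

# A 1-state gambler that always bets 90% on 0: it succeeds on the
# heavily biased all-zero sequence but loses money on an alternating one.
bets = {0: 0.9}
transitions = {0: (0, 0)}
print(run_fsg(bets, transitions, 0, [0] * 10)[-1])     # grows like 1.8**n
print(run_fsg(bets, transitions, 0, [0, 1] * 5)[-1])   # shrinks: 1.8 * 0.2 < 1
```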

A finite-state compressor is a finite-state machine with output strings labelling its state transitions, including possibly the empty string. (Since one digit is read from the input sequence for each state transition, it is necessary to be able to output the empty string in order to achieve any compression at all). An information lossless finite-state compressor (ILFSC) is a finite-state compressor whose input can be uniquely recovered from its output and final state. In other words, for a finite-state compressor C with state set Q, C is information lossless if the function f: Σ* → Σ* × Q, mapping the input string of C to the output string and final state of C, is 1–1. Compression techniques such as Huffman coding or Shannon–Fano coding can be implemented with ILFSCs. An ILFSC C compresses an infinite sequence S if

lim inf_{n→∞} |C(S[1..n])| / n < 1,

where |C(S[1..n])| is the number of digits output by C after reading the first n digits of S. The compression ratio (the limit inferior above) can always be made to equal 1 by the 1-state ILFSC that simply copies its input to the output.
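
As an illustrative (hypothetical) example of an ILFSC, the following two-state Python sketch encodes the input blocks 1 → 11, 00 → 0, and 01 → 10; the input can be recovered from the output together with the final state, and the machine compresses a heavily repetitive sequence while failing to compress balanced input.

```python
def run_ilfsc(sequence):
    """A 2-state information lossless finite-state compressor (illustration only).

    From the start state: reading 1 outputs "11"; reading 0 outputs the empty
    string and moves to a waiting state.  In the waiting state: a second 0 is
    encoded as "0", a 1 as "10".  The input is recoverable from the output
    plus the final state (a pending 0 iff the machine ends in the waiting state).
    """
    output = []
    waiting = False               # True iff a single 0 has been read but not yet encoded
    for bit in sequence:
        if not waiting:
            if bit == 1:
                output.append("11")
            else:
                waiting = True    # output the empty string
        else:
            output.append("0" if bit == 0 else "10")
            waiting = False
    return "".join(output)

def compression_ratio(sequence):
    return len(run_ilfsc(sequence)) / len(sequence)

print(compression_ratio([0] * 1000))    # 0.5: the highly non-normal all-zero sequence is compressed
print(compression_ratio([0, 1] * 500))  # 1.0: no compression on this input
print(compression_ratio([1] * 1000))    # 2.0: this particular machine expands the all-ones sequence
```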

Agafonov showed an early connection between finite-state machines and normal sequences: every infinite subsequence selected from a normal sequence by a regular language is also normal. In other words, if one runs a finite-state machine on a normal sequence, where each of the finite-state machine's states is labeled either "output" or "no output", and the machine outputs the digit it reads next after entering an "output" state but does not output the next digit after entering a "no output" state, then the sequence it outputs will be normal.[19]
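
A small Python sketch of this selection rule, with a hypothetical two-state selector that picks out each digit immediately following a 1 (by Agafonov's theorem, applying such a selector to a normal sequence yields a normal sequence):

```python
def select_subsequence(transitions, output_states, start, sequence):
    """Select from `sequence` every digit that is read while the machine is
    in an 'output' state (oblivious selection by a finite automaton)."""
    state = start
    selected = []
    for digit in sequence:
        if state in output_states:
            selected.append(digit)
        state = transitions[state][digit]
    return selected

# Hypothetical 2-state selector over {0, 1}: select the digit that follows a 1.
transitions = {"saw_0": {0: "saw_0", 1: "saw_1"},
               "saw_1": {0: "saw_0", 1: "saw_1"}}
print(select_subsequence(transitions, {"saw_1"}, "saw_0", [0, 1, 1, 0, 1, 0, 0, 1]))
# -> [1, 0, 0]: the digits immediately after each 1 in the prefix
```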


A deeper connection exists with finite-state gamblers (FSGs) and information lossless finite-state compressors (ILFSCs).


Schnorr and Stimm showed that no FSG can succeed on any normal sequence, and Bourke, Hitchcock and Vinodchandran showed the converse. Therefore:

A sequence is normal if and only if there is no finite-state gambler that succeeds on it.

Ziv and Lempel showed:

A sequence is normal if and only if it is incompressible by any information lossless finite-state compressor.

(they actually showed that the sequence's optimal compression ratio over all ILFSCs is exactly its entropy rate, a quantitative measure of its deviation from normality, which is 1 exactly when the sequence is normal). Since the LZ compression algorithm compresses asymptotically as well as any ILFSC, this means that the LZ compression algorithm can compress any non-normal sequence.[20]
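
As a heuristic illustration only (Python's zlib is LZ77-based and is used here merely as a stand-in for the LZ78-style compressor in the theorem, and finite prefixes prove nothing about normality), a periodic digit string compresses far more than a Champernowne-style prefix:

```python
import zlib

def compressed_ratio(digits: str) -> float:
    """Size of the zlib-compressed prefix relative to the original (heuristic)."""
    data = digits.encode("ascii")
    return len(zlib.compress(data, 9)) / len(data)

champernowne = "".join(str(n) for n in range(1, 20000))  # 0.123456789101112...
periodic = "142857" * (len(champernowne) // 6)           # repeating digits of 1/7, very non-normal

print(round(compressed_ratio(champernowne), 3))  # compresses relatively little
print(round(compressed_ratio(periodic), 3))      # compresses drastically
```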


These characterizations of normal sequences can be interpreted to mean that "normal" = "finite-state random"; i.e., the normal sequences are precisely those that appear random to any finite-state machine. Compare this with the algorithmically random sequences, which are those infinite sequences that appear random to any algorithm (and in fact have similar gambling and compression characterizations with Turing machines replacing finite-state machines).

Champernowne constant

De Bruijn sequence

Infinite monkey theorem

The Library of Babel

"Normal number". MathWorld.

Weisstein, Eric W.