Redundancy (information theory)

In information theory, redundancy measures the fractional difference between the entropy H(X) of an ensemble X and its maximum possible value, the logarithm of the size of the alphabet.[1][2] Informally, it is the amount of wasted "space" used to transmit certain data. Data compression is a way to reduce or eliminate unwanted redundancy, while forward error correction is a way of adding desired redundancy for purposes of error detection and correction when communicating over a noisy channel of limited capacity.
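The fractional definition above can be sketched in a few lines of code: compute the Shannon entropy of a distribution, divide by the maximum entropy log2(n) for an alphabet of n symbols, and subtract from one. This is a minimal illustration, not taken from the article; the distributions used are arbitrary examples.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """Fractional redundancy: 1 - H(X) / log2(|alphabet|)."""
    h_max = math.log2(len(probs))  # maximum possible entropy
    return 1 - entropy(probs) / h_max

# A uniform distribution attains maximum entropy, so redundancy is 0;
# a skewed distribution wastes some of its alphabet's capacity.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]
print(redundancy(uniform))  # 0.0
print(redundancy(skewed))   # about 0.32
```

A source with redundancy near 1 is highly compressible, while a redundancy of 0 means every symbol already carries the maximum possible information.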

See also

Minimum redundancy coding
Huffman encoding
Data compression
Hartley function
Negentropy
Source coding theorem
Overcompleteness

References

Reza, Fazlollah M. (1994) [1961]. An Introduction to Information Theory. New York: Dover [McGraw-Hill]. ISBN 0-486-68210-2.

Schneier, Bruce (1996). Applied Cryptography: Protocols, Algorithms, and Source Code in C. New York: John Wiley & Sons, Inc. ISBN 0-471-12845-7.

Auffarth, B.; Lopez-Sanchez, M.; Cerquides, J. (2010). "Comparison of Redundancy and Relevance Measures for Feature Selection in Tissue Classification of CT images". Advances in Data Mining. Applications and Theoretical Aspects. Springer. pp. 248–262. CiteSeerX 10.1.1.170.1528.