InformationTheory

Claude E. Shannon's 1948 paper 'A Mathematical Theory of Communication' is the basis and substance of Information Theory.

Shannon defined a 'sender' and a 'receiver' of a 'signal' on a 'channel'.

Information is always a measure of the decrease of uncertainty at a receiver.

Specifically, Shannon defined information as the reduction in the uncertainty of the receiver about the state of the sender.

He showed that information can be measured in discrete 'bits', where one bit is the information gained by resolving a choice between two equally likely alternatives.
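As a rough illustration (not part of the original page), the information carried by an event of probability p is its self-information, -log2(p) bits; a minimal Python sketch:

```python
# Self-information in bits: the rarer the event, the more information it carries.
from math import log2

def self_information_bits(p: float) -> float:
    """Information gained, in bits, by observing an event of probability p."""
    return -log2(p)

print(self_information_bits(0.5))    # 1.0  -- a fair coin flip resolves one bit
print(self_information_bits(0.125))  # 3.0  -- a 1-in-8 event carries three bits
```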

He formalized information transfer over both noiseless and noisy channels.
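A hedged sketch of the noisy-channel idea for the simplest case, the binary symmetric channel, where each transmitted bit is flipped with probability p (the formula C = 1 - H(p) is standard; the function names below are mine):

```python
# Capacity of a binary symmetric channel (BSC): C = 1 - H(p),
# where H is the binary entropy of the flip probability p.
from math import log2

def binary_entropy(p: float) -> float:
    """Entropy in bits of a binary source with probabilities p and 1-p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Usable information, in bits per channel use, with flip probability p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0  -- a noiseless channel carries one full bit per use
print(bsc_capacity(0.11))  # ~0.5 -- noise halves the usable information
print(bsc_capacity(0.5))   # 0.0  -- pure noise carries no information at all
```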

His measure has the same mathematical form as entropy (disorder, S) in thermodynamics, so he called his information measure entropy. Entropy is most often read as a loss of information yielding uncertainty; an increase in certainty or order is considered negative entropy, or increased information.

S = k log(W) is the entropy of a system with W equally probable states, where k is Boltzmann's constant.
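For comparison (a sketch of mine, not from the page): Shannon's entropy is H = -sum of p_i log2(p_i) bits, and for W equally likely states it reduces to H = log2(W), the same form as Boltzmann's S = k log(W) apart from the constant and the base of the logarithm.

```python
# Shannon entropy in bits; for W equiprobable states it equals log2(W).
from math import log2

def shannon_entropy_bits(probs):
    """Average information in bits of a source with the given symbol probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy_bits([1/8] * 8))   # 3.0  -- eight equal states: log2(8) bits
print(shannon_entropy_bits([0.9, 0.1]))  # ~0.469 -- a predictable source carries less
```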

http://en.wikipedia.org/wiki/Entropy

In normalized terms, the information gained plus the uncertainty that remains equals one.
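One way to read this (an interpretation of mine, not spelled out on the page): Shannon's relative entropy H/H_max and redundancy 1 - H/H_max always sum to one, so normalized uncertainty plus normalized information (structure) is one.

```python
# Relative entropy (remaining uncertainty) and redundancy (information already
# pinned down) of a source, normalized so that they sum to one.
from math import log2

def relative_entropy_and_redundancy(probs):
    h = -sum(p * log2(p) for p in probs if p > 0)
    h_max = log2(len(probs))        # maximum entropy: all symbols equally likely
    relative = h / h_max
    return relative, 1.0 - relative

rel, red = relative_entropy_and_redundancy([0.7, 0.2, 0.1])
print(rel, red, rel + red)          # the two parts always sum to 1.0
```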

In general systems theory, the 2nd Law of the LawsOfThermodynamics says that the entropy of a system tends to increase. This leads to Euler's formula for heat dissipation and Schrödinger's equation of quantum possibilities, but in quantum systems the interactions are non-linear and completely elastic, and fail to progress toward equilibria despite this tendency.

http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html


Universal information systems may be enumerated as networks of sender-receivers that logically transform information with a delay.


Is disinformation relevant to information theory?


InformationPhysics