Wednesday, April 29, 2015

"Entropy" has different meanings

I share the following quote because it expresses something I hope to get my head around.
The measure of the amount of information which communication theory provides … is called entropy. If we want to understand this entropy of communication theory, it is best first to clear our minds of any ideas associated with the entropy of physics. Once we understand entropy as it is used in communication theory thoroughly, there is no harm in trying to relate it to the entropy of physics, but the literature indicates that some workers have never recovered from the confusion engendered by an early admixture of ideas concerning the entropies of physics and communication theory. [Italics in the original.]
From: John R. Pierce, An Introduction to Information Theory: Symbols, Signals and Noise, 2nd revised edition, Dover Publications, 1980, page 80.
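
For concreteness, the entropy Pierce means here is Shannon's measure: for a source emitting symbols with probabilities p_i, the entropy is H = -Σ p_i log2(p_i), in bits per symbol. A minimal sketch in Python (the function name and the coin examples are my own illustration, not from the book):

import math

def shannon_entropy(probs):
    # Entropy in bits of a discrete probability distribution.
    # Terms with p = 0 contribute nothing, so they are skipped.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A biased coin is more predictable, so each toss conveys less.
print(shannon_entropy([0.9, 0.1]))  # ~0.469

Nothing in this definition refers to heat, temperature, or microstates; that is exactly the separation Pierce urges us to maintain before comparing it with the entropy of physics.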