Tuesday, December 17, 2013

Claude Shannon and Information Theory

Claude Shannon was merely trying to figure out how much information could be transmitted over a phone line, yet his work led to the world we live in today. Shannon's measure of randomness is the same function as Boltzmann's entropy! His channel-coding theorem showed that transmission errors can be corrected, which is why computers can operate reliably! His entropy is a thermodynamic entropy as well as an information entropy. This led to an understanding of the relationship among entropy, energy and information, which in turn shed light on how computers and humans think. The entropic arrow of time of an irreversible process applies whether the process is informational or physical! When information theory consumed thermodynamics, it killed Maxwell's demon and its promise of perpetual motion! Everything, even us, all matter and energy, is subject to the law of information!

Schrodinger's cat is out of the bag! The essential function of living beings is the consumption, processing, preservation and duplication of information! Information is responsible for all life on Earth! It led to understanding the mysterious molecule DNA, whose sole purpose is to store information, protect it from dissipation and duplicate it when necessary! Being alive is simply flouting entropy for a short while by preserving information. You can even make a computer out of DNA; it's all linked!

Communication is tied to chance: true information consists of unpredictable, random events. The essence of a message is its improbability, and entropy increases just as it does under the second law of thermodynamics. Information theory consumed thermodynamics; thermodynamics is a special case of information theory.
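Shannon's measure of randomness fits in a few lines of code. Here is a minimal sketch in Python (the function name is my own) showing why a fair coin is "maximum information" while a biased, predictable one carries less:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.

    Same functional form as the Gibbs/Boltzmann entropy
    S = -k_B * sum(p * ln(p)), up to the constant and the log base.
    Terms with p == 0 contribute nothing, so they are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: one full bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is mostly predictable, so each toss
# carries far less information (about 0.47 bits).
print(shannon_entropy([0.9, 0.1]))

# A certain outcome is no information at all.
print(shannon_entropy([1.0]))        # 0.0 (negated -0.0)
```

The improbable outcome is exactly the informative one: the rarer an event, the more its occurrence tells you, which is the sense in which "the essence of a message is its improbability."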
