Friday, May 6, 2016

Relation of Physical to Informational Entropy, no heat death of the Universe


Shannon's entropy has units of "bits/symbol" or "entropy/symbol" (do a search on "per symbol" in Shannon's booklet). Its parallel in physical entropy is specific entropy, not total entropy. So to get the entropy of a message you multiply by the number of symbols in the message: S = N*H. In physical entropy you multiply specific entropy by the number of moles or kilograms to get total entropy.
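
A minimal sketch of that parallel in code, with made-up numbers just to show the multiplication (Python, values are placeholders):

    # Specific entropy times amount of stuff = total entropy, in both worlds.
    H = 1.0      # Shannon entropy per symbol (bits/symbol), e.g. fair coin flips
    N = 1000     # symbols in the message
    print(N * H)     # 1000 bits: S = N*H, total entropy of the message

    s = 100.0    # specific physical entropy, J/(mol*K) (placeholder value)
    n = 2.0      # moles in the sample (placeholder value)
    print(n * s)     # 200 J/K: total physical entropy of the sample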

Physical entropy with Boltzmann's constant is fundamentally unitless. Temperature is defined by the average kinetic energy per particle; it is a measure of the undirected kinetic energy. Directed kinetic energy is just particles moving in the same direction, so it has no heat character, unlike the undirected kinetic energies within a system. Instead of measuring kelvins, we could measure the average (root-mean-square) kinetic energy of the particles in joules, but for historical reasons we use kelvins. Boltzmann's constant would then be joules/joules. Heat energy = S*T = entropy * energy/particle, so entropy's units are fundamentally "disorder of particles" and heat comes out to "disordered kinetic energy". Shannon's entropy S = N*H is "bit disorder": N = message length in symbols, H = disorder/symbol. Choosing the log base is the only remaining difference, and that is just a conversion-factor constant. By Landauer's limit, physical entropy = N*k*ln(2) where N is the number of bits. Solve for N to get the minimum number of yes/no questions needed to specify the momentum and position of every particle in the system.
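
A small sketch of that Landauer conversion (Python); Boltzmann's constant is the real SI value, the rest are example inputs:

    import math

    k = 1.380649e-23      # Boltzmann's constant, J/K (exact SI value)

    def physical_entropy_of_bits(n_bits):
        """Minimum physical entropy (J/K) corresponding to n_bits of information."""
        return n_bits * k * math.log(2)

    def bits_of_physical_entropy(S):
        """N = S/(k*ln 2): minimum number of yes/no questions the entropy represents."""
        return S / (k * math.log(2))

    print(physical_entropy_of_bits(128))   # ~1.2e-21 J/K for a 128-bit message
    print(bits_of_physical_entropy(1.0))   # ~1.0e23 bits hiding in 1 J/K of entropy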

The entropy of every large-scale comoving volume of the universe is constant; entropy is a conserved quantity there. See Weinberg's popular "The First Three Minutes". Only in engineering is the statement made that "entropy always increases", and only for an isolated system. Feynman states that it is not an accurate description; it's just words. He gives the precise equation, which does not require entropy to always increase. The problem is that there is no such thing as an isolated system in the universe: a box at any temperature emits radiation. "The heat death of the universe" is not a physical or cosmological theory and goes against astronomical observations. It is an engineering statement based on non-existent, idealized isolated systems.
=======
Follow-up email

Physical entropy is S = k*log(W), Boltzmann's formula (the log here is a natural log). There are exact equations for this quantity. Unlike information entropy, physical entropy is very precise, not subject to semantics.

Specific physical entropy can be looked up in tables for various liquids, solids, and gases at standard conditions. You multiply it by the number of moles in your beaker to get the physical entropy.
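
A hedged example of that lookup-and-multiply (Python), assuming a 250 mL beaker of water and a rounded tabulated value of about 70 J/(mol*K) for liquid water's standard molar entropy (check a thermodynamic table for the exact figure):

    s_molar = 70.0            # J/(mol*K), approximate standard molar entropy of liquid water
    volume_mL = 250.0         # assumed beaker size
    moles = volume_mL / 18.0  # ~18 g/mol and ~1 g/mL for water -> ~13.9 mol

    S_total = moles * s_molar
    print(S_total)            # ~970 J/K of physical entropy in the beaker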

The kinetic energy of a particle that contributes to temperature depends only on its translational energy, but that energy is constantly exchanging with rotational and vibrational modes, which siphon energy off into stores, somewhat like a flywheel or a swing. These stored energies make heat capacity different for different materials. If they did not exist, entropy would be a simple calculation for all materials, depending only on the spatial arrangement of the atoms and their momenta (i.e. temperature). Physical entropy is a difficult concept primarily because these other energies get in the way.
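
A rough sketch of that bookkeeping using ideal-gas equipartition (Python); vibration is ignored here, which is a reasonable approximation near room temperature. Only the translational part shows up as temperature, while the extra modes raise the heat capacity:

    R = 8.314  # gas constant, J/(mol*K)

    def cv_molar(translational=3, rotational=0):
        """Molar heat capacity C_v = (f/2)*R for f quadratic degrees of freedom."""
        f = translational + rotational
        return f / 2 * R

    print(cv_molar(3, 0))  # monatomic gas (e.g., argon): ~12.5 J/(mol*K)
    print(cv_molar(3, 2))  # diatomic gas (e.g., N2):     ~20.8 J/(mol*K)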

There's something interesting about life and entropy: life may not be reducing entropy on Earth overall, but when we create stronger structures, the atoms sit in more rigid positions, which reduces their entropy. So the structures we call life have a real entropy that is lower than that of the non-life ingredients from which we came. This extends to our machines: solar cells, electrical motors using metals for electron movement, carbon-fiber structures, nanotubes, steel buildings, steel cars, and CPUs all have two things in common: 1) we had to remove oxygen from ores to get materials with roughly 10x lower entropy per unit mass than their original state; 2) acquiring energy (solar cells) to move matter (electrical motors) to build strong structures for protection, to last a long time with a reliable memory, and to think about how to repeat this process efficiently (CPUs) is what evolution is all about. DNA is an unbelievably strong, crystal-like structure with very low entropy; bones and teeth too. However, in making these low-entropy structures we release oxygen and CO2, and as gases they have much higher entropy, so it is not clear that the entropy on Earth is being reduced. You can trace the fact that the things important to life have lower entropy so consistently that you have to wonder if lowering entropy is the whole point of life. One other thing: lower entropy, knowing what state something is in, means you have better command, control, and cooperation.

So my view is that life is part of the process of Earth cooling off, like a snowflake forming. A HUGE factor in creating life on Earth is the MOON; Isaac Asimov talked about this. The moon exerts a cyclic force on the tides and the mantle, and it is known that interesting (non-random) things happen when a thermodynamic system is driven by a non-random force. For example, objects of different shapes (or even a single shape, like spheres) pack more loosely if simply poured in than if you shake the container as you add the pieces, or shake it at the end. Being more compact as a result of the cyclic force means reduced entropy.

Another aspect of this (probably irrelevant) is that the moon is adding energy to the Earth. The frictional forces in the tides and the mantle come at a cost of lower and lower rotational energy in the Earth. The moon moves to a higher orbit each year, so it is receiving energy while the Earth spins a little slower. Water molecules also lose rotational energy as a snowflake forms.

The Earth's seasons, caused by the tilt that came from a happenstance collision, have been important to creating life. The cyclic force of the moon has increased the number of concentrated ore deposits that economic life depends on a great deal. We are tapping into the lower entropy created by the moon. Our structures have lower entropy, but we increase external entropy not only through the gases but also by spreading out what were concentrated ores into machines that are all over the place, although the gravitational energies that dictate that positional entropy are small compared to the chemical bond changes.

So life appears to be no more than the physical dynamics of Earth "cooling off", just like a snowflake that results from cooling off. Net heat energy and entropy are emitted to the environment. The Earth radiates about 17 photons in random directions for every incoming unidirectional photon, which is a lot of excess entropy being emitted, creating the opportunity for lower entropy here.
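
A back-of-the-envelope check on that photon count (Python), assuming round blackbody temperatures for the Sun and the Earth; the average blackbody photon carries roughly 2.7*k*T of energy, so for the same energy flux the photon count scales as 1/T:

    T_sun = 5800.0    # K, effective temperature of sunlight (assumed round number)
    T_earth = 288.0   # K, effective emission temperature of the Earth (assumed)

    # Energy in ~ energy out, and photons per unit energy ~ 1/T, so:
    print(T_sun / T_earth)   # ~20 photons out per photon in, same order as the ~17 above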

You're right to ask what entropy/symbol means. There is a large semantic barrier that engulfs and obscures Shannon's simple entropy equation. Shannon himself did not help by always calling it "entropy" without specifying that he meant "specific entropy", which means it is on a "per something" basis. Specific entropy is to entropy what density is to weight.

"entropy/symbol" means the logarithm base in Shannon's entropy equation H = sum of -p*log(p) is not being specified and it is being left up in the air as a reminder that it does not have to always be in bits, or it can mean the log base is equal to the number of unique symbols in the message. In this latter case it is called normalized entropy (per symbol) that varies from 0 to 1, which indicates a nice pure statistical measure. Using normalized entropy, any group of distinct symbols can be taken at face value and you can ask the question "how disordered per symbol is it relative to itself from 0 to 1". Otherwise entropy/symbol  varies  on how many symbols you claim the data has (is it 8 bits or 1 byte?) and on what log base you choose.    

For example, binary data has 2 symbols, and if the 0's and 1's occur with equal frequency, then the Shannon H (specific) "entropy" of 128 bits of data is 1 bit/symbol. Since the log base I've chosen is 2, this is also its entropy/symbol. If you call the symbols bits, then it is bits/bit, or entropy/bit. bits/bit = (bit variation)/(number of bit symbols) = variation/symbol, a true objective statistic. Its total entropy is S = N*H = 128 symbols * 1 bit/symbol = 128 bits, or just 128 entropy. Similarly, "mean" and "variance" as statistical measures have no units.
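
The same 128-bit example worked through with the plain -p*log(p) sum (Python):

    import math

    counts = {"0": 64, "1": 64}                  # equal numbers of 0's and 1's
    N = sum(counts.values())                     # 128 symbols
    probs = [c / N for c in counts.values()]     # [0.5, 0.5]

    H = -sum(p * math.log2(p) for p in probs)    # 1.0 bit/symbol
    print(H, N * H)                              # total entropy S = N*H = 128 bits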

Now let's say we can't see each bit, but only the ASCII characters defined by groups of 8 bits. There are 256 possible characters, and there are 128/8 = 16 of them in this particular message. We can't see or know anything about the 1's and 0's, but we can still calculate the entropy in bits. Since there are only 16 of them (let's say each symbol occurs once, to keep it random), the straight entropy equation, ignorant that there might be 256 possibilities, gives log2(16) = 4 bits/symbol. This time it can't be called entropy/symbol without specifying bits, and the denominator is not bits either, but it is a Shannon entropy. The total entropy is then 4*16 = 64 bits. In other words, the data is now encoded (compressed), because I implicitly created a look-up table with 16 symbols and can specify any one of the 16 with 4 bits. With only bits, my lookup table is just 1 and 0 instead of 16 symbols. At the cost of the memory to hold the table, I was able to shorten the representation of the message. If the log base is 16, then log16(16) = 1 = 1 character/character = 1 entropy/character. To change from log base 10 to any other base, divide by the log of the new base: log16(x) = log(x)/log(16).

But since we know there might have been 256 different characters, we could have chosen log base 256, in which case the Shannon H entropy = 0.5 bytes/symbol. This gives an information entropy of 0.5*16 = 8 bytes. Since 8 bytes is 64 bits, the minimal amount of PHYSICAL entropy in a system that would be needed to encode this is 64*k*ln(2). See Landauer's limit.
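
Here is the 16-character example evaluated with all three log bases (2, 16, and 256), plus the Landauer minimum for its 64 bits of information; this just restates the arithmetic above (Python):

    import math

    n_symbols = 16
    p = 1 / n_symbols                  # each of the 16 distinct symbols occurs once

    def H(base):
        """Shannon entropy per symbol for 16 equiprobable symbols; note log_b(x) = log(x)/log(b)."""
        return -n_symbols * p * math.log(p, base)

    print(H(2),   16 * H(2))     # 4.0 bits/symbol   -> 64 bits total
    print(H(16),  16 * H(16))    # 1.0 char/char     -> 16 characters total
    print(H(256), 16 * H(256))   # 0.5 bytes/symbol  -> 8 bytes total (= 64 bits)

    k = 1.380649e-23             # Boltzmann's constant, J/K
    print(64 * k * math.log(2))  # ~6.1e-22 J/K minimum physical entropy (Landauer)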

I know the above is confusing, but the equation is simple. I just wanted to show how badly semantics messes up everyone's understanding of information entropy.

There's an online entropy calculator that I recommend playing with to understand information entropy, if the -p*log(p) equation and my comments above aren't enough on their own.
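
If you'd rather not hunt for the website, a minimal stand-in for such a calculator might look like this (Python; counting symbols from a string and using log base 2 are my choices here):

    import math
    from collections import Counter

    def entropy_report(text):
        """Return (H in bits/symbol, total entropy N*H in bits) for any string."""
        counts = Counter(text)
        N = len(text)
        probs = [c / N for c in counts.values()]
        H = -sum(p * math.log2(p) for p in probs)
        return H, N * H

    print(entropy_report("0101010101010101"))   # (1.0, 16.0): two equally frequent symbols
    print(entropy_report("aaaaaaaaaaaaaaab"))   # (~0.34, ~5.4): nearly all one symbol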
