- Entropies of large, complicated molecules are greater than those of smaller, simpler molecules (column 2).

- Entropies of ionic solids are larger when the bonds within them are weaker (columns 3 and 4).

- Entropy usually increases when a liquid or solid dissolves in a solvent.

- Entropy usually decreases when a gas dissolves in a liquid or solid.

====================

Shannon Entropy

H = + sum[ (n/N)*log2(N/n) ] = - sum[ (n/N)*log2(n/N) ] (it always comes out non-negative)

Where "n" is the number of times a distinct symbol occurs in a message of length N symbols, and the sum runs over the distinct symbols.

**No prior knowledge of symbol probabilities is assumed; each symbol's probability is taken to be its observed frequency n/N in the message.**

**If the symbols in a message occur with equal frequency, the Shannon entropy of that message is simply log2(number of distinct symbols). Otherwise, the entropy is lower, assuming the same number of distinct symbols is used.**

Example messages and their Shannon Entropy:

0 or 1 or 0000 or 11111, H = 0

01 or 010101 or 00001111 or 01101010 (four 0's and four 1's), then H=1

abc or abcabcabc or aabbcc or acbbca, then H=1.58

abcd or abcdabcd or abcdabcdabcdabcd or aaabbbcccddd, then H=2

Note that if abcd in the above is encoded with a=00, b=01, c=10, d=11 then H=1.
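
These example values can be checked with a few lines of Python (the function name is mine):

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """H = -sum((n/N) * log2(n/N)) over the distinct symbols, in bits per symbol."""
    N = len(message)
    return -sum((n / N) * log2(n / N) for n in Counter(message).values())

# The example messages from above:
print(shannon_entropy("11111"))          # H = 0 (prints -0.0; one symbol, no surprise)
print(shannon_entropy("01101010"))       # 1.0 (four 0's and four 1's)
print(shannon_entropy("acbbca"))         # ≈ 1.58, i.e. log2(3)
print(shannon_entropy("aaabbbcccddd"))   # 2.0
```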


Notes:

1) **Shannon Entropy H is bits per symbol.**

2) A message that repeats has the same H as if it were sent only once.

3) Physical entropy finds parallels with the total information N*H, but N*H is not the Shannon entropy H.

4) Repeating symbols in a message of the same length lowers Shannon entropy (less surprise per symbol, relative to an expectation of randomness).

5) A source sending typical English will have lower Shannon entropy (fewer surprises per symbol) due to the repetition of symbols. To cancel this effect, i.e., to make the lower entropy more objective, divide each n/N we encounter by the n/N we expected for that source; the expected n/N's should add up to 1. The standard quantity of this form is the relative entropy (Kullback-Leibler divergence): D = sum[ (n/N)*log2( (n/N) / (n/N)expected ) ], which is zero when the message matches expectations exactly.
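
The "divide by the expected n/N" idea is, in standard form, the relative entropy (Kullback-Leibler divergence); a minimal Python sketch (the function name and the expected distribution are mine):

```python
from collections import Counter
from math import log2

def relative_entropy(message: str, expected: dict) -> float:
    """D = sum (n/N) * log2((n/N) / p_expected): the extra bits per symbol
    beyond what the expected distribution predicts. Zero when the observed
    frequencies match the expectation exactly."""
    N = len(message)
    return sum((n / N) * log2((n / N) / expected[sym])
               for sym, n in Counter(message).items())

# Observed frequencies match the expected ones -> no extra surprise:
print(relative_entropy("aabb", {"a": 0.5, "b": 0.5}))   # 0.0
# All 'a' although 'a' was expected only half the time:
print(relative_entropy("aaaa", {"a": 0.5, "b": 0.5}))   # 1.0
```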

H / (number of symbols in the alphabet) may have some use. It might be the same as H / (word length), aka metric entropy.

==========

An excellent intro to quantum entropy in an (Einstein) solid can be found here:

http://hyperphysics.phy-astr.gsu.edu/hbase/therm/einsol.html

Note: the total-energy macrostate is q = 3 in the example, and the number of possible ways of getting q = 3 from N = 4 oscillators, each of which can independently hold any number of energy quanta from 0 to 3 (a microstate value), is (q+N-1)!/( q!*(N-1)! ) = 20. For large q and N this is approximately (q+N)!/(q!*N!), so

S=k*[ ln((q+N)!)-ln(q!)-ln(N!) ]

Stirling's approximation for large N, applied to a system with q=N, gives

S=k*[(q+N)*ln(q+N) - q - N - q*ln(q) + q - N*ln(N) + N ]

S= k*[ 2N*( ln(2)+ln(N) ) - 2N*ln(N) ] = k*2*ln(2)*N = 1.39*k*N

Notice that q and N here play the roles of the number of possible symbols (q+1, since zero is an energy state) and the length of the message (N), or vice versa, depending on terminology. Notice also that q and N enter the entropy symmetrically in this simple Einstein solid. They discuss q>>N, but the symbols could be reversed and the discussion would take a different perspective. HOWEVER, N>q does not make much sense physically, because q is the sum of the energies of the N oscillators. In other words, if there are 1000 possible energy states and the average state is 500, then q=500*N. If there are 3 states, 0, 1, and 2 "Joules", with equal probability, then q=N, and the physical entropy is S=2*k*ln(2)*N, as derived above.
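
The counting above can be checked numerically; a minimal Python sketch using the exact multiplicity (q+N-1)!/(q!*(N-1)!) (the function name is mine):

```python
from math import comb, log

def einstein_entropy_exact(q: int, N: int) -> float:
    """S/k = ln(multiplicity), where multiplicity = (q+N-1)! / (q! * (N-1)!)
    counts the ways to distribute q energy quanta among N oscillators."""
    return log(comb(q + N - 1, q))

# The worked example: q = 3 quanta among N = 4 oscillators
print(comb(3 + 4 - 1, 3))   # 20 microstates

# For q = N, Stirling's approximation gave S/k = 2*ln(2)*N = 1.39*N.
# The exact count approaches that as N grows:
N = 10_000
print(einstein_entropy_exact(N, N) / N)   # close to 2*ln(2) = 1.386...
```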

===

A thermodynamic state can be specified by one of two cardinal functions, internal energy or entropy, each with a reference basis: U can be a function of N, V, and S, and S a function of U, N, and V. U = sum over microstates of pi*Ei. Mass, entropy, or volume added to a system will change its U. Energy added to a system will change N, V, and S, from which U can be calculated; the energy added is also equal to the increase in U, and it can include heat Q. For an ideal gas:

**U = constant * e^[S/(c*N)] * (N/V)^(R/c) * N**, where c is heat capacity (J/K, i.e., the dQ needed for a dT — effectively a percentage J/J, since T is kinetic energy). This comes from the Wikipedia article on internal energy. Rearranging:

S = c*N*ln[ (1/constant) * (U/N) * (V/N)^(R/c) ]

S = N*[ c*ln(U/N) + R*ln(V/N) - c*ln(constant) ]

This must come out positive, i.e., U/N and V/N must be sufficiently greater than 1 in these units.

See wiki on Sackur-Tetrode equation.

Valid only for V/N >> 3E36 (for oxygen gas?), so maybe not really useful.

R = kb*Avogadro. kb is J/K per particle (heat per temperature per particle) and R is J/K per mole; like c, each can be read as a "unitless" dQ/dT (J/J). N thereby becomes just a count of "bit spaces" (nits). U/N and V/N are like the possibilities in 8 bits: 256 = 2^8, so log2(256) = 8, and 8 bits of memory can store 256 possibilities. So U/N and V/N are possibilities per memory location, and N is the number of memory locations (like bits). c and R are "base adjustments", getting energy and volume into the same entropy units.
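
A sketch of this "bits per memory location" reading, assuming the monatomic value c = (3/2)R and dropping the additive constant (all names and numbers here are mine, for illustration):

```python
from math import log

R = 8.314        # J/(mol*K); one of the "base adjustments" from the text
c = 1.5 * R      # heat capacity of a monatomic ideal gas (an assumption)

def entropy_per_particle_bits(U_per_N: float, V_per_N: float) -> float:
    """S/N in bits, from S = N*[c*ln(U/N) + R*ln(V/N)] with the constant dropped.
    Dividing by R*ln(2) converts the entropy units into bits per "memory location"."""
    return (c * log(U_per_N) + R * log(V_per_N)) / (R * log(2))

# Doubling the volume per particle adds one bit per particle:
s1 = entropy_per_particle_bits(10.0, 2.0)
s2 = entropy_per_particle_bits(10.0, 4.0)
print(s2 - s1)   # ~ 1.0 bit
```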

The internal energy of a closed system like the Earth obeys dU = dQ + dW = T*dS - p*dV (Q is heat received, W is work done on the system). Add u*dN to get the general internal energy, where u (usually written mu) is the chemical potential: the energy brought in by each particle added.

Gibbs free energy changes are evaluated holding T and p constant, not S.
