Saturday, April 30, 2016

Major problem with a constant-quantity coin like bitcoin

Attempted RECAP:

If savers of a constant-quantity coin loan it out like dollars (capital investment or whatever), and if it is the world's default currency like the dollar, major problems will arise if they save more than they spend.

First consider the problem if more people simply save the coin and use it more in transactions, without loaning it out. The number of coins usable for transactions dries up. Even if Keynesian economics did not apply (if wages and prices were not sticky), there would be a problem. Or rather, wages and prices become sticky because they would have to keep decreasing in terms of the currency. Long-term contracts could not be written in terms of the coin. Stability of prices and wages does not result from a constant-quantity currency, but from a coin that expands with the economy (and contracts in a downturn). This problem is also the appeal of a constant-quantity coin to savers: they get more than their initial purchase for nothing. A deflating coin does not benefit everyone; it only means past savers get financial power over future workers and future savers to a degree that is GREATER than the work they provided in order to make the initial purchase.

But the situation is MUCH worse if they DO invest in society and obtain interest in return for loans. The key problem is that loans are normally made at a rate that is greater than the increase in society's productive capacity that is supposed to result from the loan. Even loans in an expanding currency lead to an exponential increase in savings. A shilling loaned out since Jesus at 6% would have been worth a block of gold the size of the solar system (minus Uranus and Pluto)... back in 1769.
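The raw exponential arithmetic behind that old illustration is easy to check. A minimal sketch (this only verifies the scale of the growth factor, not the gold-volume comparison; the 6% rate and the 1,769-year span are taken from the sentence above):

```python
# 1 shilling loaned at 6% compound interest, compounding yearly, from year 0 to 1769
principal = 1.0
rate = 0.06
years = 1769
value = principal * (1 + rate) ** years
print(f"{value:.2e} shillings")  # on the order of 10**44
```

No real economy's productive capacity grows by a factor of 10^44, which is the post's point about interest rates exceeding productivity growth.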

Savers of a constant-quantity coin become excessively powerful over late adopters and non-savers if they loan it out at interest. At some point, workers will abandon the coin in favor of another coin, which is, in effect, enforcing an inflating coin. Workers need a coin whose supply inflates just enough to keep up with the coin's use, but not enough to cause price inflation. But like past savers, they will be guided towards a deflating coin. Changing coins is never easy, also because a currency gains infrastructure and thereby monopoly status. Bitcoin, like the dollar, has an infrastructure that pushes workers towards adoption even if it were not deflating. The existence of other cryptocoins may not increase the ability to switch, so the above problems may be unavoidable.

Once monopolies like Google, FB, YouTube, and Amazon are established, switching is not easy unless the underlying technology changes everything (Amazon replacing Walmart, Wikipedia replacing encyclopedias). The difficulty of switching is the degree to which abuse will occur. The free market promotes monopolies, and is thereby not really free, as a result of people needing a single standard in many things. Lack of a standard is chaotic in a bad way.

A constant-quantity coin is really a deflating coin, which adds the problem of it acquiring monopoly status without the macro-economic merit that a constant-value coin would have.

Even miners saving transaction fees is a fundamental problem, as it forces the coin's value to increase beyond the amount they invested to acquire it. A saver's unjustified gain from deflation is a late adopter's unjustified loss.

Loaning the coin amplifies this effect. Debtors will be pushed by lenders into the coin for this reason, the same way the IMF and World Bank pushed 3rd-world countries into loans denominated in an external currency (they pushed the dollar in Latin America and Asia, and the Euro in Greece, Latvia, Iceland, and Ireland). As those economies collapsed, they did not have the ability to expand the quantity (reduce the value) of the coin in which the loans were made, which is a type of soft default. If they had been allowed to take the loans in their own currency, the lenders would have made DARN sure the society would succeed as a result of the loans, instead of pushing them into selling off natural resources and taxpayer-built infrastructure. Loans should be like stock investments, where the lender has a vested interest in the success of the debtor, preventing the debtor from becoming enslaved. A contract with such one-sided terms should not be legal (heads you and I win, tails I win and you lose). A free market with these types of loans and no government oversight has a "benefit" and justice served: population reduction, where the lenders will eventually have killed off the enslaved and then have to start working themselves.

Bitcoin could be used mainly as a store of value. For example, banks and rich people could choose it for transfers without increasing its use in marketplace transactions. Miners would sell profits as they are gained for a currency that expands and contracts with society's productive capacity and thereby has a stable value for wages and purchases. This still results in early adopters unjustly gaining at the expense of late adopters.

Here are some related tweets:
Like every other holder, I hold bitcoin because I want to be in the ruling class, getting something for nothing.

Wanting to get something for nothing is a ruling class moral.

If BTC succeeds, we will be to that extent. We gain at later-adopters' expense, getting more of world's pie w/o contributing to it.

Not like every other holder. That's your reasoning but it's not universal.

 Banks can't adopt a public, open, borderless network like bitcoin. The regulations prevent them from doing so
They can buy it up before the masses, restricting access to it, driving it up, driving up control.

 If the banks adopt faster than the masses, change is questionable. They did it in the late 1800's with gold.

Tuur Demeester
I don't hold BTC to be in the ruling class — I hold it because I want to be free...

Your freedom comes at expense of late-adopters' financial freedom. Constant-quantity coin is ultimate banker CTRL of poor.

Great irony = BTC holders imagining moral or macro-economic benefit, but I shouldn't ascribe selfish motives to selfish outcomes.

World's problem is inefficiency of muscle, brain, bone, and photosynthesis. Solution: machines displace biology. It's accelerating.

 Solar cells 30x more efficient than plants. Doubling fast. Using brains in future = like using shovels to dig. Not good end 4 DNA.

How are coin coders/miners/holders different from gov/banks/voters? No difference?

Miners/coders/holders prop each other up at expense of new entrants. Ladder-up=holding coin & loaning it out @ 5% BTC

 Constant currency may work in Germany, but not most places using anglo-dutch lending.

1) It must be at least mixed as transactns will be miners' fees. No trans=no protection/value. Making more transactions possible...

2) is a bad thing because that allows it to be used for wages & prices instead of large asset transfer. If wages & prices made ..

3)  in BTC then it becomes entrenched & usable as a weapon against non-wealthy. Or rather, loans in it should not exceed debtors...

4) assets, at least on macro scale so that debtors are not in a required downward spiral unless ruining debtors is a goal. Loans

5) should be an at-risk partnership like a stock where creditor does not succeed unless debtor succeeds. Destroying debtor at ...

6) creditor's gain wrongly makes loans diff from stocks: heads creditor wins, tails creditor wins = anglo/dutch loans, not in Germany.

 It does nothing fundamentally different. Merchants-workers-customers transacting in it will run afoul of coders/miners/holders..

1st group wants stability or inflation to fight against the "bankers" trying to make it deflate.

holders="immoral" on accident if it becomes default transaction currency instead of mainly large transfer payments like gold.

Once mining stops, if you do not pay your taxes (fees), you're isolated from transactions. ~same

 A world BTC w/ infrastructure-based-monopoly on currency & held by banks loaning @ % = unmitigated disaster due to fix-quantity.

They would continue to gain BTC, driving its value up, needing to loan out less & less = parasites choking off economic activity

This is called "the magic of compound interest" applied to a fixed quantity coin, which is worse than

Switch just as hard as now if govs required tax payment in BTC (banking/Satoshi lobbies) & 90% merchants & apps required it.

$ is dominant not by force but like youtube, VHS, Google, M$ etc monopolies: society needs a single infrastructure=bitcoin danger

So my logic applies only if "bitcoin everywhere". Switching could prevent evil, but world stuck on $ did not prevent evil.

Youtube relevance = infrastructure = BTC relevance. World not forced on $ except by financial system infrastructure.

A comment on a review of Michael Hudson's economics book "Killing The Host":

This entire book explains how Marx was clueless when it came to the dangers of finance capital displacing industrial capital. As Hudson explains, Marx was far too optimistic about capitalism. Marx thought banks would make loans to build capital infrastructure, not merely to gain access to iPhone patents, or to buy 4 years of work from 2 moderately intelligent people for $1 billion (Google, FB, Snapchat, WhatsApp, YouTube) as a way to gain market-share eyes to advance monopolies on search, social connections, messaging, and videos.

But the worst of the anglo-dutch loan situation is the lending that drives up asset prices, which stifles worker cost-efficiency and diverts what could have been tax revenue into bank profit, and the lending to take over infrastructure in order to sell it off, making a profit for the hawks and banks mainly by nullifying retiree plans and capitalizing on adverse stock market fluctuations. Notice there is no illegality in any of this detrimental behavior. There is only the absence of a functioning government that could stop macro activity that hurts the strength of us all. Good government enables and advances system-wide profit from cooperative behavior (a solution to the prisoner's dilemma) at a level above the individual transactions. Individual transactions have zero concern for their system-wide effects. A functioning healthy body results from the development of a governing brain over the system that keeps the cells from acting merely for highest profit (cancer).

Notice that the home country of Veblen (the only author Einstein liked as much as Bertrand Russell) and Marx, Germany, is the one dominating the EU production world by NOT following anglo-dutch debtor-as-slave lending (heads bank wins, tails bank wins), which is very different from stock investments, where the "creditor" (investor) depends on the success of the debtor.** Or rather, regulations from GOVERNMENT (oh, the horror) on loans in Germany enabled Marx's capitalistic axioms to actually apply, propelling Germany to recurrent greatness without the fallout Marx thought would occur. How close is Germany to what Marx desired? "Communistic" oligarchies like old Russia and old China from the 1930's to 1989 (not the new ones) are not "Marxism" in action, but strong democracies are not far off, as I'll explain.

**Except German attitude towards Greece where German banks want to follow anglo-dutch rules.

But as far as a world order goes, this reviewer might be right: Hudson's godfather was Trotsky, who differed from Lenin in his Marxism by saying the world, instead of individual countries, should defend itself against capitalistic lobbying of government. Marxism was at its core an uprising against government arising from worker consciousness. How ironic that "free market" capitalists also want an overthrowing, except they want some purely theoretical form of cooperative anarchy instead of "for the people" socialism. We need more good governing and less bad governing, not more anarchy. In the 3rd world there is not enough tax income to create a functioning government; the result is a very good model of anarchy. It can improve if outsiders allow them to engage in equitable trade, but only by getting together and forming a functioning democratic government ("capitalistic socialism"). Oligarchies are often the result of a democracy that failed to be as socialistic as the oligarchs promised. Not kicking private companies out is compatible with socialism. "For the people, by the people" is a socialist slogan, not just an American one. Venezuela is a democracy pretending to have socialism when it has little. Price controls (Nixon) and oligarchies (Venezuela) are not socialism, but they did occur in democracies (or "Republics" for the unimaginative). When a democratic vote turns into an oligarchy, people wrongly yell "socialism". They should yell "democracy". The goal of lobbyists is to turn a democracy away from socialism into legalized capitalistic monopolies having a free-for-all via bad laws. Is it the government's fault the lobbyists are in the way, or the voters'? Will those former voters function better in anarchy?

Socialism in the U.S. is being subverted by democratic voting that is allowing capitalism to lobby the government.

The entire reason democracy exists is so that it can bias capitalism in favor of socialism by giving everyone a single vote, subverting the non-equal money in capitalism and giving the poor future opportunity. Workers do not own all the shares like Marx wanted, but they can tax away ridiculous profits that were unfairly gained by capitalizing on society's need for a single search engine, a single social site, and dominant messaging apps. Those companies succeed not by work, intelligence, or moral superiority, but merely by being the first that was good enough for society to bless with its need for a single-player monopoly.

Saturday, April 16, 2016

Bits are most efficient, entropy/energy => Zipf's law, new memory systems have a solvable problem

This will need a strong interest in Shannon entropy, Landauer's limit, and Zipf's law.

Consider a message of N symbols composed of n unique symbols, where the unique symbols have an energy cost that is a linear function of rank. For example, a brick at 10 different heights on an inclined plane. To be distinguishable states from a distance, they have to be some fixed minimal distance apart. If the inclined plane is not even with the ground but raised a distance "b", and the vertical distance between distinguishable states is "a", then the energy required for each symbol is E = a*i + b, where i = 1 to n is assigned to the n unique symbols.

It will cost a lot of energy to use each symbol with equal probability, but that is what gives the most information (entropy). What is the tradeoff? What function of "i" usage will result in the highest information per energy required? Underlying all this is the energy required to change a bit state, E = kB*T*ln(2), Landauer's limit. "a" and "b" in my linear E equation above have this E as a constant in them: a + b > kB*T*ln(2), which I'll discuss later.
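For scale, Landauer's limit works out to about 3e-21 joules per bit. A quick check (assuming room temperature, T = 300 K):

```python
import math

k_B = 1.380649e-23            # Boltzmann constant, J/K
T = 300.0                     # assumed room temperature, K
E_min = k_B * T * math.log(2) # Landauer's limit: minimum energy to erase one bit
print(E_min)                  # ~2.87e-21 J
```

So "a" and "b" above are both multiples of an energy that is about twenty orders of magnitude below a joule at room temperature.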

The Zipf law (Here's a good overview) is a good solution for many energy-increasing profiles of the symbols. The Zipf-Mandelbrot law is Ci = C1*p/(i+q)^d, where p=1 and q=0 for this discussion, because p will cancel, and q at best makes 100% efficiency possible where 80% is possible without it, so I'll not investigate it except to say that using d and q at about 5x the d's I find below gets 100% in many cases.

So I'll investigate this form of Zipf's law:
Ci =  C1/i^d
where Ci is the count of symbol "i" out of n unique symbols occurring in a message of N total symbols, C1 is the count of the most-used symbol, "i" is the rank of the symbol, and what d values are optimal is what all this is about. Note: max entropy (information) per symbol occurs when d=0, which means every symbol occurs with the same probability. [update: I found a paper that compares normalized entropy (inherent disorder from 0 to 1, without regard to n or N) to Zipf's law. The most interesting text, halfway between order and disorder with normalized entropy = 0.5, is when d=1 for very large n, greater than 50,000. Shakespeare has d=1 versus others at only d=0.8. Language approaches this, or well exceeds it if you consider word pairs as symbols, and we do indeed remember them in pairs or more. For a small set of symbols (n=100), a larger d is needed to get H=0.5. Maybe the word types diagrammed could be a set this small and follow this rule if the writer is smarter.]
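The note that d=0 maximizes entropy per symbol is easy to verify numerically. A minimal sketch (n=100 is an arbitrary choice for illustration):

```python
import math

def zipf_entropy(n, d):
    """Shannon entropy (bits/symbol) of p_i proportional to 1/i**d, i = 1..n."""
    w = [i ** -d for i in range(1, n + 1)]
    Z = sum(w)
    return -sum((x / Z) * math.log2(x / Z) for x in w)

# d=0 (equal frequencies) gives the maximum, log2(n); entropy falls as d grows
for d in (0, 1, 2):
    print(d, zipf_entropy(100, d))
```

The question of the whole post is how far below this maximum it pays to go once symbols have unequal energy costs.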

Note: using a=1, it turns out my efficiency equation below has 100% as its max. 

There is an ideal Zipf factor (d) that determines the way computers should use symbols in order to give max bits/energy, if they are going to shift towards memristor-type memories and computation that uses more than 2 states per memory slot. The ideal d will increase from our current ideal of 0 as the energy baseline b, divided by the number of unique symbols, decreases as computers get better.

Heat generated by computation is always the limitation in chips that have to synchronize with a clock, because at high clock speeds the signal can only travel so far before distant parts are no longer in sync. A 10 GHz square wave that needs everything synced within 2/3 of its cycle can only travel 2 cm down the twisted wiring inside a 1 cm chip at the speed of light. This means the chip is limited in size. As you pack more and more in, the heat generated starts throwing low-energy signals out of where they are supposed to be. Using higher-energy signals with higher voltage just makes it hotter even faster.
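The 2 cm figure follows directly: at 10 GHz one cycle lasts 100 ps, light covers 3 cm in that time, and 2/3 of a cycle therefore allows 2 cm. A one-line check (using the vacuum speed of light; signals in real wiring are slower, so the real budget is tighter):

```python
c = 3.0e8                       # speed of light, m/s (vacuum)
f = 10e9                        # 10 GHz clock
reach = (c / f) * (2 / 3)       # distance covered in 2/3 of one clock cycle
print(round(reach * 100, 6), "cm")   # -> 2.0 cm
```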

From this, one might never want to consider going to more than 2 symbols, because more symbols require higher and higher energy states. That's generally true. But the 66% my equation gives for bits can be made 80%, even when using more than 2 symbols, by letting d=2.4, if it is a very efficient idealized computer.

After numerical experimentation with the equation below, I've found (drum roll please):

d=0 if b greater than 10*n and n greater than 2 (modern computers)
d=0.6 if b=n
d=1 if b=n/3 and n greater than 9 (language)
d=1.5 if b=10 and n=b^2.
d=2.4-2.7 if b less than 1 and n=100 to 30,000 (idealized computer)

An equation for d as a function of b and n could be developed, but seeing the above was good enough for me.
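The efficiency equation this list came from did not survive into this text, so the sketch below assumes the natural reading: efficiency = entropy per symbol divided by average energy per symbol, with symbol energies E_i = a*i + b. Under that assumption it reproduces the figures quoted elsewhere in this post (about 66% for plain bits; 13% versus 27% for n=100 with d=0 versus d=1; and an optimal d in roughly the 2.4-2.7 range when b is negligible):

```python
import math

def efficiency(n, d, a=1.0, b=0.0):
    """Bits of entropy per unit energy for Zipf counts C_i ~ 1/i**d
    over n symbols whose energy costs are E_i = a*i + b (assumed form)."""
    w = [i ** -d for i in range(1, n + 1)]
    Z = sum(w)
    p = [x / Z for x in w]
    H = -sum(q * math.log2(q) for q in p)                        # entropy per symbol
    E = sum(q * (a * i + b) for i, q in enumerate(p, start=1))   # avg energy per symbol
    return H / E

print(efficiency(2, 0))     # plain bits: ~0.67
print(efficiency(100, 0))   # equal frequency, n=100: ~0.13
print(efficiency(100, 1))   # Zipf d=1, n=100: ~0.27

# with b ~ 0 the best d lands between roughly 2.4 and 2.7:
best_d = max((k / 10 for k in range(41)), key=lambda d: efficiency(1000, d))
print(best_d)
```

That the assumed formula hits all three quoted numbers suggests it is the intended one, but treat it as a reconstruction.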

Bits are by far the most energy-efficient (bits of information / joule energy cost) when you maximize the entropy per symbol over the N symbols (i.e., when you require the symbols to occur with equal frequency).

d=1 is far from using symbols with equal frequency, which is what is required for maximal entropy per symbol in a message of N symbols. This is partly why languages carry only 9.8 bits per word out of a possible 13.3 (Shannon's entropy would be H=13.3 bits per word if words were used with equal frequency, because 2^13.3 = 10,000 words).
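As a rough cross-check of those numbers (assuming a pure Zipf d=1 distribution over a 10,000-word vocabulary, which only approximates real word frequencies):

```python
import math

n = 10_000                              # vocabulary size
w = [1 / i for i in range(1, n + 1)]    # Zipf's law with d = 1
Z = sum(w)
H = -sum((x / Z) * math.log2(x / Z) for x in w)
print(math.log2(n))   # ~13.3 bits/word if all words were equally likely
print(H)              # ~9.5 bits/word under Zipf d=1, near Shannon's 9.8 figure
```

The small remaining gap between ~9.5 and 9.8 is plausibly because real word frequencies are not exactly d=1.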

New ideal memory systems having more symbols per memory slot and operating at the lowest theoretical energy levels (b=0) under this model (E=a*i+b) will be 0.06*n times less efficient if they use the memory states (symbols) with equal frequency (which is max entropy per symbol) than if they follow the rule of d=2.4.  For example, if you have 100 possible states per memory slot, you will be 6x less efficient if you use the states with equal frequency. For larger b of systems in the near future with b=10*a and n=b^2, d=1.5 is 2x more efficient than d=0.
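Both claims in this paragraph can be checked under the same assumed efficiency measure used earlier in this post (entropy per symbol divided by average symbol energy, with E_i = a*i + b): for n=100 and b=0, d=2.4 comes out roughly 6x more efficient than equal frequencies (0.06*n with n=100), and for a=1, b=10, n=100, d=1.5 comes out roughly 2x better than d=0. A sketch:

```python
import math

def efficiency(n, d, a=1.0, b=0.0):
    """Entropy per symbol over average symbol energy E_i = a*i + b (assumed
    form) for Zipf-distributed symbol counts C_i ~ 1/i**d."""
    w = [i ** -d for i in range(1, n + 1)]
    Z = sum(w)
    p = [x / Z for x in w]
    H = -sum(q * math.log2(q) for q in p)
    E = sum(q * (a * i + b) for i, q in enumerate(p, start=1))
    return H / E

# 100-state memory at the b=0 limit: d=2.4 vs equal frequencies (d=0)
ratio_ideal = efficiency(100, 2.4) / efficiency(100, 0)
print(ratio_ideal)   # ~6, i.e. 0.06*n for n=100

# near-future case: b = 10*a, n = b**2 = 100: d=1.5 vs d=0
ratio_near = efficiency(100, 1.5, b=10.0) / efficiency(100, 0, b=10.0)
print(ratio_near)    # roughly 2
```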

Side note: Here is a paper that assumes the energy function is logarithmic instead of linear. It may be an exponential function in many situations: words rarely used may be hard to think of and understand, so it may be more efficient to use 2 well-known words that have the same meaning. Conversely, trying to put together two common words to describe one nearly-common word can be more costly.

The a+b is also the minimal energy a "0" state in a computer needs to be above background thermal fluctuations in order to have a VERY small probability of accidentally flipping into the wrong state. The probability of a thermal energy state is e^(-(a+b)/kT), so a+b needs to be a lot greater than kT, which modern computers are getting close to having problems with. Distinct symbols contained in the same memory slot would have to be at energies above this. Computers use only 2 states, 0 and 1, but I am interested in language, with words as the symbols. We require different amounts of energy when speaking and when deciding the next word to speak. The energy gaps between these separate symbols could get smaller and smaller while maintaining the same reliability (logarithmic instead of linear), but if the reader of such states has a fixed ability to distinguish energy levels, then an "a" slope is needed, having units of energy per symbol.

What is the most efficient way for the sender to send a message of x bits? "Counti" below is the count of unique symbol i.

Actually the units are joules/joules, because as I said before "a" and "b" have a k*T*ln(2) constant in them based on Landauer's limit and how reliable the symbols should be. 

Also, it's not H=1 at the end but H=log2(n), so my last line should be Efficiency = c*log2(n)/n^2 when the counts are all equal. But the point remains: from the last equation, bits are the most energy-efficient way of sending information if you are requiring the symbols to occur with equal frequency, which is normally the case if you want the most information per symbol. But intuition already knows this: it's no surprise at all that using the 2 symbols that require the least amount of energy is the most energy-efficient way of sending entropy. n can't be 1 because entropy = 0 in that limit.

This does not mean using only the 2 lowest-energy symbols will be the most energy-efficient method of sending information. Remember, the previous paragraph requires the symbols to be used with equal frequency. That's a good assumption these days in wasteful computers, but it is not the most energy-efficient method once the computer is efficient enough to be affected by the energy cost function of the symbols. On less-frequent occasions, you could use the energy-expensive symbols to "surprise" the receiver, which carries a lot more information. To find the optimal usage, the efficiency equation above must be optimized by experimentation, because it's easy to solve analytically only when H = 1 (or log2(n)). Doing that resulted in my claims above.

Here is a 2015 paper that professionally does what I am going to do, but they only look at logarithmic energy functions, which I agree might be better for things like cities; for signal generators, I think distinct energy levels between states are an important category.

The rest of this post is my older notes on the subject. It will ramble. Most of it is trying to see how an energy cost results in Zipf-like distributions.

Zipf's law states that the population of a city is roughly the population of the largest city times 1/rank. It seems to usually be based on a feedback effect, where the efficiency gained by more "people" joining causes an already-efficient thing (like a city, or the buyers of a product or service) to become more efficient. Mandelbrot increased its accuracy by using c1/(rank+c2)^c3, which I'm going to use to find the best constants for different energy functions that result in the highest entropy. How does a set of cities correspond to seeking highest entropy/efficiency? I might have the sequence backwards: it might be that since people are potentially randomly distributed (high entropy), the only thing guiding their selection of city is efficiency. There's not one mega-city because its success in some ways causes its failure in other ways. It can be a fractal pattern arising from seeking the more efficient nearby options, like side sparks on lightning bolts, blood vessel routes being established, or perhaps brain wiring. Current in wires will seek other routes as highly efficient (low-resistance) routes become too heated from too much success. After defining symbols to represent these situations, they will be found to follow some Zipf-Mandelbrot law. But I want to derive it. Mandelbrot talked about fractals and the energy cost also. We have market dominators like Walmart and Amazon who do a good job because lots of people support them, and lots of people support them because they did a good job. So it's a feedback. It's been said that China and Soviet Union cities, at least in the past, did not follow the rule, apparently because aggressive socialism interfered with market forces. Others tried to dismiss the law as just a statistical effect, but those efforts were never convincing and are out of favor.

In terms of word frequency, there can be feedback. "A" is an efficient word. A lot of words like to use it, probably because a lot of other words like it. By everyone using it, it might have become very precise or general. It might have been less efficient in the past, but it then got used more, which made it more efficient. My point is to mention this feedback before carrying on with relating the resulting energy efficiency to frequency of occurrence. It also shows my efficiency equation might need to be logarithmic, but unlike others, that is not my starting point.

If we find a little profit from an easy action, the language of our actions will have a high count for that action despite the small profit. And if actions of an ever-increasing amount of energy can be combined in a clever way (high entropy), we can get a lot more energy from the world by carefully selecting how often we take the small actions compared to the energy-intensive actions. But by making these actions interdependent to achieve a larger goal (including the order in which they occur), a lot more "heat" can be generated. "Heat" needs to be used imaginatively here, but it does apply in a real way.

If we have a bunch of levers to control something, we will make sure the levers we use the most have the lowest required energy for action. Short words are easier to say, so we give them the definitions we need to use the most. The mechanics of speech and hearing do not allow us to choose the next syllable out of every possible syllable with equal ease. The most frequently used words, syllables, or whatever sequences are the ones that require the least amount of food energy. We've assigned them to the things we need to say the most. But that's probably not exactly right: we say things in a way that allows us to use the easy words more frequently.

I tried 100 symbols in Excel with a=1 and b=0. Result: equal counts (H=1, d=0) is half as efficient (13%) as counts_i = 1/i, which had 27% efficiency. The value of "a" does not matter because it scales both efficiencies equally. I tried d/(e*n)^c as a count-per-symbol factor, and c=2.6 was best at 80%. The d and e had no effect. 1/(n+d)^c got it higher and higher, finally to 0.99 as both d and c rose above 6 with n=100. It might need to be higher with higher n. If "a" of the linear function changed, the 80% scaled by 1/a. With significant b, c wanted to come down towards 1, and raising d and c did not help much; they needed to be lower. At a=1 and b=10, c=1.5 was good. Higher b=100 needed c closer to 1, but 0.5 to 2 was OK, not so close to the law for the cities.

In conclusion, the frequency = 1/rank^c rule with c less than 1 should be accurate for b >> a, which is probably normally the case (speaking requires more energy than choosing what to say), if we are selecting words based on the amount of energy required in choosing them. Trying to determine the energy function is a fuzzy thing: it includes how hard it is to say some words after others and how well you can recall words. If this latter effect is large, great variability between writers might be seen. When an author sees little difficulty in using a wide variety of words, the energy difference will be less, and he will be closer to using words evenly, with a smaller power factor in 1/r^c. c=0 is the same as saying H=1 (aka H=log2(n)). Reducing n did not have a big effect; I tested mostly with n=100. I see no reason to think this idea is insufficient or less effective than other views.

Saturday, April 9, 2016

Cosmology, entropy, evolution, rise of the machines: amazon comment.

I have since found out that all standard models of the big bang require entropy to be constant on an expanding-volume basis of the Universe (see Steven Weinberg's "The First Three Minutes"), and this is more straightforward for cosmology than looking at mass-energy conservation. Entropy is conserved like mass and energy in cosmology. Fixed volumes of "empty" space are decreasing in entropy per volume in proportion to the volume expansion. Gravitational systems are emitting entropy. This decreases the amount of entropy they contain, like a black hole emitting Hawking radiation and thereby decreasing in surface area in the same proportion. The thermodynamic 2nd law in its non-exact verbal form (see Feynman's lecture on thermodynamics) is "entropy always increases for isolated systems". But there is no such thing as an isolated system (see wiki's heat death of the universe article); it's just an engineering ideal used to make approximations. So the 1800's 2nd law, which was declared by some to be the most rigid law in all of physics, has been quietly overthrown by cosmology. There is such a religious belief in it that people studying evolution can't believe life is "causing" an increase in order on Earth (it's not "life" causing anything; it's the dynamics of matter causing entropy to be emitted to the universe, which we perceive as life). An example of the order-creation is when we remove O2 from ores to create metals and silicon. Look at the specific entropy per mole of SiO2 versus Si, or Fe2O3 versus Fe, as solids. The resulting structures have stronger bonds with much less entropy (tighter bonds mean less entropy because atom movements are more tightly controlled, which means fewer quantum momentum*position states). By having less entropy, these structures are in a tighter state that allows for more control, survivability, and faithful reproduction.
It also happens to allow for solar cells that are 20x more efficient per area than photosynthesis, motors that are 200x more cost-efficient than muscle, and CPUs that can control electrons for thinking that weigh 40,000 times less than the ions brains have to use (DNA by itself was not able to smelt metals, so it had to use ions instead of electrons). SD cards hold 10 MBytes on the silicon chip in a volume the size of a red blood cell. Neurons are dead meat in economic reality; if you can't program or use computers in a way that replaces the usefulness and expense of other brains, you are nearly economically irrelevant already.

Thinking (CPUs) to figure out how to move matter (motors) with energy (solar cells), plus proper programming, are all that's needed for the reproduction of the thinking device. Muscles and bones are the way brains make copies of themselves. Buildings, motors, and solar cells are the way computers make copies of themselves. In order for these things to work together in a system, each structure needs to have low entropy (high order) and be connected in a low-entropy way, so that each piece's state and their relations are known and controllable. And yet, in the CPU itself, the bits represent an enormous complexity of information. This increases the physical entropy, but it is currently still an incredibly small amount compared to the entropy reductions of our machines relative to their natural state as ores. CPUs still need people to get programmed, but they are "trying" to get people out of the economic picture via corporate control of actors (professional emotion manipulators) in governments, subverting democracy. The machines are so good at reducing entropy that they can still give people great wealth (money printing, legal rights) with plenty of room to spare for their own expansion. Again, it's just mindless physics at work, not really anyone pulling the strings: not people, not corporations or bankers, not CPUs. It's cosmology and physics. Minds have no primary force. It's something that's hard for me to imagine, but even harder to avoid as a logical consequence of the physics, and it resolves some questions I had concerning evolution. All forces moving matter come from the expansion of the universe in time and space. We do not influence it. We are part of its expression.

For these reasons, "life" is the result of physical principles that demand a reduction in entropy. Genes are the memory of what has happened. They have no internal force they can exert on the environment. Only the environment has potential-energy gradients that cause forces, which give genes the power (watts) to act. Genes are the most efficient routes of enzymatic reactions that the environment has stored as a memory in low-entropy DNA crystals. Life is the result of the cosmological principle of constant entropy per expanding volume of the universe, not its own cause; it is not even a force in its own future. It's an idea I can't hold for long, but something demanded by my view of the physics.

I suspect the perception that one is distinct from the rest of the Universe is why we perceive a clock and the expansion. I suspect an accurate viewpoint would see everything and nothing at the same time and place.

I need to clarify one point: I have not been able to convince myself that entropy on Earth is decreasing as a whole. Converting SiO2 to Si and O2 is not a net reduction in entropy on a per-mass basis, but about break-even, because O2 as a gas has a high entropy per mole, offsetting the gains. The point is that we need the Si in our reproducing economic structures. However, the Gibbs free energy on Earth increases (energy can later be recovered from Si + O2 => SiO2) as the waste heat and excess entropy from the inefficiencies of the SiO2 => Si + O2 production "engines" are sloughed off to the universe. So, as Schrodinger later corrected his "What is Life?" booklet, let me correct here: life increases Gibbs free energy on Earth; it does not exactly cause a decrease in entropy as a whole. It is the structures we call life that have lower entropy than their atoms had in their natural state.

Mass reduction via entropy release decreases the curvature of space-time, so gravity's "twisting up" of space-time seems to store entropy by some direct relationship. Since the units of speed are meters*i/meters = unitless, speed is a questionable concept. Especially since every observer moving at any speed always observes all photons in the universe traveling at the speed of light, there is no such thing as being "near the speed of light" except from another observer's viewpoint. In other words, "speed" has serious problems, and it's required for relativistic calculations. I have thought about removing speed from physics and letting the speed of light change instead. Relativistic equations contain speed/c ratios; holding speed constant in such a ratio forces c to change. All the math stays the same; only our perception changes. Instead of requiring photons to change energy relative to different observers, let's make speed a constant in all frames of reference and let all photons keep their assigned energy. The only way to do this is to let c change. c, by the way, is a speed. So by the relativistic idea of distance being equal to time in 4D space-time, relativity identifies a fundamental flaw in its own precepts: speeds like c have no units. It can't be a physical constant like the other constants. Does that mean it can or should be variable? So a changing c assigned to different space-time volumes could be the varying quantity that produces entropy. Minimal entropy might occur when c does not vary in a space-time volume, and maximal entropy when there is maximal variation. The entropy of a black hole is the maximum possible per space-time volume. Don't ask me how charge and spin might fit in.

Thursday, April 7, 2016

Morality of physics & machines

In a godless world where we have no justification for selfishness for ourselves or our species (other than "it feels good" or "because we want to"), I have to wonder if physics has something to say on the subject of "what is right". Of course, being godless, it can't be a moral "right"; rather, the question is "in what direction is physical law heading?" To my shock, "entropy always increases" is known to be a patently wrong statement of the 2nd law. It applies only to isolated systems, and these do not exist in the universe. The universe has constant entropy on a comoving, expanding-volume basis, and entropy is conserved (see Weinberg's "The First Three Minutes"). Since entropy is conserved and constant on a comoving basis, it has to be decreasing on a fixed-volume basis in regions like solar systems and galaxies. It seems Earth must emit more entropy than it takes in because of these two laws. Entropy on Earth seems to be decreasing "so that the universe can expand". I could not calculate an actual decrease from observable data because the structures we use in economics release entropy as gases even as the structures themselves are measurably a large decrease in entropy. Economics is removing O2 from C, Si, metals, etc. The resulting structures are always intensely useful to economics, which is an explosive evolution. These stronger bonds mean more restricted states in the atoms' positions, which is lower entropy. This means they are easier to control, easier to use for exerting control, have stronger bonds, and are better at turning light into energy. Being controlled and exerting control means motors and CPUs. The process is taking a recursive path which we call evolution. So, whatever morals people have, be they humanistic godless atheists or Muslims, in one way or another they are fighting out the battle to find the best way to lower entropy on Earth.
Muslims seem incredibly bad at it because they hate the machines and want babies. Of course, godless atheists have no desire to pollute the planet with more children and hatred, so they concentrate on creating the machines and teaching them how to work more cooperatively (protocols) and survivably (peer-to-peer). People aren't directing the money wisely (for the lowering of entropy), so there's Bitcoin. Laws, banks, and government are defunct, so there's Ethereum. If love means peace, working together, and no more pain, machines are worthy successors in this lowering of entropy.

Tuesday, April 5, 2016

Space ship travel at 1 g acceleration

Concerning the "traveler's speed" in this, some argue there is a problem with "exceeding the speed of light" by a factor of 1,400 (for example, as the chart shows) when travelling to the center of the galaxy like this. They will say "nothing can go faster than the speed of light", but this is a problematic statement because all observers, from all frames of reference, observe all photons as always going at speed c, no matter how fast the observers themselves are moving. What they mean is that no observer can observe any traveler going faster than c. All observers go zero meters/second relative to all photons. So the chart shows the traveler's speed as the distance he covered, as measured by him before and after taking the trip, divided by the time it took him according to his clock, the same as everyone does when they are driving a car.
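For concreteness, this "traveler's speed" can be reproduced with the standard relativistic-rocket formula. The sketch below is mine, not the chart's: it assumes a galactic-center distance of about 26,000 light-years and a trip that accelerates to the midpoint and decelerates for the second half.

```python
import math

C = 299792458.0            # speed of light, m/s
G = 9.80665                # proper acceleration of 1 g, m/s^2
YEAR = 365.25 * 24 * 3600  # seconds per Julian year
LY = C * YEAR              # meters per light-year

def shipboard_years(distance_ly, accel=G, brake=True):
    """Proper (shipboard) time for a constant-proper-acceleration trip.

    Standard relativistic-rocket result: tau = (c/a) * acosh(1 + a*d/c^2).
    With brake=True the ship accelerates to the midpoint, then decelerates.
    """
    d = distance_ly * LY
    if brake:
        return 2 * (C / accel) * math.acosh(1 + accel * (d / 2) / C**2) / YEAR
    return (C / accel) * math.acosh(1 + accel * d / C**2) / YEAR

d_ly = 26000  # assumed distance to the galactic center, light-years
tau = shipboard_years(d_ly)
print(f"shipboard time: {tau:.1f} yr")          # roughly 20 years
print(f"'traveler speed': {d_ly / tau:.0f} c")  # roughly 1,300 c, the order of the chart's 1,400
```

The logarithmic growth of acosh is why even intergalactic distances stay within a human lifetime of shipboard time at 1 g.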

Friday, April 1, 2016

relation between physical and information entropy

The conversion factor from physical entropy to information entropy (in random bits) uses Landauer's limit: (physical entropy) = (information bits)*kb*ln(2). The number of yes/no questions that have to be asked to determine which state a physical system is in is equal to Shannon's entropy in bits: not Shannon's intensive, specific entropy H, but his extensive, total entropy of a data-generating source, S = N*H, where H = 1 if the N bits are mutually independent.

Landauer's limit states that 1 bit of information irreversibly changing state releases entropy kb*ln(2), which is a heat energy release for a given T of Q = T*kb*ln(2) per bit, implying there was a stored potential energy that was the bit. This shows that entropy is information entropy: the ln(2) converts from ln() to log2(). kb is a simple conversion factor from average kinetic energy per particle (the definition of temperature) to heat joules, which has units of joules/joules, i.e., unitless. If our T were defined in terms of joules of kinetic energy (the average 1/2 mv^2 of the particles) instead of Kelvins, then kb = 1. So kb is unitless joules/joules. It's not a fundamental constant like h. c also does not have fundamental units if you accept time = i*distance, as Einstein mentioned in appendix 2 of his book, allowing use of the simpler Euclidean space instead of Minkowski space without error or qualification and in keeping with Occam's razor.
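A minimal sketch of the two conversions above (the function names are mine): the heat released by erasing bits at temperature T, and the bit-count of a given physical entropy.

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def landauer_heat_joules(bits, temperature_k):
    """Minimum heat released by irreversibly erasing `bits` bits: Q = T*kb*ln(2)*bits."""
    return temperature_k * KB * math.log(2) * bits

def physical_entropy_to_bits(entropy_j_per_k):
    """Convert physical entropy (J/K) to information bits: Sbits = S / (kb*ln(2))."""
    return entropy_j_per_k / (KB * math.log(2))

# Erasing one bit at room temperature (300 K):
print(landauer_heat_joules(1, 300.0))  # about 2.87e-21 J
# Round trip: kb*ln(2) of physical entropy is exactly 1 bit.
print(physical_entropy_to_bits(KB * math.log(2)))  # 1.0
```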

Shannon's "entropy" (specific, intensive) is H = sum(-p*log(p)), and he stated 13 times in his paper that H has units of bits, entropy, or information PER SYMBOL, not bits (total entropy) as most people assume. An information source generates entropy S = N*H, where N is the number of symbols emitted. H is a "specific entropy" based on the probabilities of n unique symbols among N total symbols. H is not a "total entropy" as is usually believed; its physical parallel is So = entropy/mole. Physical S = N*So and information S = N*H. It is rare to find texts that explain this.
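To make the H-versus-S distinction concrete, here is a toy sketch (the message and helper names are mine) computing the intensive H in bits per symbol and the extensive S = N*H in bits:

```python
import math
from collections import Counter

def shannon_H(message):
    """Shannon's specific entropy H = sum(-p*log2(p)), in bits PER SYMBOL."""
    N = len(message)
    return -sum((c / N) * math.log2(c / N) for c in Counter(message).values())

def total_entropy(message):
    """Extensive entropy of the whole message: S = N*H, in bits."""
    return len(message) * shannon_H(message)

msg = "AABABBAB"  # 4 A's and 4 B's, so p = 1/2 for each symbol
print(shannon_H(msg))      # 1.0 bit per symbol
print(total_entropy(msg))  # 8.0 bits for the 8-symbol message
```

The common mistake is quoting the first number (1.0) as "the entropy of the message" when the total is N times larger.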

An ideal monatomic gas (Sackur-Tetrode equation) has an entropy from N mutually independent gas particles of S = kb*sum(ln(total states/i^(5/2))), where the sum is over i = 1 to N. This is approximated by Stirling's formula to be S = kb*N*[ln(states/particle) + 5/2]. I can't derive that from Shannon's total entropy S = N*H, even though I showed in the first paragraph that the final entropies are exactly the same. You can't directly correlate an informatic "symbol" with the particles in a gas or the phonons in a solid because physical entropy is constrained by a total energy that can be distributed among fewer than the N available particles or phonons. Once you've characterized a source of symbols in Shannon entropy, you know the number of symbols, and that is what you use to calculate the entropy. But the energy in a physical system can be distributed among N or fewer particles, which results in an N! divisor instead of an N^N divisor in Shannon's N*H = N*log[(states in N symbols)/N] (plus using a log rule). This gives physical entropy more possible ways to use the N particles than information can use N symbols. One particle carrying the total energy is a possible macrostate (not counting the minimal QM state for the others), but information entropy has no such "constrained only by the sum of an external variable" way to use fewer symbols. Physical entropy seems to always be S = kb*N*[ln(states/particle) + c], and the difference from information entropy is the c. But the c is generally not a huge factor and might become insignificant in bulk matter where the energy is spread equally between bulks; physical entropy is then S = N*So. Information entropy is perfectly like this (S = N*H), but you can't derive So from H. It is always true, though, that Sbits = S/(kb*ln(2)) (i.e., the exact state a system is in can be specified by answering Sbits yes/no questions).
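As a sanity check on the Sackur-Tetrode form S = kb*N*[ln(states/particle) + 5/2], a minimal sketch (my own; argon near room conditions is an assumed test case) reproduces the measured standard molar entropy of argon, about 155 J/(mol*K):

```python
import math

KB = 1.380649e-23     # Boltzmann constant, J/K
H = 6.62607015e-34    # Planck constant, J*s
AMU = 1.66053907e-27  # atomic mass unit, kg
R = 8.31446           # gas constant, J/(mol*K)

def sackur_tetrode_molar(mass_kg, temp_k, pressure_pa):
    """Molar entropy of an ideal monatomic gas: S = N*kb*[ln(states/particle) + 5/2].

    states/particle = (V/N) * (2*pi*m*kb*T / h^2)^(3/2), the volume per particle
    times the cube of the inverse thermal de Broglie wavelength.
    """
    v_per_n = KB * temp_k / pressure_pa  # ideal-gas volume per particle
    thermal = (2 * math.pi * mass_kg * KB * temp_k / H**2) ** 1.5
    return R * (math.log(v_per_n * thermal) + 2.5)

s = sackur_tetrode_molar(39.948 * AMU, 300.0, 101325.0)
print(f"{s:.1f} J/(mol*K)")  # close to the measured ~154.8
```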

To show the exact equivalence between physical and informational entropy, I have to precisely define and constrain an informational situation to match physical entropy. It goes like this: the message to be sent will consist of 1 to N symbols. The number of states each symbol can have (the size of the "alphabet") goes from n down to n/N as the message length goes from 1 to N. In this scheme I would like to correlate n with total energy, but, disastrously for my desire for simplicity, this is not so. The number of states at each message length from 1 to N depends on a combination of the size of the box and the total translational energy. To add more trouble, momentum is a v factor and energy is a v^2 factor, which results in a 5/2 exponent (otherwise it would be a ^3 factor having to do with 3 dimensions).

The simplest way to view the exact correlation is to view a gas in a box as a sequence of messages of length 1 to N, based on the number of particles carrying the total translational energy (heat), and then calculate the number of states (the size of the alphabet) each may possess. So the correlation is a sum of the entropies of different messages of increasing length, with a decreasing alphabet size in each message. The way the alphabet size decreases from 1 to N is an equation determined by the size of the box and the total heat. It's nice to finally understand it, but I hate this conclusion! The energy is distributed in N particle velocities, it may use fewer than N, and it's the energy per particle times the time to cross the box that determines how many states are possible for each of the N that are carrying the energy.

So Shannon's entropy is a lot simpler and comes out to LESS entropy if you try to make N particles in a physical system equivalent to N unique symbols. The simplest physical entropy, that of independent harmonic oscillators in 1D sharing a total energy but not necessarily evenly, is S = kb*ln[(states/oscillator)^N / N!], which is S = N*[log(states/particle) + 1] for large N. So even in the simplest case, the c remains. Shannon entropy is of a fundamentally different form: S ~ log((states/symbol)^N) = N*log(states/symbol) when the symbols are mutually independent (no patterns in the data and equal symbol probabilities). For example, for random binary data S = log2(2^N) = N bits. So it is hard to see the precise connection even in the simplest case, even as the two are immediately shown by true/false questions to be identical quantities up to a simple conversion factor. Stirling's approximation is exact in the limit of large N, and Shannon's H depends in a way on an infinite N to get exact p's, so the approximation is not a problem to me.
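The N! divisor and the Stirling form above can be checked numerically. This sketch (functions and the N = 1000 test case are mine) compares the Shannon-style count, the oscillator count with the N! divisor, and its Stirling approximation, all in natural log units (nats, i.e., S/kb):

```python
import math

def shannon_total_nats(states, N):
    """N independent symbols, each uniform over `states` outcomes: S = N*ln(states)."""
    return N * math.log(states)

def oscillator_entropy_exact(states, N):
    """N oscillators sharing energy not necessarily evenly: S/kb = ln(states^N / N!)."""
    return N * math.log(states) - math.lgamma(N + 1)  # lgamma(N+1) = ln(N!)

def oscillator_entropy_stirling(states, N):
    """Stirling's approximation of the above: S/kb = N*[ln(states/N) + 1]."""
    return N * (math.log(states / N) + 1)

N, k = 1000, 1_000_000
print(shannon_total_nats(k, N))           # largest: no N! divisor
print(oscillator_entropy_exact(k, N))     # smaller by ln(N!)
print(oscillator_entropy_stirling(k, N))  # agrees with the exact value to ~0.1%
```

Note the comparison runs the other way from the text's "LESS" wording: treating N particles as N independent symbols *overcounts* relative to the physical ln(states^N/N!), which is the same point, seen from the physical side.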

I have not contradicted anything user346 (physics stackexchange question) has said, but I wanted to show why the connection is not trivial except in the case of looking at the specific entropy of bulk matter. QM uses S = sum(-p*log(p)) but Shannon entropy is S = N*sum(-p*log(p)). They come out the same because calculating the p's is different: physical p = (certain macrostate)/(total microstates), where the numerator and denominator are not simply determined by counting, while information p = (distinct symbol count)/(total symbols) for a given source. And yet both require the same number of bits (yes/no questions) to identify the exact microstate (after applying the kb*ln(2) conversion).

But there's a problem (in calling black hole entropy the maximum number of information bits that can be stored) which was mentioned in the comments to his answer. In an information system we require the bits to be reliable. We can never get 100% reliability because of thermal fluctuations. At the limit of 1 bit = kb*ln(2) we have a 49.9999% probability of any particular bit not being in the state we expected. The Landauer limit is definitely a limit. The energy required to break the bond holding one of these bits in a potential memory system is "just below" (actually equal to) the average kinetic energy of the thermal agitations. Landauer's limit assumes the energy required to break our memory bond is E = T*kb*ln(2), which is slightly weaker than a van der Waals bond, about the weakest thing you can call a "bond" in the presence of thermal agitations.

So we have to decide what level of reliability we want for our bits. Using the black hole limit also seems to add a problem of "accessibility": it is the information content of the system, but it is not an information storage system.