Friday, August 18, 2017

Bitcoin Cash & SegWit2x: BTC chain splits move toward constant value, increasing the advantage of the 1% hodlers

The Merkle says it's insatiable demand from South Korea. People are going to want the SegWit2x fork even more due to the success of BCH, and that will drive BTC itself even higher. It seems like BTC is headed towards the ideal constant-value currency simply by forking. But totally unexpectedly, old holders will profit even more due to the forks. BTC hodlers are becoming exactly what they were fighting against: banks and governments (especially the U.S.) profiting from inflation of a coin. The forks make it feel easier for new entrants to come on board, and the "1%" that joined early continue to profit massively. But when others profit so massively, it comes out of the 99% one way or the other. BTC is not zero-sum, but the eventual total off-chain real-world assets it will control are limited by how fast world GDP can grow. The "1%" BTC hodlers will gain a disproportionate share of those assets, just as the U.S. enjoys "slave labor" in Asia simply from having the world's default currency thanks to a "might is right" military (which itself was largely a geographic advantage in WW2). We are able to print new dollars just as fast as world GDP can expand without causing inflation. We print 2% or 3% faster than that, causing persistent mild inflation, which prevents the 1% from keeping their wealth forever unless they reinvest it in good ideas that earn more than the 2% or 3%. Likewise, BTC forks have a might-is-right advantage in both devs and mining by being well planned and first.

What I mean by "BTC" inflation is that it is chain-splitting. BCH and the impending SegWit2x mean there will be 3x more "BTC". I can't see how that's any different from alts, except that there is a huge marketing advantage with new entrants (aka the 'unwary') in calling them "bitcoin" (flavors?), and existing BTC hodlers have no need to complain or fight against them like they do with alts. This last point is my perception, one a lot of other people share, even if it's not correct. The effect is the same: the perception of new entrants and existing hodlers matters more than most facts when it comes to a coin's value. And our ignorance (if new entrants and hodlers are "wrong" about the usefulness of a new coin) that results in higher value can in turn spur development back towards a more useful coin. The 3 different forks could evolve back into equivalent code simply because people "wrongly" believed in the initial bad ideas. The macroscopic invisible hand driving all this is the desire for constant value. The visible part of the hand is simply new entrants not wanting to pay high prices and hodlers liking the extra coins from nothing.

Friday, August 11, 2017

Strong Drink mix for Parkinson's

I have been putting together a really strong drink. I guess it's about $15 a day, with most of the cost being in the powder extracts, $2 to $3 per day each, straight from China in bulk.

12 oz pomegranate juice from Hispanic store (not the expensive POM)

added sweet concentrates:

=================

black cherry concentrate 12 g

black molasses 12 g (sugar cane juice after most of the white sugar is removed)

Jallab 12 g (Arabic grape skin extract plus others)

powder 10:1 extracts:

==================

blueberry extract 12 g (my eyesight sharpened enough to not need my barely-needed glasses in 4 days)

strawberry extract 12 g

apple peel extract 12 g

tangerine peel extract 12 g

Citrus flavonoids with animal studies in PD, bought from China in bulk.

These doses are 1/4 the human-equivalent doses because the studies are "shock" studies on the animals; by that I mean they are very short-term tests of how the chemicals work in response to PD-like toxin challenges.

===================================

nobiletin 500 mg

naringin 300 mg

tangeretin 100 mg

Other stuff in pills (not in the drink) with strong animal and epidemiological evidence for PD, plus good absorption and ability to cross the blood-brain barrier:

====================

black tea extract

green tea extract

grape seed extract

fisetin

(inosine to be added)

The American producer of patented fisetin is not clear about whether it is pure fisetin, and the brand hides details about what it is, so I'll spend 1/3 as much to get pure fisetin from China and then sell the excess on eBay. Inosine in bulk is also 1/3 the cost from China.

Canola mayonnaise, the bomb!

Broccoli, Sardines, home-made very yeasty beer, olive oil

1 hour of exercise, then drink it to absorb the sugar.

Tuesday, August 8, 2017

Potential value of bitcoin, empires, and taxes

M3 is roughly "all cash". For US dollars, it's about $30 trillion. The rest of the world I'll estimate at $25 trillion, because the Euro is about $12T in USD. I expect bitcoin and alts to roughly follow this ratio, so BTC would be compared to dollars. So the max would be 30T/21M ≈ $1.4M per BTC. As this happens, dollars all over the world will come flooding home, making them worthless, so it's more important for Americans to switch early than for people in other countries, to maintain their current lifestyle. The ability to print dollars and the growing world economy accepting them has been the biggest boon any country has ever seen. We were basically allowed to print them as fast as the world's economy grew. We spent half the surplus on a military which pushed and supported the use of the dollar, enabling stability and exchangeability in the same way Microsoft "helped" software. Bitcoin is the Linux of money. Wealth will be more evenly distributed as the dollar's monopoly in currency ceases.

Not just the U.S. but all governments will lose the power to control if they lose control of the currency their citizens demand. Countries enforce a currency by demanding that taxes and legal disputes be settled in their dollars. Empires use currency to enslave other countries. So countries can start their own cryptocoin, enabling them to enforce law more directly and automatically extract taxes by being privy to every transaction. This would relegate BTC to replacing only gold for private holders, which is $7T, or about $333k/BTC, but potentially a lot more since a lot of countries want to be more fair in international exchange instead of being stuck with the dollar. Gold is harder to move, so there's a great desire to switch to BTC. Also, buying stuff directly from other countries instead of Amazon will need BTC, which is a "black market" as far as the U.S. government will be concerned. It takes away their power as the dollars come home. They can make it illegal to import things from other countries without dollars. The U.S. desperately needs dollars to stay out of the country. When foreigners like the Chinese government start giving us wads of dollars to get BTC, that is NOT the time to switch back to dollars. That is the end of the U.S. as the world power. That is when great powers fall: they spend too much on a military to support a coin or gold, lose their skills at production by enslaving people in distant lands (via the coin supported by the military), then find themselves powerless as their coin collapses. In the case of Spain getting gold, they just spent it all, then the armada fell and Britain's superior skill at shipbuilding took over. There is also the possibility that BTC will be a basis for establishing ownership of assets and enforcing smart contracts, again putting it well over the $1M/BTC range. I do not expect it to go over $500,000 in 20 years. If it reaches $100,000 it will be a primary way of buying $1M beach houses as old money finds itself increasingly poor and BTC millionaires start looking for something to do with their gains.
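
As a sanity check of those ceilings, here are the post's own round numbers divided by the 21M-coin cap (a sketch; the dollar totals are the estimates above, not authoritative figures):

=====
CAP = 21e6   # maximum number of bitcoins

for label, total_usd in [("US M3 (~$30T)", 30e12),
                         ("private gold (~$7T)", 7e12)]:
    print(f"{label}: ${total_usd / CAP:,.0f} per BTC")
# -> US M3 (~$30T): $1,428,571 per BTC
# -> private gold (~$7T): $333,333 per BTC
=====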

50x = $150k/BTC does not need BTC to be lucky. It only needs to be the best idea. When BTC reaches $150k it will be because it is starting to be used as an international standard for trade. It will be because dollars are coming home, which will make them lose their value. The U.S. government will then have to decide to cancel all social security and most government expenses (pollution, law, roads, retirements) and foreign debts, or print money (hyperinflation). Even that will not stop the inflation, because there will be 3x more dollars inside the U.S. from foreigners not wanting them. If it takes 15 years, that will be 7% inflation on top of our current 3%. 10% inflation is far from hyperinflation, but still a disaster. Actually the disaster was letting there be a "balance of payments" deficit the past 50 years, which means more money going out (via free trade, military, and debt) than coming in. This results in erosion of the country's ability to support itself. Free trade is a disaster if it makes the balance of payments worse. The U.S. (like China) got out from under enslavement to a foreign currency by enacting trade tariffs. China's devaluation of its currency is in effect a trade tariff on the external world's imports, which forces its people to work harder and develop more skill. I believe the U.S. is sophisticated enough not to have hyperinflation. When it reaches $150k, it is NOT the time to sell, but a time to keep holding, unless you see a better option.

But I think a capped-quantity coin is not a good solution, and not the solution the rest of the world will want, due to late-comers being at a disadvantage. Unless a new coin lets smartphones determine their own time via the stars, randomness, or 3rd-party consensus trust, and combines that with a local trust network to decentralize the coin (protecting it from big miners), BTC may be the best option. This is because all alts subject to 51% attack can be destroyed via simple forward-stamped timestamps, and if BTC miners are hodlers, they will soon find it more profitable to destroy alts than to mine, forcing more money into a few coins. They may even use their BTC value gains to buy more equipment to retain power by destroying alts instead of mining BTC. The miners may turn into BTC's military. This is what happens to all empires: they win by might-is-right until all the slave countries figure out a way to get out from under the coin that controls them. The coin is backed by a military. Coins are how governments exert control. Some argue BTC has no government, that devs are not really in control. However that may be, anyone who holds BTC will be one of the new lords, enslaving the late comers, backed by our military, the miners. At least this is our best-case scenario in our search for personal profit.

Thursday, August 3, 2017

Shaking marbles in a jar as the simplest exposition of evolution, A.I., and economics


I am not using "entropy" in this article in a vague sense. I am talking about the entropy measured by physics equations (see the note at the end if you're interested in the physics).

Executive Summary of the Physics
Physical evolution in its simplest form is shaking a jar of randomly-packed marbles in a gravitational field. In short, complex (but not completely random) energy injected into a "closed thermodynamic system" results in entropy being released to the universe as black body radiation (more low-energy photons go out than high-energy ones came in... all energy transfer is a sending of photons). Since entropy is a conserved quantity, the entropy inside the container is reduced. A reduction in entropy exhibits itself as a higher degree of order: the contents become more densely packed, harder, and full of repeating patterns. The 2nd law is not "entropy always increases" (see good physics texts, such as the famous Feynman lectures on thermodynamics, that mention this). "Always increases" is an engineering idealization for isolated systems. There is no isolated system in the universe, thanks to black body radiation. Gravitational systems are emitting entropy "so that the Universe can expand" (or alternatively, "as a result of the Universe expanding"), leaving a lower-entropy state behind. (More detail: the entropy of the Universe is constant on a co-moving volume basis and decreasing on a constant-volume basis. This last statement is not speculation, but I do not have a good reference. It is not required for the following, but it provides the deepest explanation for the origin of life that I have.)

The physics of shaking marbles in a jar
When you randomly and slowly place marbles in a jar they will pack at about a 56% fill ratio, leaving 44% space. If you shake them afterwards, starting with hard shakes and then softer shakes, they will pack at >63%. The harder shakes allow the bottom layer to form first. The softer shakes allow the higher levels to settle without upsetting the lower levels. The highest theoretical packing for spheres is about 74%. The shaking can't be completely random and of constant strength, or the packing will continually come undone. Non-random shaking can be thought of as a "periodic" or "semi-periodic" force (or energy injection). For packing differently sized and shaped small objects there is a more complicated procedure that does the same basic thing: add heat while lowering pressure, then raise pressure as the heat drops, and then repeat, lowering the temperature and pressure each time (see wiki's "Simulated annealing", sketched below). The heat checks options while the pressure that follows secures and compacts the solutions. The pressure is a force like gravity. The heat is the shaking.
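
Here is a minimal simulated-annealing sketch of that heat/pressure cycle in Python (the toy objective function and cooling schedule are my own choices for illustration, not anything from the physics above):

=====
import math, random

# Toy "landscape" with many local minima: early big shakes (high temperature)
# let the state escape shallow pockets; cooling locks in the compact solution.
def f(x):
    return x * x + 10 * math.sin(3 * x)

random.seed(1)
x = random.uniform(-10, 10)   # random initial "packing"
temp = 10.0                   # initial heat: big shakes
while temp > 1e-3:
    x_new = x + random.gauss(0, temp)              # shake size tracks the heat
    dE = f(x_new) - f(x)
    if dE < 0 or random.random() < math.exp(-dE / temp):
        x = x_new                                  # downhill always; uphill sometimes
    temp *= 0.999                                  # soften the shakes
print(f"settled near x = {x:.2f}, f(x) = {f(x):.2f}")
=====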

Chemical bonds
I will say little about the chemical bonds that are crucial to life, other than that the reduction in entropy in the products of life can be measured via molar (aka specific) entropy. Chemistry has potential energy gradients due to charges, just as gravity has gradients due to mass. Atoms can react to form durable bonds that release entropy and heat to the universe, leaving behind a more durable arrangement with lower entropy. Getting into specific chemical bonds can and should be done, but for this post it would be like getting into the motivations of an employee in a company when I only want to discuss the company-wide profit seeking that enables the employees to exist. Similarly, when getting into an employee's motivations, we do not address the chemical reactions in his brain, nor the genes and environment that led to those reactions. But this outline is trying to show how the "environment" (the Sun, Moon, and Earth via thermodynamics and cosmology) resulted in those genes based on physics, as opposed to some nebulous beliefs about "evolution".

Correlation between the jar and life on Earth
The Sun, moon, and Earth's rotation are the initial sources of non-random energy coming into the biosphere. The "non-random" (low entropy) placement of the moon's mass away from the Earth has been crucial for life's development (Isaac Asimov once discussed this). Its effect decreases with time as the moon gets farther away each year, in keeping with the jar analogy, but I do not want to stress the "decreasing" part because it is not as important as simply not being random. The effect of the moon is apparent to NASA's life-hunters: the most promising places seem to involve an external gravitational force periodically affecting a celestial body (usually moons close to a planet). The Sun is more important via photosynthesis these days, but I have investigated the extent to which life initially capitalized on ("extracted") the low entropy created by the moon's initial placement. I calculated in a previous post that today the Sun is providing 150x more energy than the yearly loss in Earth's rotational energy due to the moon, but at the beginning of life the moon was >3x closer with >9x more gravitational effect, and the Earth was turning a lot faster. My calculation (from available data) indicated the moon provided about 20% of what the Sun was providing. This is a "mass moving" quality of energy rather than the "heating" quality provided by the Sun. The combination may have been crucial: periodic heating was a "periodic shaking" at the molecular level, while the "mass moving" shaking provided a larger-scale directional force determining where the resulting molecules would go. I considered the parallel this has with law guiding individual free market transactions, but have rejected it; law will be the jar within Earth's jar.

As the moon gets farther away, the entropy of the Earth-moon system is increasing due to a volume increase. The Earth's rotation rate is slowing, giving some energy to the moon in order for it to get farther away, but a much larger percentage is used to churn the air, seas, and mantle via the moon's gravity. To what extent was an excess of ocean vents present (where the oldest known life fossils have been found) due to the moon churning the mantle? The second oldest place fossilized life has been found is in bays with ocean tidal zones, which are more obviously assisted by the moon. To what extent would our massive economic machine not be possible if the moon had not churned the mantle enough to make more ore concentrations possible via volcanic activity? That non-random placement of mass is again a physically-measurable lower state of entropy that resulted from a slowing of the Earth's rotation via the moon.

I have cast this as a low-entropy injection from the Earth-moon system's volume increasing, but most of the lower entropy comes from balancing the black body radiation (entropy) being emitted as a result of the churning/heating caused by the Sun and the slowing of the Earth's rotation.

The Sun affected molecules with periodic (daily) heating, enabling them to form stronger bonds, a key ingredient for life (the success of "genes" is a combination of the ability to spread quickly and to last a long time via the really strong bonds of DNA crystals). I will not dive into chemistry because I think that is missing the forest for the trees. It is important to keep in mind that genes have no force of their own. They are just enzymes. Future genes model on past genes. Crucial to efficient packing is that the next layer of marbles is guided by the lower levels, obviously copying their arrangement. In what sense is this not like genes? Obviously the marbles do not evolve diversity, but I'm only allowing one type of object into the jar to try to see the essence of physical evolution.

I should mention that the Sun and rotation combine to cause a bulk periodic movement of mass like the moon does, not just heating. A side effect of this is charge potential energy gradients that result in lightning strikes, which have been theorized to be crucial for life. Those strikes are like sudden large "shakes of the jar" at the molecular and atomic levels.

I discussed this in more detail earlier this year, but the above section is more efficient and has new material to show more parallels.

Correlation to artificial intelligence
A large part of how A.I. finds solutions to complex problems is starting with random values in neural nets, Bayesian probabilities, and genetic algorithms. The marbles (in our brain) are the neural nodes, Bayesian nodes, or genes (well, not genes exactly). You typically start with random values and give the program computational energy (shaking) for the nodes or genes to request CPU time and memory (energy) while they are under a system-wide A.I. algorithmic constraint (the jar) and a gravity provided by the training data, to find an efficient (low entropy) solution. The solution will show many patterns in the node weights, probabilities, or genes, which are the marbles' repeating arrangement. The patterns may then be eliminated by condensing the node weights, probabilities, and genes into a smaller set of nodes that is more random on a per-weight/probability/gene basis. This is like taking marbles out of the jar, which is less entropy by a factor roughly equal to N2/N1 (this is exact if it's a specific molar entropy), and it allows a smaller jar, which is also less entropy by a factor described in the note at the bottom. The final result is not a repeated pattern, but it is the same(?) or less total entropy. It is more entropy per weight/probability/gene in an absolute sense, but the entropy per weight/probability/gene "per this system" is the same or less. Nodes in neural and Bayesian nets might be identical in an algorithmic sense (like the jar constraint) and change only their weighting factors (placement). In genetic algorithms, the genes would repeat and/or reduce in number. So the algorithm representing the jar is the set of rules of gene interaction rather than a description of the genes. The genes vary while the marbles do not. There is only 1 type of marble. To compress the marble packing and reduce the number of marbles in the jar while still completely describing the pattern, only a few marbles are needed. A.I. seeks to find repeating patterns of marbles and normally compresses them into a single pattern. Fewer marbles are better if they are smart.
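
As a toy illustration of that condensing step (my own sketch, not any particular library's method): nearly identical trained weights are clustered into a shared codebook, shrinking the description the way removing redundant marbles shrinks the jar.

=====
import numpy as np

rng = np.random.default_rng(0)
# Pretend training settled 1000 weights into 3 repeating values plus noise:
weights = rng.choice([0.11, 0.47, 0.83], size=1000) + rng.normal(0, 0.003, 1000)

shared = np.round(weights, 1)                         # condense to nearest 0.1
codebook, codes = np.unique(shared, return_inverse=True)
print(f"{weights.size} weights -> {codebook.size} shared values + small indices")
=====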

Correlation to economics
The A.I. must have some sort of direct, implied, or unnoticed limited-quantity currency that corresponds to the amount of CPU time (FLOPS) and memory that is available, if there is a limit on the amount of CPU power and memory space that the algorithm can use. The currency is the energy being transmitted from the jar (the algorithm constraints) to each marble (the weight/probability change or gene replication). The CPU time and memory are kinetic and potential energies. The hardware itself is a potential energy. The currency quantity corresponds to computing quantity. If the hardware increases, then the currency can be expanded by the algorithm (the government) to keep constant value, so that nothing else in the algorithm needs to change. This is like increasing the level of shaking: the currency is the amount of energy being transferred from marble to marble by the jar. The energy comes from the governing jar or A.I. via the permission granted by possession of the currency, and eventually goes back to it (taxes). In my jar there is no currency I can point to except the energy itself. Although the energy coming in and the size of the jar may not change, the entropy of the system gets smaller as it gets more efficient. The lower entropy means better command, control, and strength, which is typically used to increase the incoming energy, the "size of the jar", and the number of "marbles". If the allowable nodes or genes are increased and the hardware has not improved, then the value of the currency per node or gene decreases by that same proportion, because they will have to compete for the same computer time and memory. So the currency quantity needs to decrease if the currency is to represent the same amount of FLOPs and byte space.

In other words, for contracts to remain valid, the currency quantity should change in proportion to the energy per person that is under the system's control.
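
A minimal sketch of that rule (my formalization; the numbers are made up for illustration):

=====
def adjusted_supply(supply0, energy0, people0, energy1, people1):
    """Scale the currency supply in proportion to energy per person."""
    return supply0 * (energy1 / people1) / (energy0 / people0)

# Energy under the system's control doubles while population grows 25%:
print(adjusted_supply(21e6, 1e18, 7e9, 2e18, 8.75e9))   # -> 33,600,000.0
=====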

Physics note: All other things being equal, the entropy increase of the moon getting farther away is S = S2 - S1, where S1 = a*[ln(b*V1) + c] (and similarly for S2), where V is the volume of the Earth-moon system and a, b, and c are constants. There is a rotational energy decrease and a gravitational energy increase, but these are internal energies that do not change the kinetic energy of the system (which could have affected the entropy) because the Earth's temperature is about constant. But those lost energies do emit a lot of entropy away from the system as waste heat. As another example, important to the above: harder materials have lower entropy due to the atoms having fewer places per volume that they can occupy. Specifically, for a single harmonic oscillator (in a solid), S = k*ln(kT/hf + 1), where f is the frequency of the oscillations, which is higher for stronger bonds.

Tuesday, August 1, 2017

difficulty algorithms

===============

# Zawy v1b difficulty algorithm (nice and simple)
# Based on next_diff=average(prev N diff) * TargetInterval / average(prev N solvetimes)
# Thanks to Karbowanec and Sumokoin for supporting, refining, testing, discussing, and using.
# Dinastycoin may be the 3rd coin to use it, seeking protection that the Cryptonote algo was not providing.
# Original impetus and discussion was at Zcash's modification of Digishield v3. The median method
# Zcash uses should be less accurate and should not be needed for timestamp error protection.
# Wider allowable limit for difficulty per block change provides more protection for small coins.
# Miners should be encouraged to keep accurate timestamps to help negate the effect of attacks.
# A large timestamp limit allows a quick return after a hash attack. A limit is needed to prevent timestamp manipulation.
# (1+0.693/N) keeps the avg solve time at TargetInterval.
# Low N has better response to short attacks, but wider variation in solvetimes. 
# Sudden large 5x on-off hashrate changes with N=11 sometimes have 30x delays versus 
# 20x delays with N=17. But N=11 may lose only 20 blocks in 5 attacks versus 30 with N=17.
# For more info: 
# https://github.com/seredat/karbowanec/commit/231db5270acb2e673a641a1800be910ce345668a
#
# D = difficulty, T = TargetInterval, TS = timestamp, TSL = timestamp limit

N=17;  # can possibly range from N=4 to N>30.  N=17 seems to be a good idea.
TSL = 10 if N>11 else 0.8*N; # stops a miner w/ 50% hashrate from lowering D >25% w/ forward TS's.
current_TS = previous_TS + TSL*T if current_TS > previous_TS + TSL*T;
current_TS = previous_TS - (TSL-1)*T if current_TS < previous_TS - (TSL-1)*T;
next_D = sum(last N Ds) * T / [max(last N TSs) - min(last N TSs)] / (1+0.693/N);
next_D = previous_D*1.2 if next_D < 0;  # negative span from bad timestamps: nudge D upward
next_D = 2*previous_D  if next_D/previous_D > 2;
next_D = 0.5*previous_D  if next_D/previous_D < 0.5;
==============
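
For concreteness, here is one reading of v1b as runnable Python (a sketch of my interpretation, not a reference implementation; T is the target interval in seconds and the lists are ordered oldest to newest):

=====
def zawy_v1b_next_D(D, TS, T=150, N=17):
    """D: recent difficulties; TS: recent timestamps (need at least N+1)."""
    TSL = 10 if N > 11 else 0.8 * N
    ts = list(TS)
    # Clamp only the newest timestamp against its predecessor.
    ts[-1] = min(ts[-1], ts[-2] + TSL * T)
    ts[-1] = max(ts[-1], ts[-2] - (TSL - 1) * T)
    span = max(ts[-N:]) - min(ts[-N:])
    if span <= 0:                       # out-of-order stamps: nudge D upward
        return D[-1] * 1.2
    next_D = sum(D[-N:]) * T / span / (1 + 0.693 / N)
    next_D = min(next_D, 2 * D[-1])     # per-block change limits
    return max(next_D, 0.5 * D[-1])

print(zawy_v1b_next_D([1000.0] * 18, [i * 150 for i in range(18)]))  # ~1021
=====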

I had thought the following was best, but if I use v1b with N=8 and multiply by 0.89 instead of
dividing by (1+0.693/N), then they are about the same.

===========
#  Zawy v1c difficulty algorithm  
# For more info: 
# https://github.com/seredat/karbowanec/commit/231db5270acb2e673a641a1800be910ce345668a
# D = difficulty, T = TargetInterval, TS = timestamp, TSL = timestamp limit
# N should not be changed
N=16;
M=int(N/2);
O=int(N/4);
k = 0.85; # adjustment to keep avg solvetime correct when hash rate is constant.
TSL=9; # should not be changed
current_TS=previous_TS + TSL*T if current_TS > previous_TS + TSL*T;
current_TS=previous_TS - (TSL-1)*T if current_TS < previous_TS - (TSL-1)*T;
temp1 = k * sum(last N Ds) * T / [max(last N TSs) - min(last N TSs)];
temp2 = k * sum(last M Ds) * T / [max(last M TSs) - min(last M TSs)];
temp3 = k * sum(last O Ds) * T / [max(last O TSs) - min(last O TSs)];
next_D = 1/3*(temp1+temp2+temp3);

==================


# Measures relative effectiveness of difficulty algorithm under attack scenarios.
# The algorithm with the lowest "attack" number wins.
# Getting blocks at low diff attracts attacks, so they are weighted same as long wait times.
for ( i = all block numbers ) { 
      total_ST = 0;   # reset the 10-block window sum for each i
      for ( j=0 to 9 ) { total_ST += ST[i+j] }
      if ( (10-total_ST/T) / SQRT(total_ST/T) > 2.5 )  { sum_diff_was_too_low  += total_ST/T/10 }
      if ( (10-total_ST/T) / SQRT(total_ST/T) < -2.5 ) { sum_diff_was_too_high += total_ST/T/10 }
}
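
The same scoring loop as runnable Python (my port of the pseudocode, with the window sum reset per block):

=====
import math

def attack_score(ST, T):
    """ST: list of solvetimes. Returns (too_low, too_high) penalty sums;
    the difficulty algorithm with the lowest totals wins."""
    too_low = too_high = 0.0
    for i in range(len(ST) - 9):
        total_ST = sum(ST[i:i + 10])           # 10-block window
        if total_ST <= 0:
            continue                           # guard against zero/negative stamps
        z = (10 - total_ST / T) / math.sqrt(total_ST / T)
        if z > 2.5:                            # blocks came too fast: D was too low
            too_low += total_ST / T / 10
        elif z < -2.5:                         # blocks came too slow: D was too high
            too_high += total_ST / T / 10
    return too_low, too_high
=====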

=================
# zawy v3c  difficulty algorithm, 2nd best
# same as v3b with the following changes
# replace N=17 with
N=16 
sumNs = N/2*(N+1)

# totalHashrate = totalHashrate + D[i]/ST[i] * 4 * (1-e^(-ST[i]/T)) * e^(-ST[i]/T)
# is now

totalHashrate = totalHashrate + D[i]/ST[i] * 4 * (1-e^(-ST[i]/T)) * e^(-ST[i]/T) * (N+1-i)/sumNs
================
# Zawy v3b 
# ST=Solvetime, T=TargetInterval
# next_D = T * avg(N linearly mapped D/ST) 
# UNLESS past N/2 ST's are > 2.5 std devs too long (after an attack) 
# THEN use v1b for N/2 blocks:  next_D = T * avg(N/2 of D) / avg(N/2 of ST) / (1+0.69/(N/2))
N=16
M=int(N/2)

&correct_negative_solvetimes

# check to see if last M blocks were dropping significantly fast.
Mexpected = Sum(last  M ST's) / T
std_dev = (M - Mexpected) / SQRT(Mexpected)
if (std_dev > 2.1 OR wait > 0 ) then
    next_D = sum(last M Ds) * T / [max(last M timestamps) - min(last M timestamps)] / (1+0.69/M)
    if (wait == 0) then (wait = M) else (wait=wait-1)
else 
   totalHashrate=0
   for i=1 to N
        totalHashrate = totalHashrate + D[i]/ST[i] * 4 * (1-e^(-ST[i]/T)) * e^(-ST[i]/T)
   next i
   next_D = T * 0.64 * totalHashrate / N  
end if
# finished

sub correct_negative_solvetimes {
# Eliminate negative ST before the mapping loop above. 
# Solvetimes are indexed going backwards in time: ST[1] is the previous block's, ST[N] the oldest.
# The following ST changes are permanent.

# Avg past 2 ST's if this ST is < 0.
if ST[1] < 0 then 
    if  ST[1] + ST[2] > 0 then 
        ST[1] = ( ST[1] + ST[2] ) / 2
        ST[2] = ST[1]
    else
         next_D=previous_D
         exit the next_D routine entirely, keeping next_D = previous_D
    end if
end if
if ST[2] < 0 then   # ST[1]+ST[2] in the previous block was negative. The previous ST[1] is now ST[2].
    if ST[1] + ST[2] + ST[3] > 0 then
       ST[1] = ( ST[1] + ST[2]  + ST[3]  ) / 3
       ST[2] = ST[1] 
       ST[3] = ST[1] 
   else        # there were 2 or 3 negative timestamps in a row.
       # We don't know if ST[2] and ST[3] were early or if ST[4] and/or ST[5] were late.  
       # Best and safest thing is to make them median ST
        ST[1] = median( ST[1] to ST[N] )
        ST[2] = ST[1]
        ST[3] = ST[1] 
    end if
end if
}


=========

#!/usr/bin/perl

# Zawy v1 algo for detecting hash attacks when diff algo is not responding fast enough (July 30, 2017)
# Let me know if there is an error:  zawy@yahoo.com
# Only detects when the diff algo is not responding fast enough.
# Prints a graphic to command line and writes block, solvetime, and difficulty to diff3.dat 
# For difficulty algorithm ideas (zawy v1c recommended)
# see https://github.com/seredat/karbowanec/commit/231db5270acb2e673a641a1800be910ce345668a

$instructions="
Detects hash attacks.
This diff3.pl file needs editing for the coin-cli location and the block interval.
Output goes to screen and diff3.txt
Command line format:
perl diff3.pl [start_block end_block]
Default is the past 500 blocks\n";

unless ($#ARGV == 1 or $#ARGV == -1) {   print $instructions; }

$coin_name='hush';
$dir="./$coin_name/src"; # Use "." if cli program is in same dir as this script. No ending "/".
$n=500; # number of blocks in past to look if start and end block not passed
$m=10; # Number of blocks to check for statistical aberration. Recommend m=10 to 15. 
# m < attack length is good. 
$ti=150; # the coin's targetinterval in seconds.  150 for 2.5 minute blocks.
$sd=3; # std devs of statistical significance.  Recommend sd=3.  See next paragraph.

# sd < 3 will result in too many false positives. sd = 4 will catch only 1/4 of smart, short 3x attacks.
# The number of accidental hits is 2*(one_tail_probability)*W/15*n/m,
# where W = blocks in your diff algo's averaging window.  sd=3 gives 0.0015 for one tail.  
# Expect a false appearance of attacks more often if W is larger or m is smaller.
# Short (< W/2) attacks at 2x the network hashrate are detected half the time, and more often are just noise.
# But if m < 1.5x the attack length, then sd=2.5 detects 80% of 2x attacks with minor false positives.
# Short 3x attacks are 80% detected with little error from noise.

if ($#ARGV == 1) { $n=$ARGV[1] - $ARGV[0]; }

@a=`$dir/$coin_name-cli getinfo`;
foreach $a  (@a) { 
    if ($a=~s/.+blocks\"[^\d]+(\d+).+/$1/) {$height=$a;}
}
if ($#ARGV == 1) {$height=$ARGV[1]; }

for ($i=$height-$n-1;$i<$height;$i++) { 
   
   # print $i;
   $hash=`$dir/$coin_name-cli getblockhash $i`;
   # print $hash;
   $info=`$dir/$coin_name-cli getblock $hash`;
   $diff=$info;
   $diff=~s/.+"difficulty"\D+(\d+).+/$1/sg;
   $time=$info;
   $time=~s/.+"time"\D+(\d+).+/$1/sg;
   if ($i == $height-$n-1 ) { $prev_time=$time;next;  }
   # print "t: $time d: $diff\n";
   unless ($i==$height) {
      $t{$i} = $time-$prev_time;
      $neg{$i} = "negative" if $t{$i} <0;
      $d{$i} = int($diff);
   }
   $prev_time=$time;
 }
foreach $blk (sort {$a <=> $b} keys %d) {  # numeric sort; lexical sort would misorder block heights
   $avg_diff= $avg_diff + $d{$blk}/$n; 
   $avg_t=$avg_t+$t{$blk}/$n;
   $j++;
   $sum_t=$sum_t+$t{$blk}; # add solvetimes
   $std_dev{$blk}="---";
   unless ($j < $m) {
      # take out oldest solvetime
      $sum_t=$sum_t-$t{$blk-$m};
      # do statistical calculation 
      $blks_expected=$sum_t/$ti;
      $est_diff_inc_needed{$blk} = $m/$blks_expected; 
      $std_dev=($m-$blks_expected)/($blks_expected)**0.5;
      if ($std_dev > $sd) {
         $std_dev{$blk}=int($std_dev*10)/10;  # store measured std dev if > sd
      }
      else  { $std_dev{$blk}="---"; }
   } 
}

open (F, ">diff3.txt") or die $!;
print F "height solvetime difficulty std_dev\n"; 

print "format: height, std dev if attack detected, scaled difficulty, solve time (amt diff needs increasing) difficulty\n";
foreach $blk (sort {$a <=> $b} keys %d) {
    print "$blk $std_dev{$blk} " . 'O' x int($d{$blk}/$avg_diff*50) . ", $t{$blk} (" . int($est_diff_inc_needed{$blk}*100)/100 . ") $d{$blk} $neg{$blk}\n";
    print F "$blk $t{$blk} $d{$blk} $std_dev{$blk}\n";
}
close F;
print "avg difficulty: " . int($avg_diff) ."\n";
print "avg solvetime: " . int($avg_t) ."\n";
print $instructions;

Friday, July 28, 2017

Hash Attacks against HUSH cryptocurrency

In the past 15 days HUSH has released 10% to 20% of its blocks to miners who have 2x to 3x the network hashrate and come on for about 20 blocks at a time. This is a little high because HUSH is using something like N=50 for its difficulty averaging window. That's why they come on for about 20 blocks. If HUSH used N=25, they would likely come on for 12 blocks, twice as often. The only solution I know of is to go as low as possible, like N=8. I thought Zcash was using N=17. Zcash can get away with N=50 because they are big. I did the measurement by checking for any sequence of 20 blocks that was solved 4 standard deviations faster than expected (less than 20 minutes instead of 50 minutes). I checked the validity of my statistics against a generated Poisson distribution and it came out correct. I double-checked it by trying to get the same average, median, and standard deviation for the solvetimes. The only way I could generate the same data was for 3x attackers to come on at least 6x per day, so >20% of blocks may be going to big miners, but I can only directly detect 3 attacks per day (10%). My simulation was a 3x attack lasting N/2.5 blocks, then waiting 3xN blocks before repeating. The std dev > 4 detection only worked 1/3 of the time or less. The median was 11% lower than it should be. This is a way to directly detect the attacks. There were no zero or negative solvetimes, which makes me think HUSH is using a 3rd party for the timestamps, which I did not think coins normally do. Whoever controls the clock controls the difficulty and therefore the rate of coin release. A good clock goes a LONG way toward protecting blocks from being stolen. I've read it negates the reason for mining. If they messed with the timestamps, they could get all the coins they want in a few hours, up until the point a fork is forced.
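
The 4-standard-deviation figure checks out as a quick Poisson z-score, using the post's own numbers (T = 150 s for 2.5-minute blocks; 20 blocks in 20 minutes versus the expected 50):

=====
import math

T = 150.0                    # target solvetime, seconds
blocks = 20                  # blocks observed
elapsed = 20 * 60.0          # ...in only 20 minutes
expected = elapsed / T       # blocks expected in that time: 8
z = (blocks - expected) / math.sqrt(expected)
print(f"z = {z:.1f} standard deviations faster than expected")   # z = 4.2
=====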

They are raising the difficulty. I usually speak of it as theft and attack, but it can be argued that it is not theft, just the magic of a free market. Asking devs for protection is like asking for government regulation.


The free market argument is that if there were enough of them seeking profit like this, then they would erase each other's profit. They are getting paid to switch around to several coins during the day while the less sophisticated are stuck on one.

Here are hash attacks against HUSH, which has an N=50 averaging window, and Karbowanec, which has N=17. Most of the big Karbowanec swings are reactions to shorter hash attacks.

Wednesday, July 26, 2017

Ideal currency's relation to thermodynamics of Earth

posted to reddit thread

Again, the goal is value stability, which Nash said should be constant. This does not mean constant quantity. From online sources I can't see where I'm in disagreement with Nash, nor how the ICPI can be built backwards from currency competition, because he says the ICPI "could be calculated from the international price of commodities". Even my suggestion that money should devalue very much like a 2% inflation target is not an idea that he throws out: [wikipedia again] "The policy of inflation targeting, whereby central banks set monetary policy with the objective of stabilizing inflation at a particular rate, leads in the long run to what Nash called 'asymptotically ideal money' - currency that, while not achieving perfect stability, becomes more stable over time." Moreover, the mild-inflation idea as a way to slowly erase debts is something very important that Nash's "ideal money" does not take into account, so from what I see he's missing something crucial. It is possible his ideal money (perfect stability in value) is "ideal" only when the marketplace is ideally guided by "ideal law" that prevents money from concentrating into fewer hands, in addition to enforcing the basic rule of law in each transaction. Wealth can concentrate from loans, monopolies, and lobbies improperly affecting laws. It can also accumulate from efficiency gains due to size, but a healthy system requires competition and dispersion, so I suspect allowing a company to gain over 50% of market share should be "made against the law" even if it followed all other aspects of ideal law. What to do with things that need 100% market share, like roads and electricity, remains a problem (letting two toll roads compete for the same route is rarely reasonable).

But wealth concentration into a few hands seems to be what a completely free market wants. Progressive tax is used to redistribute it back out in most countries. People get very wealthy as a result of riding society's wave of progress and technology more than from any particularly great or noble intellect or skill. Instant billionaire status at age 25 after 4 years of work and near-zero capital investment makes no sense. Getting back to my point: mild inflation is another check on inefficient, less powerful systems that allow wealth concentration. In ancient times, wealth concentration stifled society; then the people would revolt, install a new king or priest, and all debts to the old "lords" were erased. Jews learned to do it as a matter of course from other societies by declaring "jubilee" every 50 years. Now we have inflation instead.

I can agree with "ideal money" instead of mild inflation if a different fundamental assumption is made: if the wealth accumulation is prevented by something like progressive tax and smart laws on loans, then forcing constant value (zero inflation) on humanity would force everyone to become more efficient before expanding and go a long way to slow population growth and destruction of the biosphere (and in particular, not run into resource constraints so quickly, causing population collapse).

But in reality faster expansion and waste that is helped by low inflation will overpower the ideal money's more conservative expansion.

I am not sure currency should or needs to follow Zipf's law, but it appears it does, and therefore probably should. I agree competition in currency will result in an ideal currency, but knowing what the ideal outcome for people will be (or rather, the most likely-to-succeed outcome) is how you design the ideal money from scratch and then make it available. It also helps me in investing: I am more bullish on bitcoin if it forks. These ideas cause me to predict it will fork several times.

It must fork in order to achieve stable value as its use expands, either via alts or via hard forks (which I guess is another name for an alt with a big pre-mine). But the ideal is to find an objective definition and measure of commodity prices, and to figure out the best way to expand and contract the supply as prices change. I agree defining and measuring is a big problem. There must be an objective theoretical way to define "commodity" and the weighting factors needed. An even bigger problem is measuring it without a 3rd party. If the market is big enough, does everything become a commodity? Should they be weighted based on how much people spend on them? I believe it also needs slight inflation to help prevent concentration of wealth and the loan problem.

The consumption (destruction) of joules is sometimes used by people like Szabo and XCP (and implied in bitcoin and b-money) to claim "this object has value because a lot went into its creation". Rarity and antique-ness are used in a similar fashion. Trading a commodity that itself required joules to create and that maintains the economic relevance of the joules spent (e.g., copper) seems to be a much better idea. If we could trade a basket of commodities around with the ease of a fast cryptocoin, then we would have an ideal.

The starting point for how to weight the commodities is my previous suggestion: a coin = a fixed percent of Gibbs free energy available in society divided by the number of people. By replacing my initial "joules" with a ratio, I've potentially removed the necessity of it being related to joules. But joules seem to be the primary thing that creates and runs commodity producers. To be clear, 1 coin out of 100M coins would represent 1/100M control of the "total current commodity output".

% coin owned of the total in "circulation" = % of society's total commodity-producing wealth divided by # people

So if commodity production capacity and the number of people are constant, then 1 coin is worth more in joules than in the past. So maybe I should not mention joules to others, but it's important to my attempts to connect physics to evolution and economics. (The release of entropy from Earth is deeply connected to the expansion of the universe. I haven't worked on that part as much, and it requires an audience that is already aware that the 2nd law of thermodynamics is not that "entropy always increases" (see the first thermodynamics chapters of Feynman's famous intro physics books) and that entropy is conserved on a comoving-volume basis.)

The velocity and availability of money will affect this in ways I have no idea how to measure. What is the total coin that is available for use at any given time? Some might be locked up in escrow based on contracts. How do asset price fluctuations fit into this? Just watching commodity prices seems to solve all that.

If the economy has gotten bigger, as measured by commodities trying to become cheaper, then the government would print new money to build more infrastructure to support the bigger economy (law, schools, roads) if the increase appears permanent, but it should lower interest rates instead if it seems like a temporary swing. The government should be the only bank and the primary loan agent. Interest on loans would replace all taxes on the middle class.

Nash's ideal money seems to miss the final key ingredient of "value" that I've included: dividing by the number of people. It has the macroeconomic effect of weeding out weak people, because coin (commodity-producing joules per person) increases for everyone if the number of people decreases. There would be a macroeconomic invisible hand to not overpopulate. The coin count stays constant if every person becomes more efficient, but the joules each coin represents would increase. So our measure of value has an intrinsic aspect relative to what other people have. If you consider this, I believe it will resolve the "efficiency" problem we both mentioned. So even with the low inflation I seek, the ideal money could discourage overpopulation and overly wasteful expansion.

The end result of all this is that Earth's mass will continue the current unavoidable physics-based path of congealing and hardening until all biology is replaced by machines. The primary goal of devs is to replace brains just as motors replace muscle, and solar cells are replacing photosynthesis (20x more efficient per meter^2 and more efficient at self-replication). Silicon is better than wet brains because the lower entropy per mass of hard silicon allows precise control of electrons whereas the brain has to deal with ions that weigh 50,000x more. Photosynthesis has 20x less ability to control the electrons via photon energy.

The goal of companies is to produce the most with the fewest people and with the fewest shareholders (stock buybacks and venture capitalists). People are so outdated that even their capital is not needed by the machines except to gain market share (hotmail buyout, youtube buyout, snapchat, etc). AMZN is an exception. See "The Zero Marginal Cost Society". Welfare has been disastrous for the people in the U.S. receiving it, as bad as a 30-year war on that population. The same thing will happen to the rest of humanity as we are no longer needed. There's no good or bad. There's no solution. It's just the matter on Earth congealing to a colder, harder state in keeping with thermodynamics.

Friday, July 21, 2017

A few random thoughts on Bitcoin and cryptoassets

The proportion of hash power a coin has for a given POW across all coins is probably going to be proportional to all the real-world assets that fall under the legal control of the coins using that POW (which is hardware-constrained... if the same hardware can be used on 2 POWs, then in this context they are the same POW, because the work is real, being the joules to create and run the equipment). There's real value in being the biggest kid on the block because it makes it harder to break. 90% of alts right now could be destroyed if the largest pool of the largest coin (for a particular POW) decided to attack and forward-timestamp every block to the max. Difficulty goes to zero and all coins are released (or as many as the attacker wants) in a few hours with only 60% of the hashrate. There's no fix. If they fork, he does it again. The motivation is to drive users to the big coin he has been accumulating because he wants to retire instead of buying more equipment, not to mention constantly selling all the small alt coins before they figure it out and fork. This should eventually happen to all coins that have a halving schedule. So the underlying value is in being the biggest kid on the block, because it is the only kid that can't be "hacked". So side chains may displace alts. Cross-chain swaps do not change this; they will stabilize to an exchange rate in accordance with it. If I'm right, miner economics could quickly reduce the playing field of coins that are truly peer-to-peer. The only alternative to being one of the very few biggest and baddest is the traditional central database. So BTC's biggest use would be between governments and banks that do not trust each other.
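
A toy simulation of that forward-timestamp attack (my own illustration against a bare simple-moving-average difficulty rule; the target, window, and per-block forward limit are assumed numbers):

=====
T = 150              # target solvetime, seconds
N = 17               # difficulty averaging window
FORWARD = 10 * T     # assumed per-block forward-timestamp limit

D = [1000.0] * N                  # recent difficulties
ts = [i * T for i in range(N)]    # honest timestamps so far

for _ in range(50):               # attacker forward-stamps every block
    ts.append(ts[-1] + FORWARD)
    span = max(ts[-N:]) - min(ts[-N:])
    D.append(sum(D[-N:]) * T / span)   # difficulty collapses geometrically

print(f"difficulty after 50 blocks: {D[-1]:.3g} (started at 1000)")
=====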

BTC value = (BTC FLOPS / crypto FLOPS) x (crypto assets / paper assets)?


It needs to reach $500k per coin to replace the world's $10T M2+ dollar supply, but it's not a currency. It will more likely find its value in anchoring the value of other cryptos that will represent legal control of real-world assets. The world's assets are said to be $223 trillion, so it could not be worth more than about $10M per BTC (plus world asset value increases). But if cryptos someday become the legal instruments representing ownership of half those assets, and 1/10 of them ultimately require BTC as a basis, then again I'm back at $500k per coin. If it is both currency and asset, then $1M per coin in an optimistic view. $10k seems to be a decent exit point, but the trend and the public's lack of knowledge say at least $50k. So $100k is not a bad guess as a dreamy max.

What's wrong with a BTC fork, other than preventing us from getting more free money for nothing? I mean, if it doubles the number of coins so that new entrants can afford them and so that more transactions per second can be made, how is a fork not best for society? All currencies must expand as their use increases in order to keep constant value, so that the terms of wage and price contracts remain valid. Economies should not allow a change in the value of their primary coin any more than we should tolerate a changing definition of the Joule. All equations (contracts) are invalidated or require adjustment if there is a change. If it does not expand, the currency will not be used by new entrants, who will go elsewhere. If its use is forced by government, then limited-quantity coins create a 1% class.
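
The "must expand to keep constant value" claim is, in effect, the equation of exchange, M*V = P*Q (my framing; the post doesn't name it): with velocity V fixed, holding the price level P constant forces the money supply M to track real output Q.

=====
def required_supply(M0, Q0, Q1, V=1.0):
    """Money supply that keeps the price level constant as output grows Q0 -> Q1."""
    P = M0 * V / Q0          # price level implied by the starting state
    return P * Q1 / V        # M1 such that M1*V = P*Q1

print(required_supply(M0=21e6, Q0=1.0, Q1=2.0))   # output doubles -> 42,000,000.0
=====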
===
satoshi's:
I think he might have estimated that if it did absolutely everything he could dream of, then a satoshi would be worth about $0.10 in 2009 dollars: 21M BTC x 100M satoshis per BTC x $0.10 is roughly $200T, which was the world's 2009 assets. This is a reasonable distant hope because only the biggest blockchain can be the secure definition of truth, since its miners could destroy any other non-3rd-party, non-oracle chain with a 51% attack (or timestamp forwarding). Every world asset that wants to define its owner (or vice versa) in the most secure way without a central authority must reference a BTC transaction.
==========
The inability to inflate the references faster than the increase in world assets prevents the references from losing their value. The constant-quantity aspect is kind of a cheat and marketing: early adopters profit at the expense of late adopters to the "faith". Currency has always been directly connected to government and religion. It seems to be about deciding which "God" is in control of the assets via the references. Our ideas of "freedom" originate from the universal ancient tradition of debt cancellation. Bad rulers kept debt in place until their economies were choked to death. The people would bring a new ruler (or religion) to power and he would erase all debt. That erased the power of the people who were in control and who supported the previous ruler.
====
reddit message sent to nullc (gmaxwell)
There is an unavoidable, unnoticed macro-economic reason for the fighting: society wants and needs BTC to expand. BTC wants to fork. Devs think they are fighting to keep it together, but they are actually fighting because macro-economic principles want it to fork. I do not want to have to choose between alts. I want BTC, but I want it to expand with its use so that it becomes a constant store of value. This will optimally help society by enabling me to write contracts that reference it as "joules of value". Even if it always increases, my renters and I can't use it, because we don't know the rate at which it will increase in value. Destruction of joules via making and running hardware is not the defining aspect of "joules of value". This problem goes back to the inability of b-money to equal a basket of commodities, as Wei Dai indicated.
New users want a lower price. Old speculators do not mind getting 2 coins for 1. Miners won't mind splitting up to get more coins, even if each is half price, because hashrate competition is also halved. Everyone just has fear of the unknown, and a mistaken belief that capped quantity is best.
Some developers think they have a better idea when deep down they want central control and notoriety. Others know they have a better idea. Both sides don't realize BTC wants a fork. Let the ideas compete. Let there be a fork and let the marketplace decide the relative value of each. I'm going to keep both indefinitely because they will settle to Zipf's law (#2 =~ 1/2 the value of #1) as ETH crumbles from not following Satoshi's axioms of simplicity and non-3rd-party timestamps.
=============
You can derive all the characteristics of an ideal money from one underlying goal: constant value. An approximation is a basket of commodities. Going deeper, this approximation is based on the "joules" required to construct and run the commodity-producing equipment. But "joules" here is not exactly a physical measurement; it includes a "difficulty" factor that society has in acquiring and utilizing them. More precisely, it is Gibbs free energy. This may not be the ultimate measure, because there is an "efficiency" aspect to how it's used even if the machine is 100% efficient in converting it to usable energy. There might be a subjective element to how people need to define "constant value". The currency at least needs to expand with the size of the economy to keep constant value. The ultimate goal is to keep the terms of contracts valid. But there is something else just as important: the currency needs to slowly devalue so that hoarding is discouraged and investment is motivated. This prevents successful participants from relaxing. Evolution does not seek fairness. It seeks power. The most efficient participants retiring is not the goal; retirement is just a carrot to motivate them. Slow devaluation is a way of erasing old debts, which drives a rejuvenation of economic activity, redistributing wealth away from the 1%, who naturally use wealth to guide markets into corners or simply loan it out at interest above the rate of coin inflation until they own all the coin via the "magic" of compound interest, further stifling growth through a lack of coin and through more people having to come beg for a loan.

But the initial and short term goal is constant value so that CONTRACTS have a reference point that is exchangeable between all other contracts in any given legal system that is securing law in a marketplace. The value should be constant in space and time, with some caveats like the needed inflation above, and remote places that do not have a large and diverse marketplace (the coin will have more value there).

That is the background you'll need to understand the following. If BTC is going to be a currency instead of an asset (which provides the backing for real currencies), then it needs to fork as its use expands in order to maintain constant value, or it will have to let alts take away a greater and greater share of the growing cryptocurrency market.

So your point is correct only if BTC is going to be an asset. Let currencies reference it in large quantities in a slow manner in something like the Lightning Network. Assets and currencies are diametrically opposed if the asset has a capped quantity. BTC, everyone who will use it, and everyone who currently holds it want BTC to fork into many coins to maintain constant value if it is going to be a currency. That is the underlying cause of the arguing.

I think you failed to appreciate Nash's "value stabilization". Bitcoin is not a commodity. The joules needed to construct and run the mining equipment are a destruction of value, whereas real mining adds value. A real stable-valued commodity is one that is not going to dry up in the near future; it stops being a commodity in a useful currency sense when that happens. Granted, BTC as a commodity has a lot of features like gold: expensive to mine versus its economic utility, and largely capped in quantity (at least without a new tech advance in gold mining). But this is why gold is more useful when a law-abiding marketplace is either non-existent, stagnant, or dying. Growing economies do not need or want gold except as a safety hedge against disaster (see the previous sentence).
=============
I disagree that a measure of value in a marketplace is unavoidably arbitrary. If the "Gibbs free energy" connection is not fundamentally correct, the basket of commodities is still pretty good. I believe either (or both) is founded upon letting the currency quantity (adjusted for velocity) be proportional to the total non-artificially-inflated assets in the economic system under its control, which might be proportional to the "important fluid assets" of some sort (like the commodities). I think that by being proportional to what a group of people have under their "total relative control", it matches the gut feeling by which people define value. If they become rich by having more assets under their control, then it takes a larger quantity of the currency to feel like it has the same value. This breaks away from a direct joules measurement, but it leaves open the possibility of a joules-per-joules measure: the joules of value a constant-value coin represents is a proportion of the total joules of value in a society.
The size and efficiency of their commodity-producing and delivering infrastructure should stay proportional to that wealth. If the commodity machine starts struggling to meet demand and prices rise, then the coin is deflated to encourage investment in commodity infrastructure while reducing the strain on commodity production. Then there is the converse.
This discourages economic bubbles. Better commodity-based economics enables larger armies to destroy other economies. Democracy subverts capitalism towards higher commodity production instead of concentration of wealth.
This seems to break away from the joules/joules measurement, but I have an out: an ideal currency should not be a direct joules measure, as I initially said, nor the proportion. It should also be divided by the number of people. If commodities get scarce, the implication is that there are too many people for the commodity infrastructure. The commodity infrastructure should be valued relative to the number of people. So the coin is restricted along with the commodities to make both retain the same price (to keep contracts valid), and this leaves people with less coin (worth less), making them work harder at commodity production.
I've been trying to discover the connection between evolution, economics, and all other adaptive learning "machines" for years. Here are my latest tweets in this effort. I should point out that all machines replacing biology are doing so because they are removing oxygen from metal, silicon, and carbon "ores", which results in a far lower entropy per mass of the economic machine. Hardness and reliability are deeply connected to lower entropy per mass. There may be a connection between entropy and coin I have not discovered. Gibbs free energy touches upon it because GFE = energy + pressure*volume - temperature*entropy, which is "energy available for work". So I'll consider the possibility that the entropy-per-mass decrease we are witnessing in our evolving economic system should be connected to a reducing quantity of coin per person.
My last tweets:
Local heat/noise fluctuations under gravity/coin constraints discovers greater systemic efficiency when energy comes in & entropy goes out.
Economics, evolution, & learning are closed but not isolated thermodynamic systems. Energy in, entropy out. Entropy per mass reduces.
===========
If you shake a jar of objects that are in a gravity field, the objects will compact. Compaction is a lowering of entropy when all other variables are the same. Total entropy is not destroyed: the excess entropy escapes during the shaking as low-energy black-body photons, the result of a heat increase from the shaking energy and friction. Energy was converted to lower entropy in the jar, plus even more entropy that was released as the heat escaped.

In evolution, the shaking energy is the Sun and moon. I've written on the importance of the moon to life on Earth. The mass on Earth is constrained to the surface which is the jar.

In economics and A.I., energy is obviously coming in from the outside, and heat is escaping. There is also a lowering of entropy per mass, as in the jar and in evolution. Law and Earth are the jar. Currency is a conveyance of energy between participants in accordance with the constraints. To allow greater compaction in a jar, sometimes you need more room for new positions to be discovered. In A.I. they periodically relax variables so that they can take on different values before being slowly constricted again. This enables the search to get out of non-optimal solutions. There is also a redistribution of wealth at times in A.I. to make sure it does not get stuck in local minima (the 1% taking over).
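That "relax, then slowly constrict" trick is essentially simulated annealing. Here is a minimal sketch of it in Python; the toy objective, cooling schedule, and constants are my own illustrative assumptions, not anything from a specific A.I. system:

import math, random

def anneal(objective, x0, steps=20000, t0=2.0, t_min=1e-3):
    # Simulated annealing: high "temperature" relaxes the variable so it
    # can jump between valleys; cooling slowly constricts it so the
    # system compacts into a deeper minimum than the nearest local one.
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    for i in range(steps):
        t = t0 * (t_min / t0) ** (i / steps)   # geometric cooling schedule
        x_new = x + random.gauss(0, 1.0 + t)   # proposal width shrinks as t drops
        f_new = objective(x_new)
        # accept worse moves with probability exp(-dF/t): the "shaking"
        if f_new < fx or random.random() < math.exp(-(f_new - fx) / t):
            x, fx = x_new, f_new
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

# toy multi-well objective with its deepest valley near x = 3
f = lambda x: 0.1 * (x - 3) ** 2 + math.sin(5 * x)
print(anneal(f, x0=-10.0))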
==========
The end game of currency will be a trust network where your reputation among friends and past buyers/sellers is the amount of currency you own to purchase things in the future. You can't lose your keys because your reputation is stored on the network. It's not centralized in any way like bitcoin, except for the protocol people should agree on. Complete anonymity is not possible, but only sociopaths don't have any friends and don't deserve any currency. A super-majority of friends can rat you out or give your keys back. You can't exchange with strangers until the network grows tentacles via 6 degrees of separation. You are penalized if a friend cheats and vice versa. You can have multiple identities but it means you would have to split friends among them, not getting any net benefit except fall-back security and dispersion to distant networks. There is no currency except how friends of friends of friends etc choose to score your reputation. There's no profit to being a dev or adopting early. There's huge profit in not being anonymous.
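The "super-majority of friends can give your keys back" part maps naturally onto threshold secret sharing. Below is a minimal Shamir-style sketch in Python, assuming the key is split over a prime field; the field size, counts, and function names are my own illustration, not a protocol spec:

import random

P = 2**127 - 1  # a Mersenne prime, large enough for a demo secret

def split(secret, n_friends, threshold):
    # Shamir secret sharing: the key is the constant term of a random
    # polynomial of degree threshold-1; each friend holds one point.
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n_friends + 1)]

def recover(shares):
    # Lagrange interpolation at x=0 recovers the key from any
    # threshold-sized super-majority of friends.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(secret=123456789, n_friends=5, threshold=3)
print(recover(shares[:3]) == 123456789)  # any 3 of 5 friends suffice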
Your FB and Amazon "upvotes" would be carried over by FB and Amazon using the correct XML interface. And if they don't, then companies that do adhere to the protocol, opening up the data they're collecting to your friends, would win out.
=========
DNA = blockchain w/ many forks & new genes = new coins (virus-induced?). Genes = environment's currency to economize resources 4 negentropy

Wednesday, July 19, 2017

A P2P cryptocurrency to replace FB, Amazon, Fiat, and Bitcoin.

Posted to HUSH slack. A prelude to this

Here's an idea for a cryptocoin to build upon the timestamp idea I posted a few days ago (again, that does not necessarily use the stars).

People get more coin by having more "friends" (actually, people you know to be distinct individuals). It might be a slightly exponential function to discourage multiple identities. Your individual coin value is worth more to your "local" friends than to "distant" friends. The distance is shorter if you have a larger number of parallel connections through unique routes. A coin between A and D when they are connected through friends like A->B->C->D and A->E->F->D is worth more than if the E in the 2nd route is B or C. But if E is not there (A->F->D) then the distance is shorter. More coin is generated as the network grows. Each transaction is recorded, stored, timestamped, and signed by you and your friends and maybe your friends' friends. Maybe they are the only ones who can see it unencrypted, or you get the choice of a privacy level. A higher privacy requirement means people who do not actually know you will trust your coin less. Maybe password recovery and "2-factor" security can be implemented by closest friends. Each transaction has a description of the item bought/sold so that the network can be searched for products. There is also a review and rating field for both buyer and seller. For every positive review, you must give 1 negative review: you can't give everyone 5 stars like on eBay and like high-ranking reviewers on Amazon (positive reviewers get better ranking based on people liking them more than on it being an honest review). This is a P2P trust system, but there must be a way to do it so that it is not easily tricked, which is the usual complaint, and there is a privacy issue. But look at the benefits. Truly P2P. Since it does not use a single blockchain it is infinitely faster and infinitely more secure than the bitcoin blockchain. I know nothing about programming a blockchain, let alone would I understand it if I created a clone. But I could program this. And if I can program it, then it is secure and definitive enough to be hard-coded by someone more clever and to need changing only as fast as the underlying crypto standards change (about once per 2 decades?)
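To make the A->B->C->D versus A->E->F->D distance idea concrete: one crude way is to greedily count node-disjoint routes and weight short routes more. This Python sketch and its 1/length scoring are my own toy formula, only meant to show that "more parallel, shorter routes = more trust" is computable:

from collections import deque

def shortest_path(adj, s, t, blocked):
    # BFS shortest path in a directed adjacency dict, avoiding 'blocked'
    # interior nodes already used by earlier routes.
    prev = {s: None}
    q = deque([s])
    while q:
        u = q.popleft()
        if u == t:
            path = []
            while u is not None:
                path.append(u); u = prev[u]
            return path[::-1]
        for v in adj.get(u, ()):
            if v not in prev and v not in blocked:
                prev[v] = u
                q.append(v)
    return None

def trust_weight(adj, a, d):
    # Toy trust metric: sum 1/length over greedily-found node-disjoint
    # routes, so many short parallel routes => higher weight.
    blocked, weight = set(), 0.0
    while True:
        p = shortest_path(adj, a, d, blocked)
        if p is None:
            return weight
        if len(p) == 2:              # direct link: count it once and stop
            return weight + 1.0
        weight += 1.0 / (len(p) - 1) # shorter route counts more
        blocked |= set(p[1:-1])      # interior nodes are used up

# A->B->C->D plus an independent A->E->F->D route (the example above)
adj = {'A': ['B', 'E'], 'B': ['C'], 'C': ['D'], 'E': ['F'], 'F': ['D']}
print(trust_weight(adj, 'A', 'D'))  # two disjoint routes: 1/3 + 1/3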

Obviously the intent is to replace fiat, Amazon, and eBay, but it should also replace FB. A transaction could be a payment you make to friends if you want them to look at a photo. The photo would be part of the transaction data. Since only you and your friends store the data, there are no transaction fees other than the cost of your computing devices. Your friends have to like it in order for you to get your money back. LOL, right? But it's definitely needed. We need to step back and generalize the concepts of reviews, likes, votes, and products into the concept of a coin. You have a limited amount dictated by the size of the network. The network of friends decides how much you get. They decide if you should get more or less relative power than other friends.

It would not require trust in the way you're thinking. Your reputation via the history of transactions would enable people to trust you. It's like a brand name, another reason for having only 1 identity. Encouraging 1 identity is key to preventing people from creating false identities with a bot in order to get more coin. The trick and difficulty is preventing false identities from scamming the community.

Everyone should have a motivation to link only to real, known friends. That's the trick and difficulty. I'm using "friend" very loosely. It just needs to be a known person. For example, you and I could link to David Mercer and Zookoo, but we can't vouch for each other. That's because David and Zookoo have built up more real social credibility through many years and good work. They have sacrificed some privacy in order to get it. Satoshi could get enormous real credibility through various provable verifications without even giving up privacy, so it's not a given that privacy must be sacrificed. The system should be made, if possible, to not give an advantage to people only because they are taking a risk with their personal safety.

The system should enable individuals to be safer, stronger, etc while at the same time advancing those who advance the system. So those who help others the most are helped by others the most. "Virtuous feedback". This is evolution, except it should not be forgotten that "help others the most" means "help 2 others who have 4 times the wealth to pay you instead of 4 others with nominal wealth". So it's not necessarily charitably socialistic like people often want for potential very good reasons, but potentially brutally capitalistic, like evolution.

It does not have to be a social network, but it does seem likable, social people would immediately get more wealth. It's a transaction + reputation + existence network. Your coin quantity is based on reviews others give you for past transactions (social or financial) plus the mere fact that you were able to engage in economic or social activity with others (a measure of the probability of your existence). There have been coins based on trust networks but I have not looked into them. It's just the only way I can think of to solve the big issues. If the algorithm can be done in a simple way, then that is evidence to me that it is the correct way to go. Coins give legal control of other people's time and assets. If you and I are not popular in at least a business sense, where people give real money instead of "smiles" and "likes" like your brother, why should society relinquish coin (control) to us? The "smiles" might be in a different category than the coin; I mean you may not be able to buy and sell likes like coin. Likes might need to be like "votes": you would get so many "likes" per day to "vote" on your friends, rather than my previous description of people needing to be "liked" in order to give likes, which is just a constant-quantity coin. Or maybe likes and coin could each be both: everyone gets so many likes and coins per day, but they are also able to buy/sell/accumulate them. I have not searched for and thought through a theoretical foundation for determining which of these options is best. Another idea is that everyone would issue their own coin via promises. This is how most money is created. "Coin" implies a tangible asset with inherent value, but paper currency is usually a debt instrument. "I will buy X from you with a promise to pay you back with Y." Y is a standard measure of value like 1 hour of a laborer's time plus a basket of commodities. Government issues fiat with the promise that it buys you the time and effort of its taxpayers, because it demands taxes be paid in that fiat. This is called modern monetary theory.

So China sells us stuff for dollars, and those dollars give China control of U.S. taxpayers, provided our government keeps its implicit promise to not inflate the fiat to an unexpectedly low value too quickly, which would be a default on its debt. So your "financially popular" existence, proven by past transactions of fulfilling your debt promises, gives you the ability to make larger and larger debt promises. How or if social likes/votes should interact with that I do not yet know. But I believe it should be like democratic capitalism. The sole purpose of votes is to prevent the concentration of wealth, distributing power more evenly. This made commodity prices lower and gave more mouths to feed, and that enabled big armies, so it overthrew kings, lords, and religions. Then machines enabled a small educated European and then U.S. population to gain control of the world.

Saturday, July 15, 2017

Best difficulty algorithm: Zawy v1b

# Zawy v1b difficulty algorithm 
# Based on next_diff=average(prev N diff) * TargetInterval / average(prev N solvetimes)
# Thanks to Karbowanec and Sumokoin for supporting, refining, testing, discussing, and using.
# Dinastycoin may be 3rd coin to use it, seeking protection that Cryptonote algo was not providing.
# Original impetus and discussion was at Zcash's modification of Digishield v3. The median method
# Zcash uses should be less accurate and should not be needed for timestamp error protection.
# Wider allowable limit for difficulty per block change provides more protection for small coins.
# Miners should be encouraged to keep accurate timestamps to help negate the effect of attacks.
# Wide timestamp limits allow a quick return to normal after a hash attack, while still limiting timestamp manipulation.
# (1+0.693/N) keeps the avg solve time at TargetInterval.
# Low N has better response to short attacks, but wider variation in solvetimes. 
# Sudden large 5x on-off hashrate changes with N=11 sometimes have 30x delays versus 
# 20x delays with N=17. But N=11 may lose only 20 blocks in 5 attacks versus 30 with N=17.
# For more info: 
# https://github.com/seredat/karbowanec/commit/231db5270acb2e673a641a1800be910ce345668a
#
# D = difficulty, T = TargetInterval, TS = timestamp, TSL = timestamp limit

N=17;  # can possibly range from N=4 to N>30.  N=17 seems to be a good idea.
TSL=10 if N>10 else TSL = N; # stops miner w/ 50% from lowering  D>25% w/ forward TS's.
current_TS=previous_TS + TSL*T if current_TS > previous_TS + TSL*T;
current_TS=previous_TS - (TSL-1)*T if current_TS < previous_TS - (TSL-1)*T;
next_D = sum(last N Ds) * T / [max(last N TSs) - min(last N TSs)] / (1+0.693/N);
next_D = previous_D*1.2 if next_D < 0; 
next_D = 2*previous_D  if next_D/previous_D > 2;
next_D = 0.5*previous_D  if next_D/previous_D < 0.5;
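For readers who want to run it, here is a straight Python transcription of the above under my own naming. It assumes you hand it the last N difficulties and timestamps, oldest first, and it keeps the original's defensive guards:

def zawy_v1b(diffs, stamps, current_ts, T=240, N=17):
    # diffs:  last N difficulties, oldest first
    # stamps: last N timestamps, oldest first (stamps[-1] is the previous block)
    TSL = 10 if N > 10 else N              # timestamp limit in units of T
    prev_ts, prev_D = stamps[-1], diffs[-1]
    # clamp the newest timestamp to limit manipulation and errors
    current_ts = min(current_ts, prev_ts + TSL * T)
    current_ts = max(current_ts, prev_ts - (TSL - 1) * T)
    window = stamps[1:] + [current_ts]     # roll the timestamp window forward
    span = (max(window) - min(window)) or 1  # defensive: avoid divide-by-zero
    next_D = sum(diffs) * T / span / (1 + 0.693 / N)
    if next_D < 0:                         # guard kept from the spec above
        next_D = 1.2 * prev_D
    next_D = min(next_D, 2 * prev_D)       # symmetric per-block limits
    return max(next_D, 0.5 * prev_D)

# perfectly on-target history: difficulty stays near its current value
print(zawy_v1b([100.0]*17, [i*240 for i in range(17)], current_ts=17*240))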

Monday, July 10, 2017

Doing better than the simple average in cryptocoin difficulty algorithms

I am still trying to find a better method than the simple avg, but I have not found one yet. I am pretty sure there is one because estimates of hashrate based on avg(D1/T1 + D2/T2 + ....) should be better than avg(D)/avg(T) if there is any change in the hashrate during the averaging period. This is because avg(D)/avg(T) throws out details that exist in the data measuring hashrate. We are not exactly interested in avg(D) or avg(T). We are interested in avg(D/T). The avg(D/T) method does not throw out details. Statistical measures throw out details. You don't want to lose the details until the variable of interest has been directly measured. I learned this the hard way on an engineering project. But avg(D/T) hardly works at all in this case. The problem is that the probability distribution of each data point D/T needs to be symmetrical on each side of the mean (above and below it). I'm trying to "map" the measured D/T values based on their probability of occurrence so that they become symmetrical, then take the average, then un-map the average to get the correct avg(D/T). I've had some success, but it's not as good as the simple average. This is because I can't seem to map it correctly. If I could do it, then another improvement becomes possible: the least-squares method of linear curve fitting could be used on the mapped D/T values to predict where the next data point should be. All this might result in a 20% improvement over the basic average. Going further, sudden on and off hashing will not be detected very well by least squares. Least squares could be the default method, but it could switch to a step-function curve fit if a step change is detected. I just wanted to say where I'm at and give an idea to those who might be able to go further than I've been able to.
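A quick Monte Carlo illustrates why the naive avg(D/T) fails without the "mapping". Under simplifying assumptions of my own (constant hashrate, exponential solvetimes), 1/T has such a heavy tail that avg(D/T) is wildly biased high, while avg(D)/avg(T) is only mildly biased:

import random

HR, D, N, trials = 1000.0, 50000.0, 17, 2000   # true hashrate, fixed difficulty
ratio_est, pointwise_est = [], []
for _ in range(trials):
    # constant hashrate: each solvetime is exponential with mean D/HR
    ts = [random.expovariate(HR / D) for _ in range(N)]
    ratio_est.append(D * N / sum(ts))                 # avg(D)/avg(T) form
    pointwise_est.append(sum(D / t for t in ts) / N)  # avg(D/T) form

print("true hashrate: ", HR)
print("avg(D)/avg(T): ", sum(ratio_est) / trials)      # mildly high (~N/(N-1))
print("avg(D/T):      ", sum(pointwise_est) / trials)  # wildly high, unstable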

Numenta's CLA needs 6 layers to model objects

posted to numenta forum
====
Back when there were only 2 white papers and a few videos I became interested in the HTM and saw a video of a 2D helicopter being detected, and I wondered about the relation between the layers they used and the ability to recognize objects. I remembered 6 equations with 6 unknowns (the degrees of freedom) are required to solve the dynamics of 3D rotation and translation. The layers of the helicopter HTM matched what it was able to detect if they were unknowingly being used in a subtle 2-equations and 2-unknowns methodology. Of course this begs the question "Are the 6 layers in the cortex required to see the 3D world?" Numenta's view of the cortical column implies that the 6 layers have nothing to do with this, but I would like to question that view. Jeff has also warned against pursuing the reverse black-hole question no one has ever escaped: "Is the 3D world the result of a 6-layered brain?" But an understanding of the relation between mass and space-time prevents me from abandoning the reverse question. More importantly, physics has an elephant in the room that is rarely acknowledged and questioned: the only integers that appear in physics are the result of 3D spacetime, and Feynman states no fundamental aspect of QED requires an extension beyond 1D. QED is sort of the core of all physics except for gravity and nuclear stuff. An expert in the area informed me that spin is what creates 3D space, so my line of questioning is suspect. But my view is that we may have invented spin to maintain the view that objects are independent of our perceptions. I admit I am immediately deep in a recursive black hole: the 6 layers are a mass of neurons that I'm proposing we can see only because we have the 6 layers. BTW, if we had 10 layers to support the perception of 4D objects in 4D space then I believe all velocities would be static positions and all accelerations would be velocities. Instead of E + mc^2 = 0 we would have E + mc^3 = 0. (Now really getting side-tracked on the physics: by keeping relativity units correct there is a missing negative in some equations. Another example is F + ma = 0, where the "F" is more correctly defined as the reactive force of the object, which is in the opposite direction of the "a". This comes from meters = i*c*seconds, which comes from Einstein's "Relativity" appendix 2, which he stated allows use of Euclidean instead of Minkowski space-time, in keeping with the Occam's razor requirement.)

What I'm suggesting is falsifiable. Others posting here will know if it takes 6 layers to fully recognize objects in 4D space-time. The degrees of freedom are N translational plus N(N-1)/2 rotational. I tried testing the theory via observation and thought of ants. It seems to be supported there: their eyes, which need to detect only 2D "shadows and light" without rotation, have roughly two layers. And yet their feelers and front legs, having to deal with 3D objects in 3D space, have 6 layers. There's a great extension to this observation: wasps are the closest cousins of the ants and have 6 layers for their eyes.
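Spelling out the degrees-of-freedom arithmetic behind those layer counts (N translations plus N(N-1)/2 rotation planes):

f(N) = N + \frac{N(N-1)}{2}

f(3) = 3 + 3 = 6, \qquad f(4) = 4 + 6 = 10, \qquad f(5) = 5 + 10 = 15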

I posted this question nearly a decade ago in the old forum, but I'll ask again. Is a 6 layer HTM required for fully characterizing 3D objects in 4D space-time?
=====
I think a single layer would require a lot more new training on every object. For example, it sees a circle moving about and learns its behavior. Then the circle turns sideways and turns out to be a cylinder, and then it starts rotating, so training has to start over. I don't think it could conceive very well that "this is the same object" and/or generalize the lessons learned on past objects to future objects. It just seems like it would have difficulty understanding objects like we do. I believe 6 layers would be able to perceive the laws of dynamics but 1 layer would not. These six layers are not an HTM but the foundation of a single cortical column. Each CLA layer of the HTM would require the 6 layers. So the CLA would need to be redone if you want it to think like mammals and see like wasps. The motor-control layer (5th layer of cortex) may also serve part of this "inherent object modelling", not just motor control. The motor-control part might be crucial to developing the concept of inertia (mass). Mass is another variable ("dimension"), which implies 7 layers should be present. To get out of that mathematical corner, I have to conjecture that mass is something special in the modelling, like "the higher dimensions that 6 layers can't model and that have permanence".

I do not mean to say that 6 layers are necessarily inherently needed in A.I. to be superior to humans even in the realm of understanding physics, but that they are needed to think more directly like animals. But if 6 layers per HTM layer are actually needed for a higher intelligence, then 10 layers to do 4D space should be even more powerful, and 15 layers are needed for 5D. I do not accept the conjecture that objective reality, if there is one, depends on a specific integer of spatial dimensions like "3".

The visual cortex by itself with its 6 layers does not seem to have any concept of objects, but I think the 6 layers are still needed for encoding the information so that the concept of the objects is still extractable by the higher levels in the "HTM" of the brain (e.g. frontal lobes). But the concept of an object seems to be possible in the 6 layers just "behind" the eyes of flying insects: wasps certainly have a better concept of the object nature of people than ants, judging by the way they identify and attack. Ants are virtually blind to what people are, except for detecting skin and biting.

Saturday, July 8, 2017

Stars as cryptocoin oracles: posts to HUSH cryptocoin slack

Note: ethereum time syncs with pool.ntp.org:123. Nodes (mining or not) must have an accurate time to sync with the network. Miners need accurate time so later blocks will build upon theirs. But there is no distinct rule on timestamps in ETH except that each must be after the previous timestamp.

Pools with >51% can get all the coins they want from small alt coins in a few hours, dropping the difficulty at the rate next_D = previous avg D x [1/(1+M/N)]^(2X-1), where X is the attacker's fraction of the hash power, N is the number of blocks in the rolling average, and M is the coin's limit on how far the timestamp can be forwarded (in multiples of the target time). If GPS isn't good enough, the only solution I can think of is to tie miners and/or nodes to the stars with an app on their smartphone that gets a periodic observation of the stars to calibrate their clock. But then it begs the question (via the BTC white paper) of why mining would still be needed.
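Plugging numbers into that rate shows how fast the drop is. A small sketch under my reading of the formula above, with X as the attacker's fraction of hash power and M the forward-timestamp limit in multiples of the target time:

from math import log

def blocks_to_drop(factor, X, M, N):
    # Blocks a >51% miner needs to divide difficulty by 'factor' using
    # maximal forward timestamps, per the rate above:
    #   next_D = prev_D * (1/(1+M/N))**(2X-1) each block
    per_block = (1.0 / (1.0 + M / N)) ** (2.0 * X - 1.0)
    return log(1.0 / factor) / log(per_block)

# e.g. a miner with 90% of the hash power, M=10, N=17:
print(blocks_to_drop(factor=2, X=0.9, M=10, N=17))    # ~2 blocks to halve D
print(blocks_to_drop(factor=100, X=0.9, M=10, N=17))  # ~12 blocks for 100x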
===
I think the point of mining was to solve the double-spending problem without relying on a 3rd-party timestamp. Satoshi seems to say this explicitly in the whitepaper. It also finances the growth of the network in a way that supports transactions, but I do not understand why non-mining nodes seem to be necessary to keep miners in check, and/or why mining often has the feel of a necessary evil, if the entire point of financing mining was to build a working network. With a valid clock on each peer, the double-spending problem seems solved without mining. That leaves the question of how to release the coins in a way that supports the network. But if the timestamp problem is solved by each peer using the stars as his clock, is there any need for a behemoth network using might-is-right to determine the time and thereby the coin emission rate? It might be that peers with valid clocks who only want a wallet and to conduct transactions could be all that is needed, reaching the ideal of no centralized miners or developers, with control absolutely evenly distributed among everyone. There might be a way to distribute the blockchain so that they do not all need the entire chain. It would have a statistical chance of forking (fracturing with all forks being valid but increasingly incompatible), which could be increased by hacking, but that would only happen as the need for the network grew (via more marketplace transactions). So the fracturing might be beneficial by keeping the ideal of constant value. Constant value is a requirement of all good currencies; constant quantity is the ideal for an asset, not a currency. Constant quantity was always a disaster for all currencies that have ever been used because it's a bonanza for the 1% such as us, the early adopters seeking to profit without working for it, extracting wealth from late adopters. In any event it would get rid of centralized developers and centralized mining. It might be as simple as PGP, so that a requirement for a transaction to be valid is that the code never changes. Or maybe any code on any machine would be valid as long as other peers confirm your outputs are valid for your inputs as specified by a non-changing protocol.
===
by "fracturing" I introduced vagueness to mean "something that is probably not unlike forking". I am speaking of big picture ideas as I have no knowledge of BTC details. I took a strong renewed interest in difficulty algorithms after two cryptonote coins adopted my difficulty algorithm (block averaging instead of medians for 17 blocks with appropriate timestamp limits) to gain protection against attacks. Cryptonote insanely is (or was) using 300 blocks as the averaging window so sumokoin and karbowanek had to fork and start using mine. Zcash changed their digishield v3 as a result of my pestering but did not follow me exactly like these other coins. I posted too much and made a big mistake. I'm side-tracked: an unavoidable problem in the difficulty algorithm lead me back to the Satoshi white paper and the idea that scientific observation of stars could be the beginning of "real" cryptocurrencies as it was for physics. The stars would be the first valid, provable, non-3rd party oracle in cryptocoins.
====
With only +/- 2 degree accuracy I figure 10-minute blocks are OK. 2 degrees is 4 minutes if you look at stars 90 degrees from the north star. So local peers have to agree on the time +/- 4 minutes with 1 minute to spare on each end. Russia also has a GPS system, but I don't think the combination of the two solves anything.
===
You are saying I'm missing the "might is right" aspect. But the idea is that it replaces "might is right" with an objective verifiable truth that can be checked by any and all peers at all present and future times.
====
I think everyone could reject the transaction if it does not have the correct timestamp. He can lie about it, but it will be rejected. He can send the same coin twice in the same 8-minute window, but everyone is supposed to reject both sends. I previously mentioned maybe all the peers do not need a full chain, but that's probably a pretty wrong-headed idea.
=====
Having 1 miner timestamp a block is a lot more important than having the correct time. But if a correct time is agreed upon, then every peer everywhere receives and validates every transaction independently. Because of the inaccuracy of the timestamps, the timestamps are rounded to the nearest minute that has 0 as the right-hand digit, and you have +/- 2 minutes from the next "5" minute to send a transaction. But I must be missing something. It seems like using star gazing, GPS, or timestamp servers is not necessary: you would just need to make sure your peer's computing device has approximately the correct global system time.
===
I gave a solution that doesn't even need an app that calibrates with the stars: if everyone manually makes sure their clock is within +/- 2 minutes, and if transactions can propagate to everyone in 2 minutes, then let's say the blockchain is updated every minute that ends in "0". The blockchain would be updated by EVERYONE. There are no nodes or miners needed or wanted in this design, especially since we want it nuclear-bomb proof, unlike the current bitcoin with concentrated miners and devs. Everyone would send out their transactions with their timestamp at minutes ending in "4", so with error, they may be sending them out right after "2" up until "6". If there is a 0 to 2 minute propagation delay, everyone's going to receive each other's transactions between "2" and "8" by their own clock (between 4 and 6 by "star time", or by whatever clock each peer has decided by himself to trust; it must not be coded into the client as a default unless it is watching the stars). On minute 8, every client closes his ears to every transaction. So nothing is happening on any client anywhere between 8 and 2 except verifying and adding transactions to the chain, which should work even if a clock is in error by +/- 2 minutes. Clients with a -2 minute error clock and those with a +2 minute error clock should see the exact same set of transactions, or someone is giving a mixed message to clients by accident or on purpose by going outside his own allowed window. By accident would mean some transactions were missed by some clients. On purpose would be someone trying to spend on -2 minute clients the same coin he is also trying to spend on +2 minute clients. In both cases, it seems like clients could check each other and decide to throw both erring transactions out. So that's my proposal. If it's possible to implement, then as far as I know it's one of only 3 known ways. The first is a traditional database that has a single reference location for its core data so there are no "double conflicting updates" on the same record. (In the case of more than 1 core location plus backups, I believe they have advanced methods of checking for conflicts and then undoing "transactions" in order to correct the problem.) The 2nd is Satoshi's method.
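A toy encoding of those minute-window rules, to show they are mechanical. The window constants follow the text above; everything else (names, the transaction dicts) is my own framing:

def send_window_ok(ts_minute):
    # Senders broadcast at minutes ending in 4; with +/- 2 min clock
    # error the stamp may read anywhere in (2, 6].
    return 2 < ts_minute % 10 <= 6

def accept_window_ok(local_minute):
    # Clients listen between their own minutes 2 and 8, then close
    # their ears and spend 8..2 verifying and appending.
    return 2 <= local_minute % 10 < 8

def detect_double_spend(txs):
    # Two sends of the same coin inside one 10-minute round: clients
    # compare notes and throw both out.
    seen, bad = {}, set()
    for tx in txs:
        key = (tx["coin"], tx["ts"] // 10)     # same coin, same round
        if key in seen:
            bad |= {seen[key], tx["id"]}
        else:
            seen[key] = tx["id"]
    return bad

txs = [{"id": 1, "coin": "c9", "ts": 14}, {"id": 2, "coin": "c9", "ts": 15}]
print(send_window_ok(4), accept_window_ok(9))  # True False
print(detect_double_spend(txs))                # {1, 2}: both sends rejected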

Wednesday, June 28, 2017

Zawy v2 difficulty algorithm

#!usr/bin/pseudo_perl
#
# Zawy v1b and v2 difficulty algorithms
# Simple averaging window with option to use dynamic window size.
# Cite as "Zawy v1b N=8" if N=8 is chosen and "Zawy v2 N>=8" if dynamic option is chosen
# Credit karbowanec and sumokoin for using modifications of Zawy v1 after their hard forks 
# to protect against attacks that were the result of Cryptonote's default difficulty algorithm. 
# And for motivating me to do more work where Zcash left off.  
#
# Core code with fluff and dynamic option removed:  (TS=timestamps)
# TSL=10 if N>11 else TSL = N-2; # stops miner w/ 50% forward stamps from lowering  D>20%.
#  current_TS=previous_TS + TSL*T if current_TS > previous_TS + TSL*T;
#  current_TS=previous_TS - (TSL-1)*T if current_TS < previous_TS - (TSL-1)*T;
#  next_D = sum(last N Ds) *T / [max(last N TSs) - min(last N TSs)] / (1+ln(2)/N)
# next_D = 2*previous_D  if next_D/previous_D > 2;
# next_D = 0.5*previous_D  if next_D/previous_D < 0.5;
#
# Changes:
# A preference for low N and letting difficulty change rapidly for hash-attack protection. 
# Included option for dynamic averaging window (probably inferior to simple low N).
# Includes timestamp limit for timestamp manipulation/error protection. 
# Added an adjustment factor to next_D that is important at low N: 1/(1+ln(2)/N). 
# This is due to the median of a Poisson being ln(2) of the mean.
# A rejection of medians which do not help and cause error via lack of resolution, 
#  including the "bad timestamp" excuse for using it. 
# Rejected dynamic modification to maxInc, maxDec, and TS limits based on recent history
# of D (as a way to let D drop after an attack). It either leaves a security hole or does 
# not have an effect.  Avg(solvetime) is still > TargetInterval if there is a hash attack but 
# I can't find a solution that does not have equally bad side effects.  
#
# Miners/pools should be asked to keep their timestamps accurate or it will help 
# attackers and block release will be slower.
# See verbose text at link below for explanations (if this is not verbose enough)
# https://github.com/seredat/karbowanec/commit/231db5270acb2e673a641a1800be910ce345668a#commitcomment-22615466

# Begin setting constants for this coin

T = 240;   # Target interval
MinimumHashAttackDuration = 8; # Sets N. See text below.
timestamps_are_provably_correct = "no";  #  "no" if miners or pools are assigning timestamps
employ_variable_averaging_window = "yes"; # see paragraph below

# End setting constants for this coin

# Modifying the logic below this point is not recommended.
# Trying to fix something usually breaks something else. 

# Set averaging window based on MinimumHashAttackDuration. 
# N=17 is working in several coins, but it still allows some large on-off mining to rapidly
# "steal" blocks at low difficulty for N/2, leaving constant miners with higher D for N, delaying
# subsequent blocks. N=12 is low but not unwisely low. May cause 3x delays post attack verses 15x for N=17.
# N=8 might be best for small coins still having "hash attack" trouble at N=12. 
# N=8 has only 47% more >4*T solvetimes than N=30.  
# Even 4 can work, even with timestamp errors, if the rise & fall in D is 
# symmetrically limited to 2x & 0.5x per block. 
# There is a desire to have low N because for hash attacks with
#  off time >= N and on time P <= N, I have:
# blocks stolen at low D = P x [1 - (1-X)/2 - P/(2N)]
# Notice how low N is the only way to reduce attack profit. Stating attack length as a fraction F of N (P = NF):
# blocks stolen at low D = NF x [1 - (1-X)/2 - F/2]

N1=int(MinimumHashAttackDuration);
if ( N1 < 6) { N1 = 6; } # due to the way I have TSL, there's more risk to <6. 

# Variable averaging window option:
# It will be smoother most of the time, but takes ~N/4 blocks longer to respond and recover from 
#  sudden hash changes. Choosing the smallest N instead of this option is probably best unless
# you have a strong desire for a smoother difficulty when HR is stable.  
# Precise technical description: 
# Trigger to a lower N if a likely change in HR has occurred.  Checks all possible windows 
# from N1 to 2*N1, linearly decreasing the likeliness from 95% at N1 to 68% at 2*N1 and 
# resets window to the lowest N that triggers. After keeping that N window for N blocks
# to let strange events flush out it raises N by 1 each block until another trigger occurs. 
# It can easily reach 4N as the window if HR is stable.
# This option seems to increase solvetimes by another (1+ln(2)/N) factor, which is not
# in this pseudocode for simplicity.

Smax=2; Smin=1; # STDevs range away from mean that will cause a trigger to lower N.

# TS limit to protect against timestamp manipulation & errors. 
if (timestamps_are_provably_correct == "yes" ) { TSL= 10; }   # similar to bitcoin
# next line stops miner w/ 50% forward stamps from lowering  D>20% if N1 is low. 
# Steady-state D from a miner with fraction X < 50% of network hash who always
# forward-timestamps to the max: SS_D = correct_D x [1 - (1 - 1/(1+TSL/N1)) * X]
# Miner with X>50% can drop D to zero w/ forward timestamps at the following rate:
# next D=previous D x  [1/(1+TSL/N1)]^(2X-1)

else { TSL=10 if N1>12 else TSL = N1-1; }

# The following are fail-safes for low N when timestamps have errors.
# Bringing them closer to 1.0 as in other diff algos reduces hash attack protection. 
#  Not letting them be symmetrical may have problems. Example:
# Using maxDec < 1/maxInc allows +6xT timestamp manipulation to drop D faster than -5xT
# subsequent corrections from honest timestamp can bring it back up.
# Bringing them closer to 1 is similar to increasing N and narrowing the timestamp limits,
# but these values should be far enough away from 1 to let low N & TS limits do their job.

maxInc=  2; # max Diff increase per block 
maxDec= 1/maxInc;  # retains symmetry to prevent hacks and keep correct avg solvetime

# End setting of constants by the algorithm.

# definition: TS=timestamp

#  Begin actual algorithm

# Protect against TS errors by limiting how much current TS can differ from previous TS.
# This potentially slows how fast D can lower after a hash attack. The -1 is for complex reasons.

current_TS=previous_TS + TSL*T if current_TS > previous_TS + TSL*T;
current_TS=previous_TS - (TSL-1)*T if current_TS < previous_TS - (TSL-1)*T;

if (employ_variable_averaging_window == "yes") {
     for (I=N1 to N) { # check every window that is smaller than current window.
        # create linear function: STDev decreases as I (aka N) increases to N
         STDevs = Smax-(Smax-Smin)/(2*N1 - N1)*(I-N1); 
         NE = (max(last I timestamps) - min(last I timestamps)) / T; # expected N for this time range
         if ( abs(I - NE) > STDevs*NE**0.5 ) { N=I;  wait=I; } # This is the core statistics. It's correct.
     }
}
else { N=N1; } 

next_D = sum(last N Ds) *T / [max(last N TSs) - min(last N TSs)] / (1+ln(2)/N); 
# the above is same as the following. Credit karbowanec coin for sum & max-min simplification
#  next_D = avg(last N Ds) * T / avg(last N solvetimes) / (1+ln(2)/N)

next_D = 1.2*avg(last N Ds) if next_D<=0;  # do not let it go negative.

# Increase size of N averaging window by 1 per block if it has been >=N blocks
# since the last trigger to N. This flushes out statistical accidents and < N attacks. 
if (employ_variable_averaging_window == "yes") { 
   if (wait > 0)  { wait=wait-1; } # do not increase N yet
   else {  N=N+1;  }  # resume increasing N every block
}

# Do not let D rise and fall too much as a security precaution
next_D = maxInc*previous_D  if next_D/previous_D > maxInc;
next_D = maxDec*previous_D if next_D/previous_D < maxDec;
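For clarity, here is the dynamic-window trigger from the loop above isolated as a plain Python function (my transcription; the "wait" bookkeeping is left out):

def retarget_window(stamps, T, N1, N):
    # Scan candidate windows from N1 up to the current N; if the block
    # count I differs from the count NE expected from the elapsed time
    # by more than the allowed number of standard deviations (linearly
    # relaxed from Smax at N1 to Smin at 2*N1), shrink the window to I.
    Smax, Smin = 2.0, 1.0
    for I in range(N1, N + 1):
        stdevs = Smax - (Smax - Smin) / N1 * (I - N1)
        window = stamps[-I:]
        NE = (max(window) - min(window)) / T    # blocks expected from time span
        if abs(I - NE) > stdevs * NE ** 0.5:    # Poisson-ish significance test
            return I                            # trigger: lower N to I
    return N                                    # no trigger: keep current N

stamps = [i * 240 for i in range(30)]           # perfectly spaced solvetimes
print(retarget_window(stamps, T=240, N1=6, N=12))  # no trigger: stays at 12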

Argument that low N is best in difficulty algorithms and why dynamic averaging window is not a benefit

I can't recommend a switch from v1 to v2 (static N to dynamic N). The smoothness gained by the higher N is not much: surprisingly, the std dev of solvetimes increases only 5% from N=30 to N=8. The std dev of D goes from about 0.18xD for N=30 to about 0.45xD for N=8 (for N=8 this means 97.5% of D values are less than 1 + 1.96x0.45, about 2 times what they should be). Long story short (due to the Poisson median being 0.693 of the average): going from N=30 to N=8 means only a 47% increase in >4xT solvetimes. The dynamic window does not capture this benefit: those >4xT solvetimes are exactly the statistically unlikely events that trigger the dynamic window back to a lower N, canceling the primary benefit of it rising back up to large N. It looks a lot smoother and nicer most of the time when hash rate is constant, but the painful small-N events are not reduced.
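Those std dev figures are easy to check with a Monte Carlo under assumptions of my own (constant hashrate, exponential solvetimes, the simple-average algorithm above):

import random, statistics

def d_stddev(N, blocks=200000, T=240, HR=1000.0):
    # Constant hashrate; retarget by the simple average of the last N
    # difficulties and solvetimes with the (1+0.693/N) correction.
    D = HR * T
    win_t, win_D, Ds = [], [], []
    for _ in range(blocks):
        t = random.expovariate(HR / D)          # solvetime at difficulty D
        win_t.append(t); win_D.append(D)
        if len(win_t) > N:
            win_t.pop(0); win_D.pop(0)
        if len(win_t) == N:
            D = sum(win_D) * T / sum(win_t) / (1 + 0.693 / N)
            Ds.append(D / (HR * T))             # normalize: 1.0 = ideal D
    return statistics.pstdev(Ds)

for N in (8, 30):
    print(N, round(d_stddev(N), 3))  # lands near the 0.45 and 0.18 quoted above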

Tuesday, June 27, 2017

Cryptocurrency difficulty algorithm postulates


Here are my "difficulty algorithm postulates" I want people to consider before creating or changing a difficulty algorithm.
  1. For a given hashrate with gentle variation, the simple average below is the best algorithm: 
    • Next D = avg(past N Ds) x TargetInterval / avg(past N solve times)
    • For whatever reason (probably because the median of a Poisson is ln(2) ≈ 0.7 of the mean, as in Zawy v1b above), it needs an adjustment factor for low N to keep solve time on track and make D more accurate:  Next D x 1/(1+0.7/N).  
    • The N used for averaging past D must be set to the N used for past solve times.
    • Using median is not nearly as good as using average, and there is no benefit to using median.
  2. A faster response to hashrate changes will come at a cost in solve time stability. This is not a bad thing. Use the lowest N you can tolerate to get the fastest response. Low N causes large non-attack solve time variation. Consider down to N=8 if hash-attacks are a problem.
  3. Limiting the rise and fall in the difficulty per block is similar to increasing N, but is much less accurate.
    • I place limits on the rise and fall to be equal to what I think possible only as a security measure. 
    • Placing limits on the rise and fall to block an event you do not want is denying the truth of the observation that you have asked the average to report.
  4. Enforcing asymmetric limits on difficulty and timestamp changes is risky.
    • There is a temptation to allow faster decreases than increases in the difficulty per block (that results from the average above) in order to get back to normal after an attack. This may help keep the block emission rate on schedule and reduce normal miners' losses.  But it also enables attacks to resume more quickly, which might exactly negate the two benefits. Avoid this more seriously if the attacker is intelligent.  If timestamps are assigned by miners, forward-stamping (combined with this asymmetry) will make D begin artificially lower in the next attack, amplifying the original problem instead of helping it. But if the allowed increase and decrease in D are symmetrical, then a subsequent accurate timestamp that negates the previous bad timestamp will be able to get D back to its proper value.
    • Conversely, there is a temptation to allow faster increases than decreases in difficulty per block in order to dissuade on-off hash attacks.  This directly slows the block emission rate. It potentially increases normal miners' losses if it does not actually dissuade attacks. Avoid this more seriously if the attacker is dumb. It better enables a malicious attacker who is not interested in profit to drive the D up, or to drive it up for the purpose of causing future oscillations if the diff algo is unwisely advanced and complex.
    • Limiting the amount the timestamp can be ahead of time more than it can be negative is like allowing D to increase faster than decrease, with the same type of side effects.
    • Limiting the amount the timestamp can be negative is like allowing D to increase faster than it can decrease, with the same type of side effects.
    • Symmetrical in the above is not exactly linear because the median of a Poisson with mean TargetInterval (T) appears to be ln(2) x T = 0.693 x T, but I have not addressed this.
    • Timestamp limits: I believe the forward timestamp should be limited to +6x and -5x the previous timestamp instead of my previous statements of +6x and -6x, because the "expected" timestamp is 1x, so +6x and -4x is mathematically required for symmetry.  But I want a -5x limit, in violation of perfect symmetry, out of more fear of greedy +6x occurring than of -4x accidents or maliciousness. Reminder: two -5x in a row could cause a negative difficulty if N=8, so there needs to be protection against a negative difficulty. 
  5. Despite 3 and 4, there may be a way to use them to enable D to return to normal more quickly in post-attack.  This is really needed because the avg solve time increases (delaying coin release) when there are a lot of large on-off instances because even with low N, D is getting stuck high in post-attack.
  6. There is no way to stop >50% miners from using the timestamp to make difficulty = 0.  This assumes there is not a trusted third party enforcing a clock (like ETH), which would violate Szabo's and Satoshi's mandates. 
    • Might is right. 51% power is truth.
    • 51% (or the trusted 3rd party) controls the clock which means they control coin emission rate.
    • Bitcoin uses > 50% consensus to certify not only single-spend transactions but also the time.
    • Fear of a hard fork may be what prevents miners from doing this overtly.
  7. Difficulty algorithms should not have momentum. Predictive algorithms that look at the slope of recent changes in D to estimate a future D are vulnerable to large on-off miners (and possibly even an accidental and unconscious consortium of miners in search of personal profit) who can force the algorithm into oscillations, turning on when D is low and starting to rise, and off before it reaches a peak. This is the Derivative part in PID controllers. PI controllers such as the average of the past are safer (a toy demonstration follows this list).
  8. Algorithms that try to protect against specific attack behavior are inherently vulnerable. 
    • It should be assumed that protection against specific attacks is automatically leaving an unexpected hole.
    • If an opponent can see the strategy you've employed that assumes something beyond your scientific observations, he can change his plan of attack but you can't change your defense.
    • For example, if you choose a fixed N based on how long you expect attacks to last, the attacker may make the attacks shorter but more frequent. 
  9. Miners acting in their own best interests weed out weak coins, are the mothers of invention, and/or are encouraging adoption of a single coin. Each of these might be "good" instead of an "attack".
    • Item 6 may mean all coins that are not the largest for a specific type of hardware are destined for a limitation on their size (if not outright failure) that is more brutal than Zipf's law.  We currently see something like Zipf's law in cryptocurrency market caps, but if item 6 is correct, it might become 1/Rank^2 or worse in market caps instead of 1/Rank.  This enforces Satoshi's original vision that the largest coin's "might is right" will make it less subject to attack than its clones.
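Here is the toy demonstration promised in item 7. Solvetimes are set exactly to D/hashrate (noise-free) so the only thing being compared is the retargeting rule, and the momentum coefficient is my own illustrative choice. The plain average walks up to the new equilibrium and stops; the slope-extrapolating variant overshoots and rings, which is the kind of oscillation an on-off miner can exploit:

def simulate(momentum, blocks=120, T=240, N=8):
    # Hashrate steps from 1000 to 3000 at block 50. Solvetimes are
    # noise-free (t = D/HR) to isolate the retargeting behavior.
    HRs = [1000.0] * 50 + [3000.0] * (blocks - 50)
    D = 1000.0 * T
    D_hist, t_hist, traj = [D] * N, [float(T)] * N, []
    for b in range(blocks):
        t_hist.append(D / HRs[b]); t_hist.pop(0)
        D_hist.append(D);          D_hist.pop(0)
        D = sum(D_hist) * T / sum(t_hist)      # PI-like simple average
        if momentum:                           # D-term: extrapolate slope of D
            D += (N / 2) * (D_hist[-1] - D_hist[0]) / (N - 1)
        traj.append(D / (3000.0 * T))          # 1.0 = new correct difficulty
    return round(max(traj), 3), round(traj[-1], 3)

print("plain average (peak, final):", simulate(False))  # never exceeds 1.0
print("with momentum (peak, final):", simulate(True))   # overshoots, then rings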