Tuesday, October 4, 2016

Resveratrol + DMSO for age spots, scars, and freckles

The following results were obtained by mixing Resveratrol (from a Japanese knotweed concentrate) with DMSO and applying it for 10 days, about twice a day. The scar on top of my head was 10 years old, from scraping my scalp off on an overhanging ledge (in 2006). It had gotten worse than in the 2013 picture, and then I bumped it recently and it was swelling up with a scab that was slow in healing (see picture), so I applied DMSO with resveratrol. The Feb pic was actually after 5 treatments, and I stopped a few days later. I did not think much else about it, but you can see from the March picture that it finished healing on its own. It appears the DMSO/resveratrol kills the cells and stains them, and then it takes a month or more to clear out the dead, stained material.

The mole/age spot/whatever (a keratosis, since unlike age spots it was a little bumpy?) on the side of the head started about 6 years ago and was beginning to look a little frightening, raised about 0.5 mm above the skin. I have a close-up of it that shows how remarkable this is.

Since I wrote the above I've tried it on several other moles and spots. It does not always work, at least not after about 15 days of twice-daily treatments. But I had some serious sun damage on my shin from old exposure, and it had a remarkable effect there too. I have a picture of that, plus the moles that are probably better (though it is hard to say for certain). The spots on my shoulders have returned a little after a year, but they are still a lot better.

Thursday, September 22, 2016

Cryptocoins equally to all people w/o 3rd party OR transaction fee feedback to create constant value coin

Maybe there is a way to issue a fixed quantity of coin to all people on Earth without a 3rd party.

Your biomeasures are different kinds of "hashes" of your genes (and environment and chance). The following might work because single genes affect multiple systems. Given the right set of biomeasures it may not be feasible to generate a valid survivable human DNA sequence. One biomeasure constrains DNA one way, and another in another way, and so on. But given the biomeasures and DNA sequence the blockchain might prove a given pairing is valid. People would use the set of biomeasures and their DNA to apply to the blockchain for coins and a private key. DNA and private key would generate wallet addresses.

The key is that each gene usually affects multiple biometric measures, maybe in the same way a prime can be used to generate many different public keys when combined with other primes. Or maybe I should view the biometric measures as a hash of the genes. Either way, there seems to be a 1-way function that can be exploited. You can get biometrics from genes, but maybe not valid genes from biometrics.

Genes causing the expression of biometrics (genotype creates phenotype) is such a messy business (a huge and messy kind of hashing, not subject to strict mathematics and influenced by environment and randomness) that traditional cryptography might not be usable. At first it might require a world-class neural net to get started; then the blockchain would have to take over as the neural net. The neural net would take all available DNA and biometric data and find all the patterns, backwards and forwards (genes -> biometrics, biometrics -> genes), that it can. It would attempt to predict viable DNA from biometrics and vice versa. The vice versa (determining biometrics from genes) is relatively easy, but the field is in its infancy. A lot of medical research is doing this, because having a disease is a biometric result of the genes. But getting DNA from biometrics could be made very difficult if the right biometrics are chosen. A neural net could predict viable biometrics from DNA, but my thesis is that it could be really difficult to create viable DNA from a correctly chosen set of measured biometrics. The neural net's job is to discover the best biometrics to use (the ones it can't crack), and to constantly try to crack them. Successful cracks are rewarded. Along the way it is discovering what genes do, as the preliminary step to cracking (it has to get its list of "primes"?).
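To make the application flow concrete, here is a toy sketch. Everything in it is a hypothetical placeholder, and the hard part, checking that a DNA sequence is consistent with a set of biomeasures, is stubbed out:

```python
# Toy sketch of the apply-for-coins flow described above. All names are
# hypothetical placeholders; the validator is stubbed out.

import hashlib
import secrets

def biometrics_consistent_with(dna: bytes, biomeasures: bytes) -> bool:
    # Stand-in for the neural-net/blockchain validator; a real version
    # would treat each biomeasure as an independent constraint on the DNA.
    return True

def apply_for_coins(dna: bytes, biomeasures: bytes):
    # Applicants submit DNA plus biomeasures; a valid pairing yields a
    # private key, and DNA + private key generate the wallet address.
    if not biometrics_consistent_with(dna, biomeasures):
        return None
    private_key = secrets.token_bytes(32)
    address = hashlib.sha256(dna + private_key).hexdigest()
    return private_key, address
```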

Since population growth is, I think, around 2% and slowing, the inflation problem should be small, and even a benefit as I stated before, in contradiction to the usual cryptocoin beliefs concerning fixed-quantity coins.

It seems I am requiring people to apply for their coins using their biometric and DNA data before others can get their DNA and generate viable biometrics from it.

BTW, a 3rd party is always present if the code can be changed at any time after launch. Developers being guided by users is the same as government being guided by voters. Lobbies like the rich or bankers (PoS stakeholders and miners) that subvert the users' voting process are the same system we have for the dollar. Observational evidence for this viewpoint: we seek ethics in the developers in the same way we seek ethics in government leaders.

There is another way to achieve a constant-value coin that is a lot less difficult than using DNA, but it does not retain the virtue of blocking machines out of human economics. **Let the market-determined transaction fees per coin determine the coin release rate.** If the fee rises, there is a shortage of nodes compared to daily coin transaction volume. Additional fees per byte and a base fee per transaction would be needed, but not used to determine the coin release rate. This uses the velocity-of-money theory. So the developers are not allowed (and not required) to decide the final quantity or release schedule of the coin; the market does. A PID controller would take the transaction fee per coin as the input and output the coins per block. If the fees drop too much, it indicates the coin is not being used much, and coins per block can go to zero, keeping coin quantity constant. Miners would stop mining and nodes would live off the base fee for transactions. Another controller would take the number of nodes per transaction as the input and drop the base fee and/or per-byte fee if the ratio of nodes to transactions got unnecessarily high, which keeps the coin competitive and lean without oversight. The more feedback controllers used intelligently, the more intelligent the coin (or anything else) is.
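A minimal sketch of that first controller, assuming invented gains, a made-up fee set point, and arbitrary units (none of these numbers come from the post):

```python
# Minimal sketch: a PID controller takes the market transaction fee per
# coin as input and emits coins per block as output. Gains, set point,
# and units are illustrative assumptions.

class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measured, dt=1.0):
        # Positive error (fee above target) means nodes/miners are scarce,
        # so issuance should rise; negative error pushes issuance down.
        error = measured - self.setpoint
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

controller = PID(kp=50.0, ki=1.0, kd=5.0, setpoint=0.001)  # invented numbers

def coins_per_block(observed_fee_per_coin):
    # Clamp at zero: if fees indicate the coin is not being used much,
    # issuance stops and total coin quantity stays constant.
    return max(0.0, controller.update(observed_fee_per_coin))

print(coins_per_block(0.002))  # fee above target => positive issuance
```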

I am not saying the above is perfectly correct or complete. I wanted to show that some idea like it could create the cryptocurrency holy grail: a constant-value coin not based on perception, opinion, miners, or developers.

Intelligent direction (i.e. controller feedback) of permission (i.e. legal tender, aka currency) to use available resources is the basis of all intelligence. Be it glucose and molecules in the brain, energy and matter in economics, or CPU time (kinetic joules = expenses) and RAM/HDD space (potential joules = initial investment) in computing, the intelligent direction of the currency directs the energy and matter for personal profit (growth based on more and more energy and matter coming under control of the movement of the currency). Democracy uses the feedback of votes to guide the taxes, which direct the energy and matter in government, which acts as a controller on the economy, which gives voters what they want.

The most intelligent cryptocoin will be a growing, spreading, changing A.I. of feedback controllers (smart contracts directing the coin) that enables the marketplace that falls under its control to be the most profitable and growing, so that the cryptocoin itself can be profitable and grow by riding (lightly) on its back, a symbiotic relation instead of a viral/cancerous one. The end goal is to congeal the matter on Earth into a more ordered form, releasing entropy to the universe. We are doing this by shifting from organic bonds to metal and metalloid bonds, removing oxygen from metals, silicon, and carbon so that we have greater control through lower entropy per kg of our economic machine. Earth is unusual because of the order injected by the Moon, which is also why we look for life on Titan and Io (geological disturbances are cyclic forces that inject order into thermodynamically stable systems).

The market itself is just a bunch of feedback going on between agents, under the rules of some governing coin (i.e. legal tender). So ideally, the feedback systems would probably be nested and complicated from bottom to top, so that the distinction between government and market is not clear, while the coin would be very clear. Separate "organs" of law (code) could easily have their own internal coins, but still be based on a system-wide coin. Maybe the highest-level coin describes the boundaries and definition of an entity. The highest I know of is energy (Gibbs free energy). Maybe there is some sort of negative entropy that could be higher. But a single coin and system without distinguishable "organs" should be the most efficient, like a highly compressed algorithm.

But for current work on cryptocurrencies, it seems 1 to 5 feedback measures should be the limit.

There is currently no feedback from the marketplace (other than the difficulty) to tell cryptocoins how the coins are to be issued in order to best benefit the market. The arbitrary nature of coin quantity, release schedule, and fees needs to be changed and connected to the coin's usage and computational power.
=====
Let the transaction fee per coin control the coins per block issued, and never let difficulty fall. Problem solved? A base fee per transaction and a fee per byte would also be needed. A standard PID controller on the transaction-fee "error signal" would be used. Difficulty can easily get too high, but there is no incentive for attacks to make it go high, because attackers can't profit on downturns. Large miners can't profit from random difficulty swings or manipulate them for profit. If difficulty is too high, miners will get out if fees are not high enough. But surviving this demonstrates the system is not a Ponzi scheme that will end when mining ends. A decrease in network hash rate might adjust the set point that the transaction-fee error signal needs. With the right feedback (checks and balances), developers would not be required (or allowed) to choose any aspect of coin issuance (not total quantity, schedule, coins/block, difficulty, or fees). The market should be able to dictate everything without anyone needing to explicitly vote except by their marketplace choices (miners getting in or out, and transaction fees). If the market for the coin starts to dry up (its fees were too high to sustain miners), then it merely shows a more efficient coin is taking its place, and it should dry up. But the quantity of the coin at that point is constant.
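A sketch of the never-let-difficulty-fall part, with a small upward drift term that is my own assumption (the fee-to-issuance side is the same PID idea sketched earlier):

```python
# Sketch of a difficulty ratchet: difficulty may rise when blocks come too
# fast, but it never drops below a slow steady rise. `drift` is invented.

def next_difficulty(prev_diff, target_interval, avg_solve_time, drift=1.0001):
    proposed = prev_diff * target_interval / avg_solve_time
    # Never fall: even if blocks come in slowly, hold the ratchet.
    return max(prev_diff * drift, proposed)

print(next_difficulty(1000.0, 150, 100))  # fast blocks: rises to 1500.0
print(next_difficulty(1000.0, 150, 300))  # slow blocks: holds at 1000.1
```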

Friday, September 9, 2016

Ideal difficulty algorithms for cryptocurrencies

A post to GitHub related to Monero and Zcash:

I've come to the conclusion that the best difficulty algorithm will be a simple rolling average:

next_diff = average(past N difficulties) x TargetInterval / average(past N solve times)

The shorter the averaging window, the more protection against attacks, but the more variation in solve times. This is unavoidable. There is a law written in stone: if difficulty is allowed to go down, you can have good protection or good solve times with a low standard deviation, but you can't have both. You have to choose how many blocks you want to "give away" by choosing the max time for, say, 10% of the block solves. A low block-window average gives higher protection but wider swings in solve times. You could use N=5 for great protection if it is OK to have a time to solve > 5x your target for 5% of the blocks. Once manipulators come in, you need to be prepared for 5x target 10% of the time. But such a short averaging window requires an accurate timestamp on blocks instead of miner-generated times. Without that, I would copy what Zcash is doing (an N=17 window average with a median instead of a mean for the solve times), except be sure not to use the 8% up and 16% down limits they are using, which I hope and suspect they drop before release. There is something weird with their method of getting the median that works better than the way I get the median, so use it; I guess it comes from Digishield v3. But if you get an accurate timestamp, use the mean.
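A minimal sketch of this rolling average, with a `use_median` switch to mimic the Zcash-style variant (the function and parameter names are my own, not Zcash's):

```python
# Rolling-average difficulty rule from the formula above:
# next_diff = avg(past N diffs) * target_interval / avg(past N solve times)

from statistics import mean, median

def next_difficulty(past_diffs, past_solve_times, target_interval,
                    use_median=False):
    # use_median mimics the Zcash/Digishield-style median of solve times.
    avg_time = median(past_solve_times) if use_median else mean(past_solve_times)
    return mean(past_diffs) * target_interval / avg_time

# N=17 window with a 150-second (2.5-minute) target, as discussed above:
diffs = [1000.0] * 17
times = [140, 160, 150, 155, 145, 150, 148, 152, 150,
         149, 151, 150, 150, 150, 150, 150, 150]
print(next_difficulty(diffs, times, 150, use_median=True))
```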

And low-N averages have accidental spikes in difficulty and solve times. Miners can choose to come in immediately after those, which makes the next difficulty and solve-time spike even higher, so they can put it into oscillation for profit. But this might be a problem for all windows, even of larger N.

The biggest protection against attacks might be to discover the methods and encourage and enable everyone to use them. That tends to block the profits of cheaters by actually leveling out the swings, helping the constant-on miners. For example, a time warp attack is less and less useful if you initiate it and 10 people come in to take it away, splitting the profit. So maybe you should give out the code that enables everyone to do it. It might then become useless to everyone. Or you try to pick a bottom, but then someone comes in earlier so your bottom does not occur, and so on, until there is no bottom.

The only way I have found to get perfect protection against attackers (and fairness) and to have a perfect release schedule is to never let the difficulty drop but have it follow a slow steady rise, use a valid timestamp on solved blocks, and pay miners in proportion (Tim Olson's idea) to their solve time relative to the average time that is expected for the current difficulty setting. If a miner solves fast, he gets paid proportionally less. If he solves slow, he gets paid more. The coin release schedule stays as perfect as your clock, and there's zero profit from manipulations. The problem with a clock is that it is a third party. But it is not a problem if you're already using a subtle 3rd party going under the name of "trusted peers" who sync to a universal time clock. (The trusted timestamp also prevents timewarp attacks. ETH uses one.)
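A sketch of that payout rule, with invented numbers:

```python
# Payout rule as I understand Tim Olson's idea: reward scales with solve
# time relative to the expected time at the current difficulty, so a fast
# solve pays less, a slow solve pays more, and expected revenue per unit
# of hash power stays flat. base_reward and times are invented.

def block_reward(base_reward, solve_time, expected_time):
    return base_reward * (solve_time / expected_time)

print(block_reward(12.5, 75, 150))   # solved in half the time: 6.25
print(block_reward(12.5, 300, 150))  # took twice as long: 25.0
```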

This has very important stable, real-value implications. For example, miners are paid PER BLOCK for the amount of electricity needed, getting closer to the ideal of value = joules, rather than being paid based on the average electricity expense per block expected. This requires abandoning the idea that blocks must be solved within a certain time frame. If the coin can survive post-mining on fees, then it should survive solve delays in the exact same manner, proving ahead of time that it can survive on fees. But it may not result in substantial delays if everything else is done well.

This probably changes too much in bitcoin's core, and there are likely good reasons Satoshi did not do it. But it's best to start with a known ideal and work backwards. In this case it means that every time you let difficulty fall, you are harming constant-on miners relative to other types of miners.

Sunday, August 28, 2016

Xanthohumol in beer hops is a potent nilotinib-like tyrosine kinase inhibitor

Saturday, August 27, 2016

GPUs vs CPUs in Zcash

An 8 GB GPU could run 2.5x more threads at 8x more bandwidth by simply porting the CPU code (no special parallelizing needed). The problem with the Equihash paper is that it referenced a 2011 GPU and did not point out that modern GPUs have a lot more on-board RAM. So 2.5 x 8 = 20x is an upper limit. But the cores operate at 1/3 the clock speed of CPUs, and my experiments in acquiring blocks on the testnet indicate core caching and/or clock speed on the CPU matters a lot. Either way, it indicates less than 2.5 x 8, maybe an 8x benefit as a minimum. The important point is that this minimum is double the Equihash paper's figure, and it does not require any of the special programming that was required for the 4x claim in the Equihash paper. The paper referenced a 2011 CPU for the comparison, so I did not think there was a problem in looking at an old GPU, as both have advanced. So the problem (if you wanted CPUs instead of GPUs) is that Zcash has chosen parameters that are good for 2011 but not for 2016. I am not being critical, as I did not realize the implications myself until now. Even without the GUI, I could not get 4 threads to run well on 4 GB, and 6 GB seemed to be even worse. So 8 GB is the demand. Since 8 GB is the demand, 750 MB/thread is not good. 1.2 GB should have been the requirement, in order to allow Ubuntu and to hold back GPUs.
=====
update to the above:
==========
The Equihash paper was from 2016, but its GPU vs CPU data was from 2011. I wanted nothing more than for CPUs to win, but an 8 GB GPU should be 10x better than a CPU at launch if GPU miners are no better than the stock miner. The Equihash paper assumed the cryptocurrency developers would choose a RAM requirement that is higher than on-board GPU RAM. But with new GPUs, a GPU coder can copy the stock miner and run it on 10 cores to get 2.5x more threads than a 4-core CPU at 20x the bandwidth (a $400 GPU). It's not 20 x 2.5 = 50x faster than CPUs only because the GPU cores are so slow. The 4x statement in the Equihash paper has nothing to do with this: by assuming the coin's RAM requirement would be larger than the GPU RAM, they assumed advanced parallel programming would be needed to make use of the GPU's many cores. That is not the case. Zcash was not able to force larger RAM, so the Equihash paper is not relevant as far as GPUs are concerned. They might be able to make the RAM about 1200 MB per core if they go to 5-minute intervals. This would reduce the GPU advantage to 7.5x by my math above.
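The same arithmetic, restated as a back-of-envelope calculation (every ratio is an assumption quoted above, not a measurement):

```python
# Back-of-envelope restatement of the GPU-vs-CPU estimates above.

threads_ratio = 2.5   # ~10 usable GPU threads vs a 4-core CPU
bandwidth_8x  = 8.0   # the earlier bandwidth estimate
bandwidth_20x = 20.0  # the $400-GPU figure used in this update

print(threads_ratio * bandwidth_8x)    # 20x: the earlier upper limit
print(threads_ratio * bandwidth_20x)   # 50x: ceiling before the slow-core discount
```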

But we have not actually seen any $400 GPU results faster than a $200 desktop.

Thursday, August 11, 2016

Zcash and energy-expense in cryptocoins may not be a bad thing

I mentioned computers are using 4x more electricity when running Zcash. It may make GPUs less capable of competing. They are not able to access the external RAM directly, so they are less efficient, having to compute more per hash. The 4x speed of parallel code for GPUs of the future will come with at least a 2x higher energy cost.

From measurements on the testnet and my electricity use, if there are 50,000 PCs on the network, it will cost $1 per coin in electricity above 24-hour normal-use PC costs, if you have a 22 nm process CPU that is meant to be efficient (less than 3 GHz).

Although the high energy use is against what most people consider a "good" coin, it might be an inherent necessity if POW is inherently needed. The high energy use is key to making mining widely distributed. If the only thing determining the quantity of coin obtainable is the amount of energy purchased, then most people have equal access. Electricity rates can vary a lot compared to oil (e.g. islands and remote villages), but that affects a small portion of the world's population. If a government subsidizes cheap electricity by investment or by allowing more pollution, then the populace of that region has paid the price that local miners gain. If miners optimize code and have cheap electricity, they might get 5x more coin per dollar of expense compared to small miners.

If Zcash spreads to the populace, who do not have to buy equipment and do not even notice they have higher electrical costs, dedicated mining may not be feasible. This is the stated goal. This means the coin needs to be biased toward a general-purpose CPU, which means more electricity to stretch its general-purpose skills. Sorts seem very specific, but they bias everything towards a more general-purpose Turing machine. The entire basis of a Turing machine is reading and writing, and sorts need reading and writing in ways that are hard to optimize away.

The RAM of devices is not generally being wasted like CPU time is, so it might be better to be CPU-centric. But part of the path to the general-purpose CPU is a high RAM requirement, in order to block out non-general-purpose GPUs and ASICs.

So it's a coin that promotes generalized computing devices in everyone's hands, without taking away too much RAM, rather than wasting money on specific equipment for specific people (miners). This is a 2nd reason a higher electrical expense is not a bad idea: CPU time is being wasted more than RAM space. And note that RAM is not electricity-free. There is a very large initial electrical expense in creating the RAM, as indicated by its price. This indicates equal use of CPU and RAM may be better, as one is an ongoing time-based expense of joules and the other is a one-time capital "space-based" expense of joules. CPUs require joules per bit state change in time, and RAM requires a joules construction cost per bit of storage space in cm^3. Of course RAM has a state-change energy cost and a CPU has a construction cost, but those energy costs are smaller.

All economic schools have said following a basket of commodities is the best currency. Those favoring gold do so only because it is the best single commodity that has the right properties. It is also one of the most wasteful ways to use kinetic energy, which is key to its perceived value. A basket would require measurements and "paper" (bits on a computer). The cost of energy (like electricity) is the largest underlying factor in the cost of producing commodities. So currencies based on joules have been proposed as ideal. Zcash is a joules-based currency. Joules-as-value, both kinetic and potential, has a deep connection with computation and life (see the note at bottom).

There is a 4th "benefit" to a high electrical cost per coin, although all these points are connected: the coin should not sell for less than the cost to produce it, unless someone has given up on the coin and will accept a loss.

Zcash's goal is to be "democratic" in mining. The result is an ideal cryptocoin. POW should not be "might is right", but "distributed might is right". Otherwise, the ability of miners to lobby the governing coders becomes the wrong kind of might.

This is not to say an energy-intensive coin is best for society. A coin that is given based on how much a person helps society (such as SolarCoin) would be best. But that involves agreement on a definition of what is "best" (are solar cells really the best use of your money to be subsidized by giving you a coin?) and then measuring it, before the cryptography part can even begin. It is a type of fiat requiring a middleman (or at least a group of oracles doing an agreed-upon measurement, governed by smart contracts that define the rules for the distribution and use of a specific coin). The whole reason fiat replaced gold is that governments are able to print it and distribute it evenly based on achieving goals that are ostensibly best for that society. Coins distributed based on POW that is not connected with the betterment of society are not best, unless the government is not acting in the best interest of people and/or anarchy (e.g., hyperinflation) is near.

Note: To be more correct, it is the joules as measured by Gibbs free energy that are essential to life. Schrodinger even updated his "What Is Life?" book to point out that when he said "negative entropy" was the key to life, he really meant free energy. Gibbs F.E. = U - TS, where U is internal energy and TS is temperature x entropy. In terms of Zcash, U = the RAM+CPU capital one-shot "energy" expense and TS = the CPU's ongoing operating energy expense. The first is energy stored in spatial RAM and CPU structures, and the second is energy spent in time. CPU computations come at an energy cost of TS/eff, where eff is the efficiency of the device (this does not include algorithm efficiency). Per N bits that change state irreversibly, the number of joules expended is T x S / eff, where S = kB x ln(2) x N for a given algorithm (see Landauer's limit relating bits to physical entropy).
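A worked example of that last formula, with a room-temperature value and an efficiency parameter that are my own illustrative assumptions:

```python
# Landauer's limit from the note above: E = T * S / eff with
# S = kB * ln(2) * N for N irreversible bit changes.

import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def landauer_joules(n_bits, temp_kelvin=300.0, eff=1.0):
    entropy = kB * math.log(2) * n_bits   # S = kB * ln(2) * N
    return temp_kelvin * entropy / eff    # E = T * S / eff

# Irreversibly flipping 8e9 bits (1 GB) at room temperature:
print(landauer_joules(8e9))  # ~2.3e-11 J, a floor far below real hardware
```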

Thursday, August 4, 2016

Note to self: cryptocurrency difficulty settings

I estimated above that the difficulty setting's averaging of the past, in order to determine if coin production is on track or off track, should be 3 x N / SQRT(N). It's almost a perfect curve fit for a coin's expected deviation from a Poisson distribution, allowing for up to 3.75 standard deviations from the Poisson distribution's expected mean. This high level of permissiveness allows for network hiccups away from the mean that someone could profit from, if they can cost-effectively shift hashing energy around to different time slots. They'll be able to detect a deviation with decent probability each hour (the N=30 rule) before the code decides on a difficulty change.
Poisson distribution with 3.75 std devs from the mean:
3.75 x 2 x e^(-N) x N^(N+1) / N! =~ 3 x N / SQRT(N)
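A quick numeric check of this curve fit:

```python
# The Poisson expression on the left tracks 3*N/sqrt(N) closely
# across window sizes.

import math

for N in (10, 30, 100):
    exact  = 3.75 * 2 * math.exp(-N) * N**(N + 1) / math.factorial(N)
    approx = 3 * N / math.sqrt(N)
    print(N, round(exact, 2), round(approx, 2))
# e.g. N=30 gives exact ~16.4 vs approx ~16.43
```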

If you want to minimize profit from hiccups, you could remove the 3.75 and allow for 1 std dev from the mean. The drawback is that this means 1/3 of the time you will be intervening with a change in difficulty where none was statistically present, instead of ~0.01% of the time with 3.75. 3.75 is too permissive.

With the current method, the algorithm appears to be intervening too much, with too-large changes that come too often. It seems like a nosy government regulator, acting beyond what the statistics require. It is donating easy coins to slow transaction periods at the expense of the small business owner (miner), to the benefit of the more-talented, conglomerated smart businesses selling a service (shifting hash energy) to the slow transaction periods. I would use 2 std devs instead of 3.75 as a nod to statistical tradition. The current code is using something like 0.1. [edit correction: after N=6 intervals]

It's not merely hiccups it is trying to fix; the distribution itself allows for sparse periods of transactions. The Poisson distribution says the probability of k=1 occurrence in N=6 intervals (15 minutes) is (NL)^k / k! / e^(NL) = N/e^N = 1.5%, where L = the average rate of 1 per 2.5-minute interval, so NL = the average number of occurrences in N x 2.5-minute intervals. So there will be an average wait of 1/0.015 = 67 intervals for it to take 15 minutes. It would take 30 minutes once every 23 days.
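Verifying that arithmetic:

```python
# Poisson probability of exactly one block in N consecutive 2.5-minute
# intervals, with an average of one block per interval (so NL = N).

import math

def p_one_block(n):
    return n * math.exp(-n)   # (NL)^1 / 1! / e^(NL) with NL = n

p15 = p_one_block(6)    # 15 minutes: ~0.0149, i.e. ~1.5%
p30 = p_one_block(12)   # 30 minutes: ~7.4e-5
print(1 / p15)                        # ~67 intervals of average wait
print((1 / p30) * 2.5 / (60 * 24))    # ~23 days between 30-minute waits
```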