Tuesday, March 4, 2014

Optimal control by direct numerical solution

Someday I want to learn this stuff ( http://www.schwartz-home.com/RIOTS/ and http://www.sbsi-sol-optimize.com/index.htm and https://en.wikipedia.org/wiki/Bellman_equation ) and see how it might be combined with compression/pattern-recognition techniques to make A.I. A smart machine seems to need the following:

1) desire (setpoint)
2) environment
3) modeling/predicting/compressing 2)
4) identifying which inputs (or levers inside the model) can be controlled for maximizing 1)
5) controlling the identified inputs or levers for maximizing 1)

Item 5) is called "optimal control", the goal of the links above.  Item 4) needs to be done before 5) can be applied.  This seems to be what's missing in automated A.I. that seeks to maximize profit from a general environment.  It seems 3) and 5) have a lot of research behind them.  Maybe evolution (competition for maximal energy sources) has preprogrammed animals to possess 4), but humans seem good at taking the generalization a step higher.  The modeling part, if HTM/CLA methods are correct, needs nested prediction competitors arranged hierarchically.  It's interesting that SNOPT in the 2nd link above uses sparse techniques to solve non-linear optimization problems, that the Bellman method solves the problem by working backwards from the endpoint, and that the cost function to be minimized is a Lagrangian, with Hamilton's name also attached (Bellman's equation is the discrete version of the Hamilton-Jacobi approach).
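The backwards-from-the-endpoint idea in Bellman's equation can be sketched in a few lines. This is a toy illustration of my own (the positions, horizon, and reward function are invented, not taken from the links above): a walker on a short line of positions earns a reward only while at the right end, and the best achievable value of every starting position falls out by stepping backwards from the final time step.

```python
# Backward (Bellman) value iteration on a hypothetical finite-horizon problem.
# A walker on positions 0..4 moves left or right each step; the reward is 1
# only while standing at position 4.  We solve from the endpoint backwards.

N_POS = 5
HORIZON = 4
ACTIONS = [-1, +1]  # move left or move right

def reward(pos):
    return 1.0 if pos == N_POS - 1 else 0.0

# V[t][p] = best total reward achievable from position p at time t
V = [[0.0] * N_POS for _ in range(HORIZON + 1)]
for p in range(N_POS):
    V[HORIZON][p] = reward(p)  # terminal values at the endpoint in time

# Work backwards from the final time step (the Bellman recursion)
for t in range(HORIZON - 1, -1, -1):
    for p in range(N_POS):
        best = float("-inf")
        for a in ACTIONS:
            nxt = min(max(p + a, 0), N_POS - 1)  # clamp to the grid
            best = max(best, reward(p) + V[t + 1][nxt])
        V[t][p] = best

print(V[0])  # value of each starting position at time 0
```

Starting closer to the rewarded endpoint is worth more, which the backward sweep discovers without ever simulating forward.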

Saturday, March 1, 2014

Brain to CPU comparison

I want to compare the energy efficiency of today's computers to brains.  Looking at today's CPUs, I see a 22 nm process for the Intel Haswell processor: 1.4 billion transistors in a 177 mm^2 die at 35 W max and 3 GHz, with 4 transistors needed per NAND gate.  A neuron fires at 100 Hz max, with about 10,000 synapses.  Each synapse is much more complex than a NAND gate, but I'll grant the CPU all its advantages, like 50 times more transistors switching at a time, since most neurons are in standby mode thanks to sparse-encoding discoveries.  Most significantly (but related to the sparseness), the brain's memory is within the brain, whereas the CPU offloads it; this hurts the speed of the CPU and doubles its power cost.  There are 100 billion neurons per brain operating at about 35 W max.

Today's desktop CPU:  1.4 billion transistors / 4 trans per NAND * 3 GHz = 1 E18 comparisons per second

Brains:  100 billion * 100 Hz * 10,000 = 1 E17. 
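The two estimates above can be checked with a few lines of arithmetic (only the figures already given in this post, nothing new):

```python
# Back-of-envelope check of the CPU vs. brain comparison above.
cpu_transistors = 1.4e9        # Haswell transistor count
transistors_per_nand = 4       # CMOS NAND gate
cpu_clock_hz = 3e9             # 3 GHz
cpu_ops = cpu_transistors / transistors_per_nand * cpu_clock_hz

neurons = 100e9                # ~100 billion neurons per brain
neuron_rate_hz = 100           # max firing rate
synapses_per_neuron = 10_000
brain_ops = neurons * neuron_rate_hz * synapses_per_neuron

print(f"CPU:   {cpu_ops:.2e} gate operations/s")   # roughly 1 E18
print(f"Brain: {brain_ops:.2e} synapse events/s")  # 1 E17
print(f"ratio: {cpu_ops / brain_ops:.1f}x")        # about 10x
```

Both machines are taken to run at roughly 35 W, so the operations-per-second ratio is also the energy-efficiency ratio.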

So computers seem to be very roughly 10 times more energy-efficient than brains, and exist in a much smaller package.  Humans cost about $100,000 to raise to the age of 18, whereas the CPU is $200, can be focused on a specific task and switched to any other specific task immediately, works without making errors, and runs about 20 times more hours per week.

The brain is a reactive pattern-recognition system plus an optimization seeker whose method of operation is a mystery, very far from a CPU, so the comparison is difficult; but the CPU can use the internet as its memory.  Even a small hard disk can accurately retrieve every biography ever written, along with the DNA sequence of every author who wrote those books.