Brianjones seems to understand me. From what I can tell from his comments, he and I may be in agreement about the outcome that awaits. The paths vary widely, but the end seems to be a great decrease in the number and importance of people within 200 years.
The agricultural revolution (people learning to control plants and animals for food) led people to believe that humans were somehow different from animals. Before then, paleolithic peoples were very much aware of the lack of difference between humans and other animals, and the idea behind evolution was ingrained in their beliefs rather than arriving as a surprise. People also viewed rival tribes as threats and had no shame in eating the losers because it was good meat, the same as chimpanzees. Only after the agricultural revolution did slaves become something you might want after a war, and for that reason you did not eat the losing side. The heightened view of human importance due to agriculture even prevented the eating of the dead after European battles. But anywhere protein was scarce (plant agriculture without animal domestication), the dead were a treasured source of food. The Aztecs had up to 300,000 skulls hanging on lattices surrounding Tenochtitlan because the Indians in the area learned to farm too well too quickly and ended up with a surplus of humans and depleted sources of natural protein. Sea on two sides added pressure against mere expansion and other solutions, so a state could develop systematic cannibalism, rather than the warfare-only cannibalism found elsewhere in North and South America, which was the result of simply having easy protein rather than a specific lack of it. It is not surprising that people used to eat each other. It is surprising that Europeans at the end of a battle did not.
My point is that human arrogance was invented as a result of domestication, which made food energy (energy and mass) more easily available because plants and animals are more efficient than we are at capturing sunlight and converting it to protein. This did not cause a "rise of the animals" over humans, but it certainly increased the "importance" of animals to the "economic machine," if you view humans as merely the means by which these animals rise to dominate the world: there are 3 times more chickens than humans, and we are just the way super-intelligent chickens acquire food and breeding rights. The total mass of humans is larger, but the mass of humans compared to what we harvest and kill is something like 0.1%. So we are just the brain of a massive symbiotic system of DNA-based life. But at some point, physics and the observed history of the Earth demand that more efficient methods will dominate. There is nothing sacred about water-only, low-temperature, low-pressure chemistry (DNA life) when other methods are now available.
Our current silicon designs are limited only to the extent they are synchronous, which limits how big the circuits can be due to the speed of light. If they switch faster, the circuit has to be smaller (at 10 GHz, synchrony breaks down beyond half the speed of light divided by 10 GHz, which is 1.5 cm; Intel has 3.7 GHz chips that are 1.5 cm). But if you cut the switching speed in half, you can fit 4 times more transistors in a chip's area. I wrote an article 20 years ago claiming 2015 was the limit of single-plane synchronous silicon, following IBM's Bennett and Landauer, due to heat flipping bits randomly and the synchrony (light-speed) problem, but that may not be true at slower switching speeds, which allow bigger and cooler chips. Making chips 3D has a heat problem, but it seems like you could get many doublings in 3D with air gaps at a slower switching speed. 10 MHz would be a 1.5 meter chip, with maybe 300 layers 0.5 cm apart with liquid cooling. 10 MHz is 100,000 times faster than neurons but with 5,000 times fewer connections (10,000 connections per neuron and 2 per NAND gate).
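The light-speed limit above can be sketched as a quick calculation. The rule of thumb assumed here is that a synchronous die can be no wider than half the distance light travels in one clock period; note that this rule reproduces the 15 meter figure from the math correction below.

```python
# Sketch of the synchrony limit discussed above.
# Assumption: a synchronous die must be no wider than half
# the distance light travels in one clock cycle.
C = 3.0e8  # speed of light, m/s

def max_die_width(clock_hz):
    """Largest die width (meters) that can stay synchronous at a given clock."""
    return C / 2 / clock_hz

print(max_die_width(10e9))  # 10 GHz -> 0.015 m (the 1.5 cm chip above)
print(max_die_width(10e6))  # 10 MHz -> 15 m
```

Halving the clock doubles the allowed width, which quadruples the area; that is where the "4 times more transistors" claim comes from.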
300 layers of 1.5 meter chips gives 1 quadrillion NAND gates with Intel's 22 nm process (4 transistors per NAND gate). The brain is 100 billion neurons. OK, so speed * connections * (number of comparison devices) gives current silicon technology as 200,000 times better than a brain. If brains and silicon are reaching the limit of how you can use ions and electrons (respectively), then maybe it is not a coincidence that this is within an order of magnitude of the ratio of the weight of the ions the brain uses to send signals to the weight of an electron that silicon uses, which I stated before (600,000). 200,000 assumes only synchronous operation. Parallel techniques can provide additional benefits.
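The 200,000x figure can be checked with the numbers given above. All inputs are the text's own estimates, not measurements; the transistor density for Intel's 22 nm process (roughly 8 million transistors per mm^2) is my assumption for the sanity check on the quadrillion-gate claim.

```python
# Hedged check of the "speed * connections * device count" comparison.
# Inputs are the figures from the text:
#   brain:   ~100 Hz firing, ~10,000 connections/neuron, ~100 billion neurons
#   silicon: 10 MHz clock, 2 inputs per NAND gate, ~1e15 NAND gates
brain_score   = 100 * 10_000 * 100e9   # Hz * connections * neurons  = 1e17
silicon_score = 10e6 * 2 * 1e15        # Hz * inputs * gates         = 2e22
print(silicon_score / brain_score)     # ratio -> 200,000

# Sanity check on the quadrillion-gate figure.
# ASSUMED density: ~8e6 transistors per mm^2 at 22 nm (ballpark, not a spec).
area_mm2   = (1500 ** 2) * 300         # 1.5 m square dies, 300 layers, in mm^2
nand_gates = area_mm2 * 8e6 / 4        # 4 transistors per NAND gate
print(nand_gates)                      # ~1.35e15, i.e. about a quadrillion
```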
Math correction: 10 MHz could allow a 15 meter die (not 1.5 meter) and 3,000 layers. So let's say it's a 100 MHz computer, giving 2 million times more power than a single human brain in a volume the size of a desk.
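One consistent reading of the revised arithmetic: holding the gate count fixed and raising the clock from 10 MHz to 100 MHz multiplies the earlier 200,000x estimate by 10, which is where the 2 million figure lands. This is my interpretation of the correction, not a statement from the text.

```python
# How the 2-million figure can follow from the earlier 200,000x estimate:
# the same gate count clocked at 100 MHz instead of 10 MHz is 10x faster.
base_ratio  = 200_000          # silicon vs. brain at 10 MHz (from above)
clock_scale = 100e6 / 10e6     # 100 MHz / 10 MHz = 10x
print(base_ratio * clock_scale)  # -> 2,000,000
```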
People continue to refuse to believe computers are (in theory) technologically superior to brains, simply out of arrogance: to admit that their animals and machines are superior to them in the ways that matter (using energy to move matter to make copies) is to lose the moral authority of having dominion over them, even to the point of abuse. Clearly the brain deserves more protection and authority in our symbiotic relationship with animals and machines, but soon we will not be the brain.