We're Only 16 Years Away From Creating Actual Cylons


We're much closer than you think to the reality of a "mindclone" — a computer with the mental capacity of the human mind — says the Institute for Ethics and Emerging Technologies' Martine Rothblatt. We're "close enough to feel the bits and bytes of cyberbreath on our cheeks." Ooh, spooky.


Apart from the obvious question — what is cyberbreath, and don't they make a cyber-mouthwash for that? — I have to admit I'm a bit skeptical of Rothblatt's gung-ho predictions. For one thing, she quotes Ray "Unlimited Rice Pudding" Kurzweil. For another, I'm not sure her understanding of Moore's Law is quite rock solid. Here's how Intel describes Moore's Law:

Intel co-founder Gordon Moore is a visionary. In 1965, his prediction, popularly known as Moore's Law, states that the number of transistors on a chip will double about every two years. And Intel has kept that pace for nearly 40 years.

And here's how Moore himself expressed it, in a 1965 article in Electronics Magazine:

The complexity for minimum component costs has increased at a rate of roughly a factor of two per year (see graph on next page). Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer.
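Moore's arithmetic checks out, for what it's worth. His paper doesn't state a 1965 starting count in the excerpt above, but if you assume roughly 64 components per circuit in 1965 (my assumption, chosen because it's the power of two that lands on his figure), ten annual doublings get you to his predicted 65,000. A quick sketch:

```python
# Moore's 1965 extrapolation: components per circuit doubling every year.
# Starting value of ~64 components in 1965 is an assumption for illustration.
components_1965 = 64
years = 10  # 1965 -> 1975

components_1975 = components_1965 * 2 ** years
print(components_1975)  # 65536 -- roughly the "65,000" Moore predicted
```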

Here's how Martine Rothblatt interprets it:

For example, my one year-old computer has about 1/100,000th of the capability of a human mind (its processing speed is about that fraction of the number of human brain neural connections, although its software is in some areas pretty advanced). In other words, it has only .001% of the capability of a human mind. It's a rodent. I could go buy a new computer today that has 2/100,000th or .002% of the capability of a human mind. At this rate, with the way my linear mind works, I would expect to be able to buy a mindclone in 99,998 more years. What, me worry! Our linear minds take our most recent experience – such as going from a 1/100,000th of a human mind computer to a 2/100,000th of a human mind computer in one year – and extrapolate it forward such that we think it will take 998 more years to get 1% of a human mind, another 1000 years to get to 2% of a human mind, another 1000 years to get to 3% of a human mind, and so on.

In fact, though, information technology does not grow linearly, but exponentially. This means, according to "Moore's Law", information technology doubles each 1-2 years – something very different from growing linearly. Because computer capability doubles it means next year I will get not 3/100,000th of a human brain computer, but 4/100,000th of one. Exponential growth means the year after that I will get not 5/100,000th of a human brain computer, but 8/100,000th of one. With information technology, I can expect to reach mindclone computing as rapidly as this:

Years From Now Fraction of a Mindclone
Next Year 4/100,000th
Year After 8/100,000th
Third Year 16/100,000th
Fourth Year 32/100,000th
Fifth Year 64/100,000th
Sixth Year 128/100,000th
Seventh Year 256/100,000th
Eighth Year 512/100,000th
Ninth Year 1000/100,000th
Tenth Year 2000/100,000th
Eleventh Year 4000/100,000th
Twelfth Year 8000/100,000th
Thirteenth Year 16,000/100,000th
Fourteenth Year 32,000/100,000th
Fifteenth Year 64,000/100,000th
Sixteenth Year 128,000/100,000th = MINDCLONE

Three clarifying comments are in order. First, the rounding down from 1,024 to 1,000 in the ninth year is just to make the arithmetic easier to follow. Second, while Moore's Law says that the doubling occurs every 1-2 years, in the example given above I showed the doubling every year. The effect of making it every two years would simply be to postpone mindclones to 32 years from now instead of 16, or to 24 years from now if we use a doubling period of every 18 months. The important point is that mindclones are around the corner – not in some other millennium, or even in some other generation. This is about our lives.


I love the way her little explanation goes: "Year sixteen: MINDCLONE." So there you have it. We have exactly sixteen years before Skynet nukes us all into the stone age. [IEET]


Chip Overclock®

I'm not buying it. We've had a good ride up until just recently, and I'm sure there are more advancements to come. But we're already dealing with tough scalability issues that we're having trouble solving. Clock rates have stalled, growth in chip densities is slowing, power consumption and heat dissipation are big problems, and the trend toward multicore architectures puts us back in the 1980s, when we were trying to figure out how to program massively parallel machines while the Japanese were preaching their Fifth Generation project. Like the images of flying cars with tail fins on magazine covers in the 1950s, I'm sure the future will be totally unlike anything we can imagine, and not simply an extension of where we are today. There's a reason we're not riding in supersonic stagecoaches.