The Smartest Advancements in Technology Series: The CPU


What five technologies have made our lives easier, better, and more efficient? Hard to choose, right? Well, inspired by the next tech advancement, Intel smart TV, we've done it for you. Last week we brought you the ATM. Today: the century-defining CPU.

The central processing unit—or CPU, for the acronym-adoring—has perhaps made more significant contributions to our daily lives than any other 20th-century invention. From personal computers to toys to medical equipment, the CPU has become an inextricable part of our existence. Any mechanized object that beeps, boops, has a screen, or accomplishes a task you couldn't complete unaided probably houses a CPU somewhere in its core. Trying to explain the effects the CPU has had on our culture, then, is like trying to give a brief answer about why the sun is important.

So here's a timeline of our favorite applications from the '70s, '80s, '90s and today:

The '70s

In 1971, the Intel 4004 became the first complete CPU on a single chip and the first commercially available microprocessor. The Busicom 141-PF calculator was the first product to use the 4004, once and for all eliminating the need to remove one's shoes before figuring out complicated equations.

The '80s

Say what you will about the '80s, but for electronic music fans, the decade rocked. Sequential Circuits' Prophet-5 synthesizer, featured in most songs played at the class of 1981's homecoming dance, was the first keyboard to use a microprocessor. The chip detected when a key was pressed and assigned each note to one of the instrument's five voices via a voice-allocation algorithm, creating a polyphonic effect. And so a whole generation of pop stars was born.
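
For the curious, here's a minimal sketch in Python of how that kind of voice allocation might work. The Prophet-5's actual firmware isn't public, so the VoiceAllocator name and the steal-the-oldest-note policy below are illustrative assumptions, not Sequential Circuits' code:

    # Minimal polyphonic voice allocation sketch. The "steal the oldest
    # note" policy is an assumption for illustration; the Prophet-5's
    # real firmware logic is not reproduced here.
    class VoiceAllocator:
        def __init__(self, num_voices=5):
            self.voices = [None] * num_voices  # note sounding on each voice, or None
            self.order = []                    # voice indices, oldest note first

        def note_on(self, note):
            # Prefer a silent voice; if all five are busy, steal the oldest note.
            if None in self.voices:
                voice = self.voices.index(None)
            else:
                voice = self.order.pop(0)
            self.voices[voice] = note
            self.order.append(voice)
            return voice

        def note_off(self, note):
            # Release whichever voice is sounding this note.
            for voice, playing in enumerate(self.voices):
                if playing == note:
                    self.voices[voice] = None
                    self.order.remove(voice)
                    return voice
            return None

    alloc = VoiceAllocator()
    for note in ("C4", "E4", "G4"):  # press three keys of a C major chord
        print(note, "-> voice", alloc.note_on(note))

Press three keys, get three voices sounding at once: that's the polyphony.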

The '90s

Although personal computers entered the market in the late '70s and became exponentially more popular in the '80s, it wasn't until the Clinton years that a computer in every home was added to the chicken/pot, car/garage list of inalienable rights. Moore's Law, Gordon Moore's mid-'60s prediction that the number of transistors on a chip would double at a steady rate, kept holding, and soon CPUs were faster, more powerful and—key word—cheaper than ever before. Submit to your '90s computing nostalgia by playing the original Doom, or plug in that dial-up and surf this archive of Geocities (R.I.P.) pages.
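
If "doubling at a steady rate" sounds abstract, a back-of-the-envelope calculation shows how fast it compounds. This Python sketch uses the commonly cited figures (about 2,300 transistors on the 4004 in 1971, a roughly two-year doubling period) and is deliberately rough:

    # Rough Moore's Law projection: steady doubling, nothing more.
    def moores_law(start_count, start_year, year, doubling_years=2.0):
        return start_count * 2 ** ((year - start_year) / doubling_years)

    # Projecting the 4004's ~2,300 transistors forward to 1993 gives
    # ~4.7 million, the same order of magnitude as the original
    # Pentium's ~3.1 million.
    print(f"{moores_law(2300, 1971, 1993):,.0f}")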

The '00s

Once people got over that whole Y2K thing and realized computers weren't (at least immediately) going to cause global destruction, the aughts were a pretty good era for the CPU. McDonald's brought the world its very first microprocessor-powered Happy Meal toy, and all of a sudden our pockets could hold a thousand songs (and so, so much more). And just 40-odd years after the first silicon chip was invented, we are now able to implant them into human cells. From calculators to cyborgs in under two generations? Not bad, CPU. Not bad.

Where would humankind be without the CPU? We'll be asking ourselves the same question about Intel smart TV sooner rather than later, because the combination of Internet and television is surely next in line in the Smartest Advancements in Technology all-time rankings.

Tune in later this week to see what the next revolutionary and utterly indispensable item in the Smartest Advancements in Technology Series is. Hint: We'll never look at dancing silhouettes the same way.

And click here to find out more about Intel smart TV: where Internet meets television.