So you want to build a computer in the 18th century. Is it even possible?
Probably not. Most people don't think about the actual amount of money and tools needed to produce exactly one transistor-based computer, power it, and program it, to say nothing of the social challenges you'd face trying to build this high-tech machine centuries ago.
Good luck getting the supplies
Today, having long since invested in infrastructure such as semiconductor fabs, compilers, and even something as simple as a lathe (which, for anything other than wood, is a 20th-century innovation), we can knock out millions of these components, so the individual price is in the pennies. But the supply chain behind semiconductor fabrication is astonishing in scope.
Creating a single homegrown transistor, though, is possible; transistor fabrication is so simple a child can do it. Even then, it assumes the child has access to a wafer of pure monocrystalline silicon: 99.9999 percent pure, with a P or N dopant. Starting from scratch, you only need a precise oven running at 1,500 °C, a decent seed crystal, glass crucibles, and X-ray crystallography to determine the right plane along which to cleave the wafers. Oh, and a couple of liters of purified liquid argon for an inert atmosphere. No sweat. Better bone up on all the techniques you will need and memorize them, because smart people took many years to develop even one of those techniques from scratch.
But even if you can assemble a handful of parts reliably, how many parts does it take to make even a basic four-function calculator? It's a challenge of reliability best illustrated by ENIAC (the Electronic Numerical Integrator And Computer).
ENIAC contained 17,468 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors and around 5 million hand-soldered joints. It weighed more than 30 short tons, took up 1800 square feet, and consumed 150 kW of power. Several tubes burned out every day.
Forget batteries. A lead-acid battery at its optimum delivers about 180 W/kg, so just powering ENIAC would take roughly 830 kg of lead-acid batteries. You now have to mine and refine about a ton of lead ore and smelt it, as well as make (if you're lucky, and have a death wish) a fair amount of 30% sulfuric acid. Clearly a generator is the way to go, or finding vanadium for catalysis with the help of Jöns Jacob Berzelius. Or go with phospho-olivines as positive-electrode materials for rechargeable lithium batteries: you already needed that X-ray crystallograph anyway.
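The battery arithmetic above can be checked on the back of an envelope. This is a rough sketch using only the figures quoted in this article, and it assumes the 180 W/kg specific-power figure holds at full continuous discharge:

```python
# Back-of-the-envelope check of the lead-acid battery estimate above.
# Assumes ENIAC draws 150 kW continuously and that lead-acid delivers
# about 180 W/kg at its optimum (the figures quoted in the text).
eniac_power_w = 150_000          # ENIAC power draw, in watts
specific_power_w_per_kg = 180    # optimistic lead-acid specific power

battery_mass_kg = eniac_power_w / specific_power_w_per_kg
print(f"Battery mass needed: {battery_mass_kg:.0f} kg")  # roughly 830 kg
```

And that buys you power only for as long as the charge lasts, which is why the generator wins.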
You'll still need a 760-amp battery for the generator, but that amount of lead and sulfuric acid should be sourceable in the 19th century, for lots of dough, without you having to reinvent smelting, set up shop in Sweden, or hunt down lepidolite or brave the Government Junta of Chile for lithium ore. One ray of hope: a cheap semiconductor and CRT phosphor might fall out as a byproduct of zinc production, and the locals have been doing that since 1750.
And you'll need a massive number of machined parts made to tight tolerances. No problem: you could try to convince Abraham-Louis Breguet to turn out some fine precision machinery. As a polymath, you of course speak excellent 18th-century French and Swedish, as well as the dozens of languages needed to navigate among all the luminaries you might need to employ.
And a team of horses, to haul around the parts.
We also have an infrastructure for power supply that is hard to replicate without millions of dollars, along with teams of programmers to do the heavy lifting of coding. Go back to a time when the average person had less than a grade-school education, and you would put undue strain on the small pool of polymaths the era had to offer.
The simple act of getting talent, not to mention infrastructure and supplies, from around the world to assist a skilled leader cannot be overestimated: the journey to "Silicon Valley" (then a fruit orchard) from the East Coast in 1917 would at best have covered 100 miles per day by light truck, and in fact took 62 days at 58 miles per day due to breakdowns, road conditions, and the like.
Remember, any project to create a computer is going to need raw materials, not just for the chips but also for the tools to make the chips, the tools to make the tools that make the chips, turtles all the way down. You need to find, educate, house, and feed an army of people to do the manufacturing, quality control, and compiler and OS writing. Ada Lovelace might help, if she weren't distracted by phrenology and mesmerism, and if financiers could be convinced to let a woman lead a team. It's not as simple as going to a dorm at Harvard in the 1700s or 1800s and picking a handful of bright kids to help you. What you propose is radically unthinkable for the time.
Then, there's the money and leadership challenges. Even if you find one other person to help you, with what will you pay them? How will you convince them that being a "magician" is better than agrarian life, or their current positions in high society? Capital, both financial and human, for investment flowed at a much slower pace before globalization took hold in the 1870s. Anyone who argues against globalization from in front of a computer screen has misunderstood what it's meant for society.
Basically, we have very little perspective on what we would actually need to bootstrap such an endeavor under what amount to Gilligan's Island conditions. You would expect that such an effort, even if technically possible, would not be socially or financially feasible. Not like today, where vast pools of wealth and talent can be drawn on at a moment's notice and flown anywhere in the world in less than a day.
But what if you did it anyway?
You can experiment with building a computer from discrete transistors yourself: a reel of 10,000 transistors (e.g., ON Semiconductor's MMBTA06LT1G, via Mouser) will cost you about 300 bucks. Try building a computer today with nothing but that head start. Or even with all of the other basic components thrown in: resistors, diodes, capacitors, a power supply. And dozens of tiny lightbulbs for output, because gallium, needed for LEDs, wasn't discovered until 1875.
The challenge is not the theoretical assembly, but the practical reality of large numbers. One bad chip in that reel, one burnout, or one imperfect solder joint, and you're screwed. You cannot imagine the leg up we get from memory chips, PLAs, and microcontrollers; it is simply not conceivable. And at the most basic level, the chemical elements needed for a computer might not even have been discovered yet, let alone the discrete-logic concepts. What are you going to do, abduct the world's greatest minds and enslave them?
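The tyranny of large numbers here is just compounding probability. As an illustrative sketch (the 99.999% per-joint success rate is my invented, generous assumption, not a measured figure), consider what it implies for ENIAC's roughly 5 million hand-soldered joints:

```python
# How compounding failure probabilities doom a large hand-built machine.
# Assumes each of ENIAC's ~5 million solder joints is independently good
# with probability 99.999% -- an invented, deliberately generous figure.
joint_reliability = 0.99999
num_joints = 5_000_000

p_all_joints_good = joint_reliability ** num_joints
print(f"Chance every joint is good: {p_all_joints_good:.1e}")  # effectively zero
```

Even at one failure per hundred thousand joints, the odds that the whole machine works are astronomically small, which is why ENIAC lost several tubes a day and why integration onto a single die changed everything.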
Then try renting 2,000 square feet of clean, enclosed real estate to build ENIAC today, with either tubes or transistors, and you will see the difficulty of just finding the space. Millionaires don't front any random stranger that kind of money, and even today, assuming both parties exist, it is very hard to negotiate a fair business agreement for this kind of partnership.
Once you get the technical challenges behind you, you have the social challenges.
A lot can happen in 20 years. Kings rise and fall. Back then, diseases ran rampant, and you would have little in the way of pharmaceuticals to save yourself from a minor accident at the smelter.
You would probably, without any social connections or pedigree, be institutionalized if you tried to obtain the millions of dollars necessary to refine a little silicon, or even the metals and glass needed for vacuum tubes.
There'd also be discrimination and classism to contend with; god help the non-white putative polymath who travels back to 18th-century Europe to make computers. Many brilliant inventors of the day faced the hard reality that society was largely fixed and immutable, and a "nobody" would have been sold into slavery or servitude sooner than granted the upper-class access to scientific tools that a Newton enjoyed. Go back too far, and even knowing something like logarithm tables (John Napier's contribution) would be less interesting to most people than witchcraft and the occult.
One must also consider what such an effort would cost. There simply wasn't enough idle capital available to start such projects; just one part of the effort, say, mining the resources, might take more capital than existed at the time. You can say, "I'll go bet a bazillion gold coins on black, because I know the outcome from the future," but someone has to lose that bazillion to pay out your winnings. And if the event you are betting on is truly random, you can't know that your mere presence at the table wouldn't perturb the result.
The GDP of Britain in 1700 was about $11 billion in 1990 dollars, and $36 billion by 1820, the pride of the European world, and yet it was dwarfed by the GDP of China at the time. The entire world over that period had between $371 billion and $694 billion (1990 dollars) in GDP, with the overwhelming majority of that in agriculture. Governments spent to the limit of what they could borrow, which gives some indication of the available free float of money. The British had so little free tradable gold to exchange for goods in China that they had to use opium as money.
Forty percent of world GDP (assuming you could mobilize people to band together) would be about $250 billion in 1990 dollars. Intel has a market cap of around $140 billion, so let's take Intel's market cap as the sunk cost of making affordable computer microprocessors. If you could get enough of the world at that time working in concert, you could in principle fund an Intel or two. But that's a very big if, and it is the only estimate I can see that even makes this argument plausible.
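The arithmetic behind that comparison can be laid out in a few lines. This is a quick sketch using only figures quoted in this article; the $625 billion base is simply a round figure inside the $371-694 billion range that yields the forty-percent number above:

```python
# Scale check: could the early-1800s world, fully mobilized, fund an Intel?
# All figures are the ones quoted in the text, in 1990 dollars.
world_gdp = 625e9            # round figure within the $371B-$694B range
mobilizable_share = 0.40     # the forty-percent mobilization assumption
intel_market_cap = 140e9     # Intel's market cap, as a proxy for sunk cost

mobilizable = world_gdp * mobilizable_share
print(f"Mobilizable: ${mobilizable / 1e9:.0f}B, "
      f"about {mobilizable / intel_market_cap:.1f} Intels")
```

One or two Intels, on paper; the catch, as the article argues, is that no mechanism existed to mobilize anything like forty percent of world output toward a single project.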
Per head, that works out to roughly $1,200 a year, or about three dollars per day. Gross world product per person today is about ten times that, or $30 per day.
It is reasonable to think that the only economy in the world with enough slack to make the production of a computer happen would have been China. Around this period, though, China was in the throes of internal revolt, and the world's superpowers were divvying up the Qing Dynasty. China then had about 300 million people, roughly the size of the US population today.
Total world population was around the 1 billion mark, and if you apply the (optimistic, in my opinion) Chinese literacy figure worldwide, that's fewer than 10,000 suitably literate people who would even be available as a resource without a significant effort to educate more. The remainder were likely devoted to producing enough food to survive, or were craftsmen, and you have to question whether taking 15 years to educate a gang of "children" into working partners would be productive.
For instance, about 220,000 people in the UK had the right to vote in 1790 (and were therefore minimally literate, at perhaps a third-grade level). It seems reasonable that the massive increases in literacy over the following period helped drive the growing pressure for suffrage. But if you wanted people with the know-how and education to understand and act upon the time traveller's ideas, 10,000 people worldwide in 1800 seems a likely number, with about 30 percent of them in China.
Society has a long history of treating genius or special insight with abuse, neglect, and distrust. Any scenario that requires not one miracle but miracle after miracle to succeed swiftly reduces the chance that our aspiring time traveller would even be able to communicate their tales of wonder to the people of the past.
In this regard, studying works such as The Reception of H. G. Wells in Europe would serve the traveller well: how narrative fiction is received is often indicative of the social minefield one will encounter. To say nothing of the effect of potential paradoxes; would you really want to be Biff Tannen?
Would It Be Possible To Build a Computer With 18th Century Supplies? originally appeared on Quora. You can follow Quora on Twitter, Facebook, and Google+.
This answer has been lightly edited for grammar and clarity.