We may earn a commission from links on this page

The One Way Your Laptop Is Actually Slower Than a 30-Year-Old Apple IIe

Have you ever had the nagging sensation that your computer is slower than it used to be? Or that your brand-new laptop feels more sluggish than an old tower PC you once owned? Dan Luu, a computer engineer who has worked at Google and Microsoft, had the same sensation, so he did what the rest of us would not: he tested a whole slew of computing devices, ranging from desktops built in 1977 to computers and tablets released this year. He learned that the nagging sensation was spot on: over the last 30 years, computers have actually gotten slower in one particular way.

Not computationally speaking, of course. Modern computers are capable of complex calculations that would be impossible for the earliest processors of the personal computing age. The Apple IIe, which ended up being the “fastest” desktop/laptop computer Luu tested, is capable of performing just 0.43 million instructions per second (MIPS) with its MOS 6502 processor. The Intel Core i7-7700K, found in the most powerful computer Luu tested, is capable of over 27,000 MIPS.

But Luu wasn’t testing how fast a computer processes complex data sets. Luu was interested in testing how the responsiveness of computers to human interaction had changed over the last three decades, and in that case, the Apple IIe is significantly faster than any modern computer.

In a post on his website, Luu explains that he measured the time from pressing a key on a keyboard to that keystroke appearing on the display in a terminal window. He measured using two cameras, one capable of shooting 240 frames per second and another capable of shooting 1,000 frames per second. While not as precise as if a machine had performed the key presses, his setup was still precise enough to give him a powerful understanding of just how slow newer computers have gotten. That's how he learned it takes 30 milliseconds for a 34-year-old Apple IIe to register an input on its accompanying display, and 200ms for a brand-new PowerSpec g405 desktop with a 7th-generation Intel Core i7 processor inside to register an input.
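The camera method boils down to counting video frames between the keypress and the on-screen change, then converting that frame count to milliseconds. Here's a minimal sketch of that conversion (the function names are mine, not Luu's); it also shows why the camera's frame rate caps the precision of the measurement:

```python
def frames_to_latency_ms(frame_count: float, camera_fps: float) -> float:
    """Convert a count of elapsed video frames into latency in milliseconds."""
    return frame_count / camera_fps * 1000.0

def resolution_ms(camera_fps: float) -> float:
    """One camera frame is the smallest interval this setup can resolve."""
    return 1000.0 / camera_fps

# 48 elapsed frames at 240fps corresponds to the ~200ms Luu measured
# on the PowerSpec g405:
print(frames_to_latency_ms(48, 240))  # 200.0

# The 1000fps camera resolves events to within about a millisecond:
print(resolution_ms(1000))            # 1.0
```

At 240fps, each frame spans roughly 4.2ms, so the faster 1,000fps camera matters most when measuring the quickest machines.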

Because Luu is a wonderful nerd, he got pretty granular with his testing. He tested computers running multiple operating systems to see which introduced the most lag, and he tested some systems on displays with different refresh rates to see how refresh rate alters the lag. He found that both variables dramatically altered the results.

With refresh rate, there was a consistent, measurable difference across computers. In his piece, he says, “at 24Hz each frame takes 41.67ms and at 165Hz each frame takes 6.061ms.” So a computer with a custom Haswell-E processor took 140ms on a 24Hz display, but just 50ms on a 165Hz display. The g405 running Linux took 170ms on a 30Hz display, but just 90ms on a 60Hz display.
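Those frame times follow directly from the refresh rate: a display that refreshes n times per second spends 1000/n milliseconds on each frame, so a pending screen update can wait up to a full frame before it appears. A quick sketch of the arithmetic behind the numbers Luu quotes:

```python
def frame_time_ms(refresh_hz: float) -> float:
    """How long one frame stays on screen, in milliseconds."""
    return 1000.0 / refresh_hz

# The two refresh rates Luu quotes:
print(round(frame_time_ms(24), 2))   # 41.67
print(round(frame_time_ms(165), 3))  # 6.061

# A standard 60Hz laptop display, for comparison:
print(round(frame_time_ms(60), 2))   # 16.67
```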

The effect of the operating system was fairly pronounced, too. Luu folds operating systems into a broader problem he calls “complexities.” Modern operating systems, displays, and even keyboards have a lot more going on between the moment an input occurs on a device and the moment it appears on screen.

In the case of operating systems, that means newer OSes have more steps to go through to register an input. One example Luu provides is iOS: a keypress on an iPad might pass through 11 steps just to register. A keypress on an Apple IIe running Apple DOS 3.3 goes through considerably fewer.

But the problem isn’t limited to iOS. In fact, as Luu’s numbers show, the complexities are even more pronounced in operating systems that have to support a wider range of devices than iOS does. This is why the Android devices Luu tested had more lag than the iOS devices, and why leaner operating systems like Chrome OS, Linux, and even the now-ancient Mac OS 9 exhibited less lag than Windows or Mac OS X on the same machine.

Importantly, Luu notes that while the complexity of a modern computer system can increase lag, it isn’t necessarily an awful thing. As one example, he points to the complexity of a modern keyboard versus the super-simple Apple IIe keyboard.

A lot of the complexity buys us something, either directly or indirectly. When we looked at the input of a fancy modern keyboard vs. the Apple II keyboard, we saw that using a relatively powerful and expensive general purpose processor to handle keyboard inputs can be slower than dedicated logic for the keyboard, which would both be simpler and cheaper. However, using the processor gives people the ability to easily customize the keyboard, and also pushes the problem of “programming” the keyboard from hardware into software, which reduces the cost of making the keyboard.

It’s a tradeoff. Our computers nowadays are laggier, but they’re also able to do a heckuva lot more than a computer from 30 years ago. And as Luu notes, there are opportunities to cut down on lag, particularly the lag induced by slower refresh rates on devices’ displays. In the conclusion of his piece, he says, “we’re arguably emerging from the latency dark ages and it’s now possible to assemble a computer or buy a tablet with latency that’s in the same range as you could get off-the-shelf in the 70s and 80s.”

The evidence is all around us. MSI’s GS63VR laptop offers an optional 120Hz display, which means considerably less lag than a laptop with a standard 60Hz refresh rate. Razer recognized the role refresh rate plays in lag and introduced a phone with a 120Hz display, too. Even Apple launched a 10.5-inch iPad Pro with a 120Hz refresh rate this year.

We’re slowly, product by product, entering a new age, one where our computers might start feeling as fast as that Apple IIe gathering dust in your parents’ attic. If you’re curious about Luu’s work and about computer latency in general, you can read more on his website. While his testing isn’t as precise as what could be accomplished in a formal lab, it’s a great first step toward helping people understand where, and how, lag is introduced into their computing.