Yesterday the folks over at Vox published an article arguing that generations should be defined by the technology they use, rather than by age. They included a graph that purported to show how American society is "adopting new technology more quickly than ever before." The graph is garbage. And here's why.
[Update: It looks like Vox has heavily edited their original story by creating a new lede, referencing research by the FCC and Pew Research, and correcting factual errors like their conflation of the invention of the internet with the invention of the web. They've also swapped out their original "Technology Adoption" graph with a new one and added an additional graph with data from the FCC. None of these edits are noted in the piece. But they appear to be sticking with their original argument anyway, so the critique below still stands.]
[Update 2: Vox has yet again made changes (this time with acknowledgement), ditching the shoddy data from Singularity.com for FCC data that, while helpful, is still used to reach debatable conclusions about how we should define generations.]
Vox's graph is supposed to show how quickly different technologies like radio and the internet were adopted. They looked at the number of years it took those things to go from being invented to being used by 25 percent of the U.S. population. First off, this is a terrible way to understand the influence of technology on American society. But let's play along for a moment anyway, shall we?
Garbage Data In, Garbage History Out
The Vox graph in question:
Vox lists the invention of radio as 1897 and the invention of TV as 1926. We'll just assume that they're referring to one of Marconi's radio experiments in 1897 and the first public demonstration of TV tech by John Logie Baird in 1926. Listing these years as when they were "invented" is highly debatable, but again, let's play along.
The graph also shows things like the mobile phone as invented in 1983 and the internet as 1991. These years would be fine within the context of when those technologies became commercially available to a larger group of Americans. But radio and TV weren't commercially available in 1897 and 1926 respectively. Do you see the problem?
If you want to say that the invention of television occurred in 1926, then you don't get to say the internet was invented in 1991. The privatization of the internet in the early 1990s and the introduction of the web did indeed make it available to the broader public. But if we're playing by the same rules as radio and TV, we'd have to put the invention of the internet at 1969, when the first host-to-host message was sent from UCLA to the Stanford Research Institute, breathing life into the ARPANET. And as for the mobile phone, you'd have to peg its invention closer to 1973, when Motorola demonstrated the first handheld mobile phone, not 1983.
Revising "Invention"
So let's try making a graph with some of our revised "invention" years that's just a bit more consistent:
Well, what do you know? The graph doesn't show a progressively faster rate of technology adoption by the American public. What was once a clean graph that fit convenient and largely unquestioned ideas about exponential growth in tech suddenly becomes more complex.
But please don't go passing around this new graph either. Because it's nearly as worthless as Vox's graph as a way to understand the history of technology. Why would it matter how long a technology took to go from "invention" (a really messy and complex concept) to 25 percent adoption?
Fun With Arbitrary Numbers
If we really want to play this game, perhaps we can look at a different measure of adoption: from about 5 percent to 50 percent. To be clear, this is just as arbitrary as trying to pin down an invention date and seeing how many years it took to reach 25 percent adoption. But it feels like a slightly more honest way to measure tech growth.
When a technology is in about 5 percent of American households, this means it's still in the hands of early adopters, tinkerers, and the wealthy. Crossing 50 percent usually means that it's within the reach of the middle class. So what if we look at TV technology through this lens?
In 1949, just 2 percent of American households had a TV. By 1950 it was 8 percent. And by 1954, 59 percent of American households had a TV.
We see that TV went from being in less than 5 percent of American homes to over 50 percent in just five years. That certainly doesn't fit with Vox's depiction of TV's evolution. When looked at in this way, TV invaded America even more quickly than the internet did.
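If you want to see how that back-of-the-envelope math shakes out, here's a rough sketch in Python using only the TV figures cited above. The linear interpolation between those data points is my own simplifying assumption, made purely for illustration, not a claim about the underlying household data.

```python
# A minimal sketch of the "5 percent to 50 percent" measure described above,
# using only the TV figures cited in this piece (1949: 2%, 1950: 8%, 1954: 59%).
# Linear interpolation between data points is an assumption for illustration only.

tv_share_by_year = {1949: 2, 1950: 8, 1954: 59}  # percent of U.S. households with a TV

def year_crossing(share_by_year, threshold):
    """Estimate the year a technology crossed a given household-penetration threshold."""
    points = sorted(share_by_year.items())
    for (y0, s0), (y1, s1) in zip(points, points[1:]):
        if s0 < threshold <= s1:
            # interpolate linearly between the two surrounding data points
            return y0 + (threshold - s0) / (s1 - s0) * (y1 - y0)
    return None

start = year_crossing(tv_share_by_year, 5)   # roughly mid-1949
end = year_crossing(tv_share_by_year, 50)    # roughly early 1953
print(f"TV went from 5% to 50% of households in roughly {end - start:.1f} years")
```

Run it and you get something on the order of four years, which squares with the "less than 5 percent to over 50 percent in just five years" point above.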
You see similar patterns with other technologies like radio. The percentage of American households with radio sets was in the single digits during the first few years of the 1920s. By 1930, roughly 40 percent of US homes had a radio.
Or take an invention that Vox doesn't mention, like the mechanical refrigerator. In 1930 just 8 percent of American households had one. By the end of the 1930s, many more Americans had ditched their old-fashioned iceboxes, and fridge ownership reached 44 percent. So in roughly a decade, the fridge went from rare to mainstream. And after World War II, they were everywhere.
Inequality in Tech Access
The Vox piece argues that because tech adoption is speeding up (even though it's not), it's useful to break new generations into smaller segments.
Technology is improving more quickly, but it's also being integrated into American lives much faster and more thoroughly than it was in previous generations, making these new generations themselves fragmented.
Sure, my iPhone might be a little nicer than the first one I bought seven years ago. But how much has phone tech really changed since 2007 if we step back and view it historically? Enough to warrant slicing up sub-generations? And more to the point, seven years after its introduction, how many Americans can afford a new iPhone?
What about tech access and the history of inequality? This is where Vox's larger argument about defining generations by tech adoption rather than age really falls apart — whether you're talking about 100 years ago or today.
You're probably familiar with the urban/rural divide of the 1920s and '30s, when most Americans in cities had access to electricity while their fellow Americans in rural areas didn't. In 1907, just 8 percent of American households had electricity. In 1920, 35 percent of US homes had it. By 1929, 68 percent had electricity. But what if we take out farm dwellings? Suddenly that 1929 number jumps to 85 percent of homes with electricity.
Was the 40-year-old banker in New York with electricity during the 1920s of a different generation than the 40-year-old farmer in North Dakota without electricity during that same time? Is it at all useful to define generations this way?
All of the technologies that Vox lays out rely on expensive infrastructure, and many depend on government regulation to even exist. Telephone lines need to be laid down; TV stations need to be built; radio spectrum needs to be allocated. There's only so much radio spectrum. Whether anarcho-capitalists like it or not, there needs to be a way to allocate that spectrum through institutional and governmental means. It's not simply a matter of inventing the radio and then selling radio sets in the public market.
Yet that's the way we try to understand these technologies — in isolation. Dig a little deeper, though, and you begin to realize what an imperfect way that is to make sense of our relationship with technology.
Lying About Generations
I'm a Millennial living in a city of 10 million people with reasonably reliable high-speed internet (even if I whine like a spoiled brat during outages). For all intents and purposes, my Baby Boomer parents in suburban St. Paul have roughly the same access to technology that I do. Are we of the same generation? No, but we're of the same socioeconomic background. We live in cities that allow us to access high-speed internet and use our phones and drive our cars — thanks to the infrastructure that was planned, built and maintained by forces much more complex than we'd like to acknowledge in our daily lives.
What are we left with in this Vox article? The fact that it's funny to watch little kids try to use a rotary phone? Sure. But I'm 30 years old, and I've used a rotary phone maybe a dozen times in my life.
What are we proving? Should we give those kids a flip phone and see how well they fare? Some Americans still use a flip phone out of economic necessity. Would the kids' confusion prove that they're from a different generation or that technological advances aren't doled out in this country by birth order, but rather by socioeconomic class?
Dividing people into generations has always been a clumsy way to understand how the world works, but we've accepted it as shorthand. With rising income inequality and stagnating wages, what if the argument that things are getting better doesn't ring true for most people? And what does that mean for our technological landscape?
Describing Americans by generation only makes sense when most people in society are succeeding. That was hard enough to do in midcentury America, and it's downright laughable here in the year 2014. We don't need more sub-generations to better understand our relationship to technology. We need to ask what's keeping people from gaining entry to the high-tech world of gadgets and internet pipes that a privileged segment of our society has enjoyed for years.
Top image: 1953 ad for Sparton Television in Life magazine, scanned from the book Window to the Future by Steve Kosareff