Over the past ten years, smartwatches have gone from futuristic tech relegated to sci-fi flicks to something so common, it’s no surprise to see multiple people sporting them on your daily commute. These wrist-sized computers can’t do quite as much as your smartphone, but these days the modern wristable can do everything from tracking your heart rate to paying for your coffee.
But none of these things happened overnight. At the beginning of the decade, the smartwatch as we currently know it was still years away from becoming reality. In the early 2010s, it was the humble fitness tracker that was just starting to gain some traction. The Fitbit Ultra, Jawbone UP, and Nike FuelBand were among the first trackers on the market in 2011 and 2012—and unlike the other two, the Fitbit Ultra wasn’t even something you wore on your wrist. It was more like a traditional pedometer that you wore clipped to your waistband.
These trackers didn’t do much more than track the very basics like steps, sleep (though not well), and activity. Sure, GPS watches from Garmin, Polar, and a few other makers were also around, but they were heavily focused on serious outdoor runners and triathletes. What made fitness bands different was the fact that they connected with smartphones via companion apps and websites. It became easier to digitize your metrics, opening the door for the average Joe to participate in the self-quantification movement.
By 2013, fitness bands were everywhere and smartwatches were starting to look less like science fiction. Pebble’s first Kickstarter campaign launched in 2012 and had raised an unprecedented $10.3 million—the most of any campaign at the time. By early 2013, the first Pebble watches had already made their way to consumers’ wrists. That year, Fitbit also released its first wrist-based trackers: the Flex and Force.
But 2014 is where things started to pick up steam. Wearables were the big story of CES 2014, though it was still pretty clear that the category as a whole was a work in progress. Even so, this was the year that Fitbit introduced the Charge, one of its best-selling product lines ever. Android Wear, Google’s wearable operating system, also made its debut at Google I/O 2014. By the end of that year, we saw the first wave of Android-powered smartwatches, including Motorola’s Moto 360, the LG G Watch, the Samsung Gear Live, and the Asus ZenWatch.
Never mind the fact that Android Wear was buggy as hell and most of the watches felt more like early concept devices than full-fledged products. Momentum was already starting to build. In 2015, optical heart rate sensors started becoming more commonplace on fitness bands, the apps got cleaner, and already the category was claiming its first casualties. The FuelBand died. The second round of Android Wear watches launched. Fitbit launched the Surge, which it billed as a “fitness super watch,” and added heart rate monitoring with the Charge HR. And the Apple Watch, which Apple had finally announced in the fall of 2014—confirming that yes, in fact, it had created one—arrived on wrists in the spring of 2015.
It’s around here that some critics started questioning the whole wearable trend and whether it’d ever really take off. It sort of made sense. Fitness bands were fine but inherently limited. Smartwatches offered a promising glimpse into the future, but the consensus seemed to be that they were expensive luxuries that no one needed. (Looking at you, Apple Watch Series 0.) The sad story of Android Wear, which limped through Android Wear 2.0 before being rebranded as Wear OS, seemed to be more proof.
Looking back, it seems a tad hasty to say wearables were dying just a few years into their development. 2016 did claim some major casualties—Pebble, for one, got bought by Fitbit in a fire sale just four short years after barging onto the scene. Frustrations with Android Wear and then Android Wear 2.0 were mounting. Some companies, like Samsung, decided to wash their hands of the platform entirely. After its Gear Live smartwatch, Samsung opted for its own silicon and Tizen operating system for its subsequent Gear and Galaxy watches.
It’s not that wearables were dead, per se. It’s just that progress from 2016 to 2018 felt incredibly incremental year-to-year. In 2017, Fitbit debuted its Ionic smartwatch, built on the remains of Pebble’s corpse. It was hideous, but the battery life was phenomenal, and it was one of the first mainstream smartwatches to include an SpO2 sensor that might, one day, help track sleep apnea. Around the same time, the Apple Watch Series 3 added standalone LTE connectivity—as did a bunch of other Android-friendly watches, like the LG Watch Sport (another truly horrendous Android Wear watch) and the Samsung Gear S3.
By 2018, many smartwatches had some combination of built-in GPS, accurate heart rate monitoring, sleep tracking, multi-day battery life, NFC payments, and LTE connectivity. That was a huge leap forward from the laggy, more concept-than-reality watches we saw at the beginning of the decade.
Somewhere along the way, prices also dropped. Sure, the Apple Watch was still expensive as hell, but Fitbit’s popular Versa smartwatch had most of the same features for a mere $200. That, plus greater accuracy, growing public consciousness about health, and clever marketing, meant people were more willing to spend a couple hundred dollars on a device that promised healthier living.
That was only bolstered by the Apple Watch Series 4 in 2018. This time around, Apple had gotten FDA clearance for a shiny new ECG feature and added fall detection. Suddenly, the Apple Watch went from a luxury device that could help with fitness goals to a device that could potentially save your life. (How true that is on a large scale is debatable, but stories of users whose lives were saved have certainly become part of Apple’s marketing.)
At the end of the decade, smartwatches aren’t just mini-smartphones for your wrist anymore. After the Series 4, a whole spate of watchmakers has rushed to add ECG capabilities. Fitbit, Garmin, and Apple have all added reproductive health tracking. More still are working toward a way to detect sleep apnea, while the Series 5 this year added hearing health monitoring. Going into the next decade, it’s likely these devices will continue to lean hard into health and wellness, blurring the line between consumer tech and medical devices.
While fitness trackers reigned supreme ten years ago, it’s now the age of the smartwatch. That also means more casualties. That much was made clear when, late this year, Google plopped down $2.1 billion for a struggling Fitbit, which never really found its footing in the smartwatch space. Is that proof that Google will finally launch a Pixel Watch? Not exactly. But after years of letting its wearable platform languish, Google decided in 2019 it was going to shell out $40 million for Fossil’s wearable tech, buy Fitbit, and emphasize a new focus on “ambient computing” at its Made by Google event in October.
The acquisition also highlights some of the challenges facing smartwatches in the 2020s. Already, people are asking what Fitbit alternatives to buy, primarily out of concerns about data privacy. With Apple, Google, Fitbit, Garmin, and Samsung leaning hard into health, the question of who owns and profits off your health data will be even more important in the years to come. The potential benefits are obvious—personalized healthcare and early warning systems could fill some important gaps in current medical research. In the case of reproductive health, it could even shed new light on an area medicine has long understudied. But that requires greater collaboration with the medical community and stricter privacy legislation. Right now, it’s not much better than the Wild West. Without protocols to ensure all this data we’re collecting is used for good, this past decade of progress will amount to little more than the foundation for an even worse healthcare dystopia.