Workout Data From Fitness App Used to Identify Government Spies and Military Personnel

Equipment lies ready to go November 14, 2002 in an overall view of Ft. Bragg, North Carolina.
Photo: Getty

In the latest incident of seemingly innocuous data sharing leading to potentially dangerous exposure, the popular fitness app and activity tracker Polar Flow has been revealing the locations of military and government personnel working at sensitive sites, according to ZDNet.

The report cites an investigation conducted by Dutch news site De Correspondent and Bellingcat, which discovered it was possible to find workout information recorded by Polar Flow and use it to potentially identify the names of employees working at military bases and government buildings.

Per ZDNet, the technique involved accessing the developer API from Polar, the Finnish company that produces Polar Flow. Through the API, a person could not only explore public data that users willingly shared, but could also retrieve fitness tracking information from users who had set their profiles to private. The API also put no limit on the number of requests a person could make, so it was entirely feasible for someone to scrape information from the millions of users who rely on Polar Flow to track their workouts.
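The core weakness here — an API that will serve request after request with no throttling — comes down to a simple loop. The sketch below is purely illustrative: `fetch_page` and its response shape are hypothetical stand-ins (Polar's actual Explore API is not documented here), simulated with an in-memory dataset to show why the absence of a request cap makes exhaustive scraping trivial.

```python
# Hypothetical sketch: enumerating an unthrottled, paginated API.
# fetch_page stands in for whatever HTTP call the real API would require;
# here it is simulated against a fake in-memory dataset.

def scrape_all_sessions(fetch_page, page_size=100):
    """Keep requesting pages until the API returns an empty page.

    With no rate limit or request cap, nothing stops this loop from
    walking the entire dataset.
    """
    sessions = []
    page = 0
    while True:
        batch = fetch_page(page, page_size)
        if not batch:
            break
        sessions.extend(batch)
        page += 1
    return sessions

# Simulated backend: 250 workout sessions, served 100 at a time.
FAKE_DB = [{"user_id": i, "lat": 35.14, "lon": -79.0} for i in range(250)]

def fake_fetch_page(page, page_size):
    start = page * page_size
    return FAKE_DB[start:start + page_size]

all_sessions = scrape_all_sessions(fake_fetch_page)
print(len(all_sessions))  # 250 — every record, three requests, no pushback
```

A rate limit, request quota, or authentication check on each page would break this loop; the investigation found none of those in place.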


Using that essentially unfettered access, it became possible to identify people working at sensitive locations like military bases. De Correspondent explained that the technique simply required looking up a known government or military installation, finding a workout that was tracked there, then exploring that user’s other workouts. Odds are, the user had also worked out at or near their home in the past.
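The correlation step De Correspondent describes — one workout places a user at a sensitive site, and the place their other workouts cluster is probably home — amounts to a frequency count over coarse location bins. A toy illustration with synthetic coordinates (not real data, and a simplification of whatever the researchers actually did):

```python
from collections import Counter

def likely_home(workouts, precision=3):
    """Guess a user's home area as their most frequent location grid cell.

    Rounding coordinates to 3 decimal places bins them into roughly
    100-meter cells; the cell a user starts workouts from most often
    is usually near where they live.
    """
    cells = Counter((round(lat, precision), round(lon, precision))
                    for lat, lon in workouts)
    return cells.most_common(1)[0][0]

# Synthetic example: one run tracked near a base, repeated runs from
# the same residential spot.
workouts = [
    (35.139, -79.006),                                   # near a base
    (52.370, 4.895), (52.370, 4.895), (52.371, 4.895),   # recurring spot
    (52.370, 4.896), (52.370, 4.895),
]
print(likely_home(workouts))  # (52.37, 4.895)
```

The point of the toy: a single sensitive-site data point plus a handful of ordinary workouts is enough to tie a pseudonymous profile to a neighborhood.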

Those few data points allowed the researchers to identify more than 6,400 users believed to be working at sensitive locations. According to ZDNet, the researchers identified employees of the NSA, the White House, British intelligence agency MI6, the Russian GRU, and others. The data was also used to identify staff at nuclear storage facilities, missile silos, prisons, and locations like Guantanamo Bay.

Once a user is exposed through the technique discovered in the investigation, their location history becomes far more interesting and potentially revealing. For example, reporters at De Correspondent were able to spot users identified as foreign military and intelligence officers working out near sensitive government locations in the US.

Polar acknowledged the issue in a statement and said the situation is being addressed, though it downplayed the potential seriousness of the data exposure:

It is important to understand that Polar has not leaked any data, and there has been no breach of private data. Currently the vast majority of Polar customers maintain the default private profiles and private sessions data settings, and are not affected in any way by this case. While the decision to opt-in and share training sessions and GPS location data is the choice and responsibility of the customer, we are aware that potentially sensitive locations are appearing in public data, and have made the decision to temporarily suspend the Explore API.


This isn’t the first time a fitness app has accidentally exposed potentially sensitive information about the government and military. Earlier this year, fitness tracking app Strava came under fire after it was discovered that the company’s heat maps, which show user activity around the world, could be used to identify military bases, including some locations that were previously secret.



Nights and weekends editor, Gizmodo

I work in a sensitive location. It’s not a secret that I do, nor is it a secret who I’ve worked for, military or civilian, in the past. It’s actually all on my LinkedIn profile. Hell, go back and there are pictures of me in uniform on my Facebook as well, and that’s fine because it isn’t a secret.

When it’s important that information stays private, you just don’t put it up there, and you don’t use services that can track you. This is why, in actually secure facilities, you check in all your personal electronics when you enter and collect them when you leave.

At the end of the day, this is a personal choice by the users, and fundamentally there isn’t a problem with it. I’m sure there are a few people who probably should have set their profiles to private or not used the app at all, but if I were to make an educated guess, I’d say that’s a small percentage of the larger dataset.