When Apple software chief Craig Federighi first announced in June that the company would be issuing app-specific “nutrition labels” as part of its privacy-focused iOS 14 update, there was reason to be cautiously optimistic. The basic pitch—that these labels would give us a chance to peruse an app’s data-collecting specs before downloading the app—is an idea that should keep the most invasive apps from hiding behind the most inscrutable privacy policies.
In a new information portal, Apple spells out some of the “privacy details” that app developers will soon be required to list on their app’s product page, apparently as part of the new privacy label. While we don’t know when the average app user will see these new labels (aside from a vague “later this year”), Apple’s new mandate states that devs submitting any new apps or app updates from Dec. 8 onward will be required to package their app with details describing the data their particular apps hoover from a given device, and how that data gets used.
The fine print here is key to understanding how any of this affects users. Vague, loophole-ridden definitions are exactly what bad actors use to keep collecting data from countless people, even if it’s the last thing users want. By better defining what “data” means, Apple could, hopefully, keep data-brokering developers from gaming these new “nutrition labels” the same way they’ve been gaming privacy regulations until now.
Put another way, the way Apple defines these terms dictates whether these labels actually, well, label the apps they’re slapped onto in any meaningful way.
Because there are so many definitions of the word “data” out there, Apple helpfully gives a list that app developers can check off when asked what flavor of data their apps collect. Skimming these disclosures won’t just help a person see if, say, a particular app tracks their location a bit more than they’d like; it’ll also show whether that app might try to sniff around their address book or web browsing history.
- If an app asks for “Contact Info,” then that means the app’s asking its users for any sort of intel that can be used to “contact [them] outside the app”; think names, email addresses, phone numbers, or physical addresses. This is a bit different than an app asking for “Contacts Info.” (Note the “s.”) Per Apple, this encompasses a list of contacts in a person’s phone or their address book.
- “Health and Fitness Info,” meanwhile, encompasses the sort of data pulled by clinical- or research-facing apps: health records that might be pulled from Apple’s Clinical Health Records API, movement data pulled from an Apple Watch app using the company’s Movement Disorder API, or any data gathered through the company’s HealthKit API. Apple defines “fitness” or “exercise” data as the kind collected by its Core Motion API, or its step-counting Pedometer API.
- Apps asking for “Financial Info” are asking for, well, financial info: things like the form of payment you’re using, your credit/debit card number, or your bank account number. Apps asking for “credit info” (like your credit score) also fall under this financial umbrella, as do apps asking for your salary, income, assets, or details about your debts. Per Apple, if an app has payment features baked in, but that payment intel is entered outside of the app—outside of a developer’s access—the company doesn’t consider that data “collected,” and it doesn’t need to be disclosed.
- Somewhat relatedly, Apple defines an app asking about your “Purchases” as one asking about your “purchases or purchase tendencies.”
- An app that asks for “Location Info” can mean one of two things. That app might be hoovering your “precise location,” which means that app tracks your locale using super granular latitude and longitude measurements, the way, say, a mapping app might. Otherwise, that app is collecting your “coarse location,” meaning that your location’s being collected at a lower resolution, like the kind that Apple’s approximate location tracking might offer. It’s worth noting that, as of iOS 14, you can manually go through each app that you’ve downloaded to decide whether a given app should have access to your precise location, your coarse location, or no location at all. You can also limit location sharing to only while the app is in use, or shut it off entirely.
- Apps asking for “Browsing History” aren’t only asking for your browsing history in-app, but any content you might have viewed “that is not part of the app, such as [on] websites.” Apps asking for your “Search History,” on the other hand, are only after intel about searches performed within the app itself.
- Apps asking for “User Content Info” run the gamut from apps asking for access to a user’s photos, videos, or voice recordings, to ones asking for access to any data generated within a particular game, or as part of a request for customer support.
- Apple marks an app as asking for “Sensitive Info” when it asks for data on a person’s race, ethnicity, or sexual orientation, as well as their religion, pregnancy status, or political opinions. Biometric data in all of its forms also falls under this umbrella.
- Finally, apps that collect “Usage Data” are measuring moves like the clicks or taps you make within the app, while “Diagnostic Data” covers details like crash logs or other app-specific technical diagnostics.
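The “precise” versus “coarse” location distinction above mostly comes down to resolution. Apple hasn’t published exactly how its approximate-location feature works, but the underlying idea—reporting coordinates on a coarser grid—can be sketched like this (the function and the coordinates here are illustrative, not Apple’s implementation):

```python
# Hypothetical sketch: truncating latitude/longitude lowers location
# resolution, in the spirit of iOS 14's precise vs. coarse location.
# Apple's actual approximate-location algorithm is not public.

def coarsen(lat: float, lon: float, decimals: int = 1) -> tuple:
    """Round coordinates to fewer decimal places.

    Roughly 0.1 degree of latitude is about 11 km, so keeping one
    decimal place turns a street-level fix into a city-sized area.
    """
    return (round(lat, decimals), round(lon, decimals))

precise = (40.741895, -73.989308)  # a specific Manhattan street corner
coarse = coarsen(*precise)         # (40.7, -74.0): "somewhere around NYC"
```

In other words, a “coarse” reading still says what city you’re in—it just stops short of saying which block.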
The important thing to remember here (and just in general) is that sometimes your more sensitive details can be inferred from pretty benign-looking kinds of data, so even Apple’s definitions leave room for additional prying behind the scenes.
Here’s an example: Let’s say you download an LGBTQ-focused dating app that, on its product page, says it only discloses basic usage details—like whether you downloaded this app, or how often you use it—to its third-party partners. Even if the app doesn’t collect any “sensitive info” from you on paper, the fact is that third parties can infer that you’re LGBTQ because you downloaded this particular app. I’ve written before about how marketers can use data from the specific apps you download (among other innocuous data points) to tie together everything from your race to your sexuality to your income.
Aside from listing off all the different types of data their app collects, app developers will also need to list whether each type is linked, by them or any of their third-party partners, to the app user’s identity—either via their Apple account or through some device-specific ID like the one that we broke down here.
As Apple puts it, “data collected from an app is often linked to the user’s identity, unless specific privacy protections are put in place before collection to de-identify or anonymize it,” through steps like stripping out any “direct identifiers.”
What Apple doesn’t mention here is that the bulk of data collected by shady sorts of vendors is typically free of any sort of directly identifying material by design. Going back to the dating app example from before, the kind of data that’s pulled when you download the app generally doesn’t include any sort of details that can be traced back to you, the person who downloaded that particular app. What it will probably include are identifiers that are either hashed or meet the baseline for what people in the marketing space agree is anonymous enough. Meanwhile, there are tech companies in the data-crunching space that exist solely to gobble up these anonymous identifiers in bulk and spit out a fully targetable audience on the other end. That means even if a certain app developer says that they won’t tie your download details to something more personal, there’s often nothing really stopping them from passing this data off to a third party to do the dirty work of “identity resolution.”
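To see why those hashed identifiers don’t offer much protection, remember that hashing is deterministic: anyone who hashes the same device ID the same way gets the same token, so two supposedly anonymous datasets can still be joined on it. A minimal sketch—the device ID, field names, and datasets here are all made up:

```python
import hashlib

def pseudonymize(device_id: str) -> str:
    """Hash a device identifier. "Anonymous" by ad-industry convention,
    but deterministic: the same device always maps to the same token."""
    return hashlib.sha256(device_id.encode()).hexdigest()

# A dating app's "de-identified" download log (hypothetical):
app_log = {pseudonymize("device-123"): {"installed": "dating_app"}}

# A data broker's separately collected profile, keyed the same way:
broker_db = {pseudonymize("device-123"): {"zip": "10001", "age_range": "25-34"}}

# "Identity resolution" here is just a join on the shared hash:
merged = {h: {**app_log[h], **broker_db[h]}
          for h in app_log.keys() & broker_db.keys()}
```

No names or emails ever change hands, yet the broker ends up with a single profile combining both datasets—which is exactly the loophole described above.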
Well, almost nothing. Apple’s guidelines state that any developer who states their app’s data collection isn’t “linked” to a given user “must not attempt” to pull any of these shenanigans, and “must not tie the data to other datasets that enable it to be linked to a particular user’s identity.” But because this sort of identity-tying work would be happening outside Apple’s purview, the company is pretty much asking its developers—including the shadiest players in the data-brokering space—to pinky swear that they won’t.
We’ll get into the weirdly complex nitty-gritty of how iOS 14’s tracking specs actually work in a future article. But in short: Apple defines an app that “tracks” its users as one that ties data from any of the categories above to data pulled from some external source—say, platforms or websites that aren’t owned by that app’s developer—and then uses that data specifically to serve those users targeted ads. Developers also meet the bar for “tracking” users when they share data pulled from a given app with a data broker.
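Read literally, that definition boils down to a two-part test. Here’s a hypothetical sketch of the logic—the flag names are mine, not Apple’s, and it ignores Apple’s carve-outs, like data shared solely for fraud prevention:

```python
# Hypothetical paraphrase of Apple's "tracking" bar, not official logic:
# an app "tracks" you if it (a) joins in-app data with third-party data
# to serve targeted ads, or (b) shares in-app data with a data broker.

def counts_as_tracking(joins_with_external_data: bool,
                       used_for_targeted_ads: bool,
                       shared_with_data_broker: bool) -> bool:
    return ((joins_with_external_data and used_for_targeted_ads)
            or shared_with_data_broker)
```

Note that the second clause stands alone: handing data to a broker counts as tracking even if no ads are ever served.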
Like the data that’s “linked” to you, any data that’s used to “track” you will be put on display as part of a given app’s product page whenever Apple rolls these labels out.
In some cases, whether an app is “tracking” you is pretty cut-and-dried. Just to use the example of that LGBTQ dating app again, let’s say its developer onboarded a software development kit (also known as an “SDK”) like the one used by Facebook’s Audience Network, which as you might remember solely exists for serving targeted ads inside third-party apps. In that case, that developer should be pretty aware that their app is going to do some degree of ad tracking and targeting whenever a given user downloads it.
Ultimately, the reason why that case is so cut-and-dried is that Facebook—believe it or not—is honest about describing what its SDKs actually do in the first place. Not every tech company follows that same MO. In an example from Apple’s own documentation, the company specifically calls out data shared with an imaginary data broker “solely for fraud detection or prevention or security purposes” as an example of data-sharing that wouldn’t fall under the tracking umbrella. But again, this is the type of disclosure that only really works if that broker’s being honest with that developer, and that developer is being honest with Apple.
So, now that we have more details about what Apple is requiring developers to disclose, should we still be cautiously optimistic? Overall, yes—this is a drastic improvement over the status quo we’ve been dealing with until now. But there’s still a level of trust we (and Apple) must put in developers, and there’s only so much power even Apple has to police a data-collection ecosystem that remains opaque, under-regulated, and filled with scoundrels.