Yesterday afternoon, Apple and Google’s highly anticipated contact-tracing tech was officially rolled out, with Apple already baking it into the backend of its latest update. The question now is whether it’ll be the godsend for public health that both of the tech giants initially pitched, or if doubts surrounding the tech’s overall efficacy and trustworthiness will win out.
As we’ve previously described, this joint API is meant to be built into apps from public health authorities. People who voluntarily download those apps have their phones turned into Bluetooth beacons that track whether they’ve been in close contact with someone who tested positive for covid-19. It’s a pitch that’s already won over officials in 22 countries, along with those in several states across the US.
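To make the idea concrete, here's a rough sketch of the decentralized scheme in Python. This is an illustration of the general approach, not the actual Exposure Notification spec (the real protocol uses HKDF- and AES-based key derivation, specific key sizes, and risk scoring); the function names and parameters here are invented for the example.

```python
import hashlib
import os

# Hypothetical simplification of the decentralized exposure-notification idea.
# Key sizes, derivation, and interval counts are illustrative, not the real spec.

def daily_key() -> bytes:
    """Random per-day key that never leaves the phone unless the user
    tests positive and chooses to share it."""
    return os.urandom(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    """Short-lived identifier derived from the day key; this stands in for
    what the phone broadcasts over Bluetooth every few minutes."""
    return hashlib.sha256(day_key + interval.to_bytes(4, "big")).digest()[:16]

# Phone A broadcasts rolling IDs; phone B records the ones it hears nearby.
a_key = daily_key()
heard_by_b = {rolling_id(a_key, i) for i in range(96)}  # a day of 15-min intervals

# Later, A tests positive and uploads only its day key. B downloads that key
# and re-derives the identifiers on the device itself, so the server never
# learns who met whom.
derived = {rolling_id(a_key, i) for i in range(96)}
exposed = bool(heard_by_b & derived)
print(exposed)  # True: B overheard A's identifiers during the window
```

The on-device matching at the end is the crux of the privacy argument in this story: the match happens locally, which is also why centralized-minded health authorities find the design limiting.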
That’s not to say that everyone’s a convert. While states like North Dakota, South Carolina, and Alabama, along with developers in Germany and Switzerland, are embracing the new API for their own apps, other authorities have been less than accepting. In the UK, for example, officials reportedly balked at the companies’ choice to keep any data collected via these apps decentralized by default: a choice that preserves user privacy but also makes it harder to track the virus’s spread over time. Meanwhile, other authorities, some on US soil, were turned off by the API’s refusal to collect location data from a device’s GPS.
That said, the apps we’ve seen developed thus far leave a lot to be desired. The UK’s app, built without Apple and Google’s involvement, ended up a broken mess that actually violated local digital-privacy laws. In India, the Bluetooth beacons used to tie citizens’ phones together inadvertently opened those devices to some pretty basic (and pretty damaging) hacks. And an initial release of Russia’s contact-tracing app tracked not only a downloader’s location but their entire address book and camera roll as well.
Right now, the tradeoff for the governments adopting these apps seems to be between efficacy and user privacy. Authorities that forge ahead with their own app risk creating something buggy and battery-draining. Those that partner with Apple and Google might end up with an app that’s more effective on a technical level but lacking when it comes to the public health goals it was created for in the first place.
And all of this is assuming that people will be downloading these apps in the first place. A recent poll from the University of Maryland clocked roughly 40% of respondents as no-gos for downloading any app developed with the Apple-Google API, with most respondents citing the companies’ shoddy track record on privacy as the reason.
All of which means that, right now, public health officials face an uphill battle in onboarding this tech, and one that might not actually pay off in the long run. After all, what’s the point in pushing an already skeptical public to download an Apple- and Google-backed app when the general consensus is that it might not actually do what it promises?