Amazon is being hit with a class-action suit alleging that the tech giant’s servers are storing biometric voice data from countless callers, in contravention of an Illinois privacy law.
At the center of the suit is Amazon Connect—a suite of call-center software that Amazon Web Services began licensing out in 2017. One of the companies Amazon partnered with in order to offer this call-center service, Pindrop Security, specializes in creating what are known as “voiceprints,” which can be used to identify and “authenticate” callers by the cadence of their voice. These specific vocal quirks—much like an iris scan, a fingerprint, or a facial scan—fall under the umbrella of “biometric data” under Illinois’s Biometric Information Privacy Act (BIPA). There’s a chance that Amazon ran afoul of the state law by collecting that data without obtaining callers’ consent, storing it on AWS servers, and failing to publicly disclose its data retention policies.
The three plaintiffs behind the suit came into contact with Pindrop’s tech when they called the customer support line for John Hancock, a major life insurance provider, and were told that they were “no longer required” to use a PIN to sign in, thanks to Pindrop’s ability to authenticate their calls based on sound alone.
“AWS knowingly intercepted the telephone calls made by Plaintiffs to John Hancock and collected and stored Plaintiffs’ biometric data harvested from those calls, [and] Pindrop knowingly accepted and analyzed intercepted telephone calls to collect and store Plaintiffs’ biometric data,” the suit says.
It then goes on to explain that Pindrop offers its “biometric data software” as a service, distributing that software (and the resulting data) to its customers for a hefty fee, all without any consent from the callers. Because the three plaintiffs behind the suit are Illinois-based, they were able to point out that this sort of profiteering directly violates some of the core tenets of BIPA. “Pindrop does not tell Plaintiffs it is profiting from its harvesting of their biological information, nor does it obtain their consent. Even if it did obtain consent—though it did not—Pindrop’s practice of profiting from Plaintiffs’ biometric data is a BIPA violation,” the suit claims.
Importantly, the suit identifies exactly why covert recording and storage of biometric data is not only creepy but dangerous (emphasis ours):
When a passcode is used as a security measure, in the event of a data breach an individual may simply change the passcode to prevent unauthorized access to the individual’s compromised account. By contrast, when call centers or customer service personnel use voice biometrics for authentication, in the event of a data breach there is nothing an individual can do to prevent someone from using the individual’s voice biometrics to gain unauthorized access to the compromised account.
Even though this suit involves three Illinois residents invoking the state’s privacy legislation, a judge for the Southern District of Illinois dismissed an earlier version of this case against Amazon and Pindrop last month, on the grounds that neither company “purposefully directed their activities at Illinois citizens.” The fact that the three callers were based in Illinois wasn’t enough to grant the state jurisdiction, since they were making calls to a Boston-based life insurance provider that happened to contract out to two corporations—Amazon and Pindrop—that were both incorporated in Delaware. As a result, today’s suit is effectively a mulligan on the first, but filed in the District of Delaware.
When contacted by Gizmodo, Andrew Schlichter, one of the lead counsels on the case, simply said: “We believe that the law was violated, and look forward to bringing the claims to trial before a jury.”
This isn’t the first time that BIPA’s been invoked against a major tech player. Last year, Apple was hit with a similar class action alleging the company unlawfully stored the countless voiceprints collected from people using Siri every day. Just a few months earlier, Google was hit with a similar class-action suit over its Google Assistant feature. At the time, the company tried to dodge the claims by (incorrectly) alleging that the plaintiffs needed to prove real, tangible harm coming from this sort of voiceprint collection—something the Illinois Supreme Court has ruled is not required under BIPA.
We’ve reached out to representatives at AWS and Amazon, and will update this piece when we hear back.