
Google Opens Up Tool to Make Your Privates Private

The Alphabet-owned company is open-sourcing an AI privacy tool to blur objects in videos such as license plates, passwords, faces, or… other things.

Google’s new open source tool Magritte can automatically identify objects in videos and blur them.
Photo: DANIEL CONSTANTE (Shutterstock)

On Friday, Google announced that its machine learning tool Magritte is going open source. According to information sent to Gizmodo, the tool detects objects in images or video and automatically blurs them when they appear on screen. Google says the object itself doesn’t matter: the blur can be applied to, for example, license plates or tattoos.
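Magritte’s internals aren’t spelled out in the announcement, but the core idea of blurring a detected region is simple. Here’s a minimal sketch in plain Python, assuming a grayscale frame represented as a nested list and a hypothetical bounding box of the kind an object detector would report (the names and interface here are illustrative, not Magritte’s actual API):

```python
def blur_region(frame, box, k=3):
    """Apply a simple mean blur inside a detected bounding box.

    frame: 2D list of grayscale pixel values (rows of ints).
    box:   (x0, y0, x1, y1) region, as a detector might report it
           (hypothetical -- Magritte's real interface is not public).
    k:     blur radius; each pixel becomes the mean of its
           (2k+1) x (2k+1) neighborhood, clipped at frame edges.
    """
    h, w = len(frame), len(frame[0])
    x0, y0, x1, y1 = box
    out = [row[:] for row in frame]  # copy so the input is untouched
    for y in range(y0, y1):
        for x in range(x0, x1):
            vals = [frame[yy][xx]
                    for yy in range(max(0, y - k), min(h, y + k + 1))
                    for xx in range(max(0, x - k), min(w, x + k + 1))]
            out[y][x] = sum(vals) // len(vals)
    return out

# A frame with a sharp bright square; blurring softens its edges.
frame = [[0] * 10 for _ in range(10)]
for y in range(3, 7):
    for x in range(3, 7):
        frame[y][x] = 255
blurred = blur_region(frame, (2, 2, 8, 8))
```

A real pipeline would run this per frame on color images, with the box coming from the detector; the principle is the same.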

Google also said the code is useful for video journalists looking to anonymize the subjects they’re speaking to with “high-accuracy.” Magritte is a very interesting tool in and of itself, with uses far outside the realm of digital privacy. We don’t have to say it, but of course it could be used to censor more NSFW content on the internet (it’s porn, people, it’s always porn). The tool joins a host of other privacy-focused tools Google developers have released on the web.

In addition to Magritte, Google is also extolling another so-called privacy-enhancing technology (PET) called the Fully Homomorphic Encryption Transpiler, a phrase that sounds like something straight out of a Star Trek script. The code encrypts a dataset in such a way that developers can run computations on it without ever being able to access the personal user information inside. Google open-sourced the FHE Transpiler last year, and it has since been used by the company Duality for performing data analysis on normally-restricted datasets. Duality claimed the data can be processed “even on unsecured systems” since it “satisfies all the various privacy laws simultaneously.”
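Fully homomorphic encryption itself is heavyweight, but the underlying trick of computing on encrypted data can be shown with a much simpler, related scheme. The toy below is a tiny Paillier cryptosystem, which is only *additively* homomorphic (multiplying two ciphertexts adds the hidden plaintexts), not fully homomorphic like Google’s transpiler, and it uses deliberately tiny, insecure primes purely for illustration:

```python
import math
import random

# Toy Paillier cryptosystem -- additively homomorphic.
# Tiny primes for illustration only; NOT secure.
p, q = 101, 103
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # decryption constant

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Multiplying ciphertexts adds the plaintexts underneath --
# the sum is computed without ever decrypting the inputs.
c1, c2 = encrypt(12), encrypt(30)
c_sum = (c1 * c2) % n2
print(decrypt(c_sum))  # 42
```

The party doing the multiplication never sees 12 or 30, only ciphertexts; whoever holds the key decrypts the result. FHE extends this idea to arbitrary computations, which is what makes Duality’s restricted-dataset analysis possible.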


Of course, that’s a big claim, though in some cases the technology does promise to help comply with certain regulations. The European Union’s General Data Protection Regulation, for example, forces researchers to implement a certain level of security for personal data, which could be anything from a person’s name to their email address, phone number, or government ID. Meanwhile in the U.S., there is a jumble of state and federal privacy laws that have so far not stopped many companies from buying or selling personal data of all stripes. Really, most companies both big and small (along with military and law enforcement, for that matter) have not been forced to anonymize much or any of the data they’re working with.

So while Google’s open source FHE Transpiler seems like a good tool for allowing researchers to peruse helpful data while keeping users’ private information private, it won’t see much pickup as long as there remains no overarching privacy law in the U.S.


In its release, Google extolled the benefits of its PET projects and its Protected Computing initiative. The company further said, “we believe that every internet user in the world deserves world-class privacy, and we’ll continue partnering with organizations to further that goal.” The company has also mentioned it’s working on end-to-end encryption for Gmail, which would be a great development for one of the world’s largest email platforms.

Of course, that ignores Google’s own role in the data privacy issues we see today. The company recently paid $392 million to settle a lawsuit brought by 40 state attorneys general after it allegedly misled users about when it was siphoning their location data.