This Japanese Smartphone Uses AI to Prevent Users From Saving and Sharing Naked Selfies


Worried about what online shenanigans your child might get into with their first smartphone? In Japan, a company has released a budget phone with at least one feature you won’t find on a flagship iOS or Android device: an AI-powered image-recognition tool that prevents naked selfies from being saved or shared.


With a price tag of around $180, Tone Mobile’s Tone e20 boasts modest specs: a 6.2-inch display, a MediaTek Helio P22 processor running Android 9.0, a trio of rear cameras that top out at 13 megapixels, a rear-mounted fingerprint sensor, and a 3,900 mAh battery. Those aren’t specs you’ll be bragging to your friends about, but on paper the e20 is a decent device for anyone shopping for a smartphone on a budget, or for parents giving an irresponsible teenager their first taste of mobile computing.

But teens will be teens, and the transition through puberty can be difficult, often resulting in some poor decision-making, particularly with access to a device capable of sharing photos with billions of people online. That’s where the Tone e20’s “Smartphone Protection” feature comes into play. The device’s camera app includes a built-in (and seemingly mandatory) image-recognition tool that uses artificial intelligence to identify and automatically flag inappropriate photos.
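Tone Mobile hasn’t said how the recognition works under the hood, but features like this are typically built as a gate between the shutter button and the step that writes the photo to storage, with a small on-device image classifier scoring each capture first. Here’s a minimal, purely hypothetical sketch of that pattern in Kotlin using TensorFlow Lite; the model file, its 224-pixel input size, the single-score output, and the threshold are all assumptions for illustration, not Tone Mobile’s actual implementation.

```kotlin
import android.graphics.Bitmap
import org.tensorflow.lite.Interpreter
import java.io.File
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Hypothetical on-device gate: score each captured photo with a small
// TFLite classifier before the camera app is allowed to save it.
class CaptureGate(modelFile: File, private val threshold: Float = 0.8f) {
    private val interpreter = Interpreter(modelFile)

    // Returns true if the photo may be written to storage.
    fun allowSave(photo: Bitmap): Boolean {
        val input = toInputBuffer(photo)
        val output = Array(1) { FloatArray(1) }  // assumed single "inappropriate" score
        interpreter.run(input, output)
        return output[0][0] < threshold
    }

    // Resize to the (assumed) model input size and normalize RGB to [0, 1].
    private fun toInputBuffer(photo: Bitmap): ByteBuffer {
        val size = 224
        val scaled = Bitmap.createScaledBitmap(photo, size, size, true)
        val buffer = ByteBuffer.allocateDirect(4 * size * size * 3)
            .order(ByteOrder.nativeOrder())
        val pixels = IntArray(size * size)
        scaled.getPixels(pixels, 0, size, 0, 0, size, size)
        for (p in pixels) {
            buffer.putFloat(((p shr 16) and 0xFF) / 255f)  // R
            buffer.putFloat(((p shr 8) and 0xFF) / 255f)   // G
            buffer.putFloat((p and 0xFF) / 255f)           // B
        }
        buffer.rewind()
        return buffer
    }
}
```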


According to SoraNews24, Tone Mobile hasn’t released specific guidelines on what makes a photo acceptable or inappropriate, but the AI is tuned to spot nudity. Detecting it automatically triggers a warning that the photo isn’t allowed and will be immediately deleted, though that isn’t entirely true. For e20 phones given to teens by their parents, the Smartphone Protection feature can also be linked to the parent or guardian’s phone, which receives a notification whenever an attempt is made to snap an inappropriate selfie. The notification includes not only when the photo attempt was made, but also where it was snapped, using the e20’s GPS data, plus a pixelated thumbnail of the photo in question, which Tone Mobile promises is not stored on its cloud servers.
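The pixelated thumbnail, at least, is trivial to produce on-device: shrink the image to a coarse grid of blocks, then scale it back up without smoothing. A quick sketch using standard Android Bitmap calls (the function name and block count are illustrative, not Tone Mobile’s code):

```kotlin
import android.graphics.Bitmap

// Illustrative pixelation: downscale to a coarse grid, then upscale with
// filtering off (nearest-neighbor) so the blocks keep their hard edges.
fun pixelate(src: Bitmap, blocks: Int = 12): Bitmap {
    val tiny = Bitmap.createScaledBitmap(src, blocks, blocks, true)
    return Bitmap.createScaledBitmap(tiny, src.width, src.height, false)
}
```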

Teens are resourceful, however, and it remains to be seen how deeply embedded the naked selfie detection tools are in the Tone e20. Simply taking a screenshot with the camera app open, instead of pressing its shutter button, could potentially circumvent the censorship tool and allow inappropriate images to be easily stored and shared. The feature can also be disabled in the phone’s settings so consenting adults won’t be censored, but is presumably password-protected so that it can’t be easily turned off in the hands of younger users.

DISCUSSION

I’m sure their heart is in the right place, but I’m inclined to think this is a case of “what could go wrong?”

I mean, to start, the use of the term “AI” is way fucking off. I’m pretty sure a self-aware phone isn’t adapting its programming to learn the details of your 14-year-old daughter’s body. It’s probably more like YouTube’s nudity filter that flags certain skin tones or patterns.

Of course, YouTube has a lot more computational power, and ASSUMING this phone isn’t sending the unfiltered image off to a remote server for analysis, I can imagine it’s going to fail hilariously when it thinks some 15-year-old’s face looks like a butt, or when some darker-toned teen circumvents the system entirely because it wasn’t tested with a wide enough range of skin tones.
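For reference, the naive “skin filter” I’m describing is literally just a threshold on pixel color. Something like this back-of-the-napkin sketch (made up for illustration, obviously not the phone’s actual code) shows why it breaks down outside a narrow band of skin tones:

```kotlin
// Naive "skin detector" of the kind old filters used: count pixels whose
// RGB values fall inside a hard-coded "skin" range. The classic rule below
// was tuned on lighter skin tones, which is exactly why this approach
// misfires on faces and misses darker skin entirely.
fun skinFraction(pixels: IntArray): Double {
    var skin = 0
    for (p in pixels) {
        val r = (p shr 16) and 0xFF
        val g = (p shr 8) and 0xFF
        val b = p and 0xFF
        if (r > 95 && g > 40 && b > 20 &&
            r > g && r > b && r - minOf(g, b) > 15) {
            skin++
        }
    }
    return skin.toDouble() / pixels.size
}
```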

It also doesn’t help that it can send images, specifically suspected nude images of underage users, to another phone without their consent, even though it’s specifically advertised NOT to allow those images to be shared.