Apple Pulls iPhone App That Allowed for Child Pornography Uploads

Apple has pulled BeautyMeter, the iPhone/iPod touch app that let users upload naked pictures of themselves for others to rate, after a 15-year-old girl published a picture showing her bare breasts and pubic hair.


Charlie Sorrel at Wired argues correctly that Apple will be damned with 17+ apps no matter what:

The problems for Apple are clear. By setting itself up as a guardian of the store, Apple can't win. Any time a controversial application is approved, or non-allowed elements are snuck into an application post-approval, Apple is blamed. If these apps are pulled ahead of time, Apple is called out as an evil censor.

However, that doesn't mean Apple should ban the 17+ sex-related app category altogether to avoid conflicts. There are plenty of adult-oriented applications that don't allow for this kind of dynamic content. But then again, the fact is that any application that lets you upload and share pictures could be used in exactly the same way. So where should Apple stop? Should they ban any app that can be used to publish pictures or videos? Shouldn't the developers, and the users, be the ones responsible for this, not Apple?

The problem for Apple is probably not a legal one but one of public perception, with people and the mainstream media assuming that, just because it runs on the iPhone, it is Apple's app. I'm afraid that, if they want to keep the market fully open, they will have to fight the public image battle instead of just pulling the applications that allow for this kind of behavior. [GadgetLab]


I am all about First Amendment rights and free speech, but, in the case of children, I have to applaud Apple for a ZERO tolerance policy. Not only is it the right thing to do, it's the right thing to do in the eyes of shareholders and customers.

Let kids stay kids, even if they don't want to. And for those who prey on children, KNOW that there is a special hot room in hell waiting just for you.