
The world's largest tech companies have yet to prove that they can effectively and expeditiously moderate their hellish platforms, and Instagram is no exception.
On Friday, Business Insider reported that IGTV, Instagram's new standalone video service, recommended disturbing videos to users, "including what appeared to be child exploitation and genital mutilation." The outlet says it monitored the app's recommendation sections for nearly three weeks, using both journalists' accounts and an account set up as a 13-year-old with no activity history.
Instagram characterized the platform as "a new app for watching long-form, vertical video from your favorite Instagram creators" when it announced IGTV in June. According to Business Insider, IGTV's algorithm recommended a video of a young girl in a bathroom who is about to take her top off before the video ends. The video, titled "Hot Girl Follow Me," was reportedly suggested to both the journalist and the fake child account. The news outlet reports the account that posted the video also uploaded "Patli Kamar follow me guys plzz," which was recommended to the fake child's account and reportedly displayed a young girl "exposing her belly and pouting for the camera."
The videos were reportedly uploaded by different accounts and stayed active on the app for five days, removed only after Business Insider reached out to Instagram's media contact. According to Business Insider, the videos had racked up over one million views before Instagram scrubbed them from the platform. The news outlet reports the accounts were not deactivated because the content, not the accounts, violated Instagram's policy.
IGTV also reportedly recommended a video of a penis being mutilated by a motorized saw to the fake child's account; that video, too, was removed after Business Insider reported it, though the account reportedly was not. An Instagram spokesperson told Gizmodo in an email that the company had removed all of the content from IGTV that Business Insider reported to it, and added that one of the accounts had been disabled for violating its community guidelines.
"We care deeply about keeping all of Instagram, including IGTV, a safe place for young people to get closer to the people and interests they care about," the spokesperson said. "We have Community Guidelines in place to protect everyone using Instagram and have zero tolerance for anyone sharing explicit images or images of child abuse."
The spokesperson said that the company works with law enforcement and the Child Exploitation and Online Protection Command (CEOP) to deal with the issue.
The spokesperson added that Instagram has a "trained team of reviewers who work 24/7 to remove anything which violates our terms." They also said that both Instagram and Facebook (which owns the service) are amping up their safety and security teams; Facebook pledged last year to double this team to 20,000 by the end of 2018.
There are, of course, a number of technical efforts platforms like Facebook and Instagram deploy to try to tackle issues like child abuse, including algorithms that automatically scan for and remove known images of exploitation. But as we've seen time and time again, even thousands of human reviewers and some of the most sophisticated algorithms aren't enough to sift through the endless stream of content uploaded to these platforms. What's increasingly clear is that a reporter emailing the company's press line shouldn't be the most effective line of defense.
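For readers curious about what that automated scanning involves, here is a minimal, illustrative sketch of hash-based screening, on the simplified assumption that a platform keeps a database of digests of known abusive images. Production tools such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding; the exact-match version below is far cruder, and every name in it is hypothetical.

    import hashlib

    # Hypothetical set of digests of known abusive images. Real platforms
    # match against hash databases maintained with organizations like
    # NCMEC; this placeholder value is not real data.
    KNOWN_BAD_HASHES = {"0" * 64}

    def sha256_of_file(path):
        """Return the SHA-256 hex digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def screen_upload(path):
        """Flag an upload whose digest matches a known-bad entry.

        Exact hashing only catches byte-identical copies; a single
        re-encode or crop defeats it, which is why production systems
        rely on perceptual hashing (e.g., PhotoDNA) instead.
        """
        return sha256_of_file(path) in KNOWN_BAD_HASHES

Even the perceptual version of this check can only catch content that has been seen and catalogued before, which is part of why novel uploads like the ones Business Insider found can slip through until a human flags them.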