Meta says it is taking additional steps to help protect young people on its platforms. Anyone under 16 can now be enrolled in new privacy settings meant to limit who can see their Facebook profile. Meta says it is also testing the removal of the messaging button on teens’ Instagram profiles when viewed by an adult.
In a company press release, Meta detailed its initiative to add safeguards to Facebook and Instagram intended to protect teenage users from “potentially suspicious adults.” The centerpiece of the initiative is a set of new privacy defaults for teen Facebook profiles.
Any new user under 16 (or under 18 in certain countries) will automatically be enrolled in the new settings, which restrict who can view their friends list, the posts they are tagged in, and the pages and people they follow, as well as who is allowed to comment on their posts. Pre-existing teenage users will not be automatically enrolled, but can opt in whenever they want.
On both Instagram and Facebook, teen users can now report a user immediately after blocking them in both apps’ messaging interfaces. Users will first see a prompt asking whether they know the person they are messaging, and can then block, restrict, or report the user, depending on the platform. The dialogue box also explains what blocking and reporting do, and guides the user to more detailed information pages.
Meta also announced that it is partnering with the National Center for Missing and Exploited Children to build a resource for teens who are concerned that lewd images of them are being distributed without their consent. Details were sparse on what this platform will look like, and NCMEC did not immediately return Gizmodo’s request for comment.
Jeanne Moran from Meta told Gizmodo in an email that the platform will allow teens to submit their own case to Meta and NCMEC. “For increased safety and privacy, images never leave the teen’s device and are never saved by NCMEC or participating tech companies,” Moran wrote. NCMEC will then generate a unique, anonymous hash of the image, which Meta’s machine-learning systems can use to find matches.
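Meta has not published technical details, but the general idea Moran describes, computing a fingerprint on the device so only the hash, never the image, is shared for matching, can be sketched roughly. The example below is a simplified, hypothetical illustration using a cryptographic hash from Python’s standard library; real systems of this kind typically use perceptual hashes (such as Meta’s open-source PDQ) so that visually similar, not just byte-identical, images match.

```python
import hashlib

def hash_image_locally(image_bytes: bytes) -> str:
    # Hypothetical sketch: the digest is computed on-device, so only
    # this string, not the image itself, would ever be transmitted.
    # Production systems use perceptual hashing (e.g. PDQ/PhotoDNA)
    # rather than SHA-256, so near-duplicate images still match.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_hash(candidate: bytes, known_hashes: set) -> bool:
    # Matching compares digests only; images never leave the device.
    return hash_image_locally(candidate) in known_hashes

# Example: register one image's hash, then check a copy against it.
original = b"example image bytes"
database = {hash_image_locally(original)}
print(matches_known_hash(original, database))  # True: exact copy matches
```

With a cryptographic hash, even a one-pixel edit produces a completely different digest, which is exactly why matching systems in this space favor perceptual hashes instead.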
Update November 21 11:40 am: This story was updated with additional comment from Jeanne Moran from Meta.