A family advocacy group called Parents Together published an open letter Thursday demanding a meeting with the heads of Meta and ByteDance, arguing that the companies knowingly expose children to a variety of dire threats, including the risk of suicide, and that they refuse to address these problems in favor of growth and profit.
The open letter describes a number of horror stories from families who say their children fell victim to the harms posed by social media, including suicides, accidental deaths from viral “challenges,” hospitalizations from eating disorders, sexual abuse, and more. Meta and ByteDance, the parent companies of Facebook and TikTok, respectively, “have imposed on unwitting children and families – anxiety and depression, cyberbullying, sexual predators, disordered eating, dangerous challenges, access to drugs, addiction to your platforms, and more – every single day,” Parents Together Action said in the letter. The companies “have chosen your profits, your stockholders, and your company over children’s health, safety, and even lives over and over again.”
Parents Together Action says it wants to bring Meta and ByteDance together for a meeting with parents whose children suffered harms linked to social media, including emotional trauma, dangerous eating disorders, and even death by suicide.
“These companies know how much harm their products are doing to kids, and yet they continue to experiment on American children without any concern for their health,” said Shelby Knox, campaign director for Parents Together Action.
Meta declined to give a statement on the record to Gizmodo in response to the letter, but shared a long list of changes the company has made to better protect children, including setting teenagers’ Instagram accounts to private by default, developing new technology to prevent adults from contacting or even encountering children, and making efforts to limit kids’ exposure to potentially harmful content. The company says it has made more than 30 changes to its platforms in recent years to protect children, including some called for by advocates such as Parents Together Action.
TikTok did not respond to a request for comment. A representative for Snap declined to comment. A Google spokesperson said, “We have invested heavily in creating safe experiences for children across our platforms and have introduced strong protections and dedicated features to prioritize their well being.”
“These companies spend a lot of money advertising what they’re doing to keep kids safe online, but it’s a PR stunt rather than actual steps towards keeping kids safe. All of their ‘solutions’ put the onus on parents, and it’s deeply disingenuous,” Knox said. “We want Meta and ByteDance to hear these parents’ stories. If they don’t think their platforms are unsafe, they can tell these parents to their faces.”
Over the weekend, newly unsealed documents from an ongoing lawsuit against Snap, Meta, Google, and ByteDance showed that employees and executives at the social media companies were well aware of the harms their services posed to children, but failed to address them. The lawsuit singles out Mark Zuckerberg, in particular, alleging he was warned personally.
According to the documents, an unnamed Meta employee wrote to Zuckerberg, “We are not on track to succeed for our core well-being topics (problematic use, bullying & harassment, connections, and SSI [suicide and self-injury]), and are at increased regulatory risk and external criticism. These affect everyone, especially Youth and Creators; if not addressed, these will follow us into the Metaverse.”
Meta chose not to address these issues because “the growth impact was too high,” the suit alleges.
Gizmodo is in the process of redacting and publishing thousands of internal documents from Meta as part of the Facebook Papers project. In a recently published batch of documents about children and teens, Meta employees discuss how Instagram can cause a “downward spiral” into eating disorders and body image issues. The employees likewise try to understand how kids are actually using Meta products. In other documents, employees discuss Meta’s efforts to target kids and keep them on Instagram and Facebook.
Lawmakers are paying closer attention to the risks social media poses to children. At a recent Senate Judiciary hearing, Congress heard from victims and experts about suicides, cyberbullying, social media addiction, and more. A number of proposed bills seek to force the tech industry to do more to address such problems.
If you or someone you know is having a crisis or contemplating suicide, please call or text the Suicide and Crisis Lifeline at 988. You can also call the National Suicide Prevention Lifeline at 800-273-8255 or text the Crisis Text Line at 741-741.