In 2022, parents in New York sued Meta, alleging that their daughter’s compulsive use of Instagram led to an eating disorder, self-harm, and suicidal thoughts.
The lawsuit argues that Instagram’s algorithms exposed the girl to harmful content, worsening her body image issues.
This lawsuit is part of a broader pattern: research has linked social media use among teens to harmful behaviours such as isolation, self-harm, and suicidal ideation, and in October last year 33 US states filed a case against Meta.
“Kids and teenagers are suffering from record levels of poor mental health and social media companies like Meta are to blame,” said New York Attorney General Letitia James in a statement.
“Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem.”
“We know that parents are deeply concerned about the impact of social media on their teens, and we’ve listened,” said Mark Zuckerberg, CEO of Meta, in a statement introducing Teen Accounts.
“Teen Accounts are designed to help ensure that teens can use Instagram safely, with the protections they need in place from the start.”
The urgency of these protections was underscored by internal documents, known as The Facebook Papers, which were made public in 2021.
These papers revealed that Meta was aware that Instagram exacerbated body image issues for one in three teenage girls.
Frances Haugen, a former Facebook employee turned whistleblower, testified before the US Senate that Meta was aware of how its platform contributed to issues such as eating disorders, but failed to take meaningful action.
Dr Rachel Rodgers, associate professor of Applied Psychology at Northeastern University, added her support to the launch, saying that “Instagram Teen Accounts reflect the importance of tailoring teens’ online experiences to their developmental stages.”
Given the research that has emerged, it’s clear that younger users need additional safeguards to prevent exposure to harmful content.
The newly launched Teen Accounts automatically place teenagers under 16 into protected settings that limit who can contact them, restrict exposure to sensitive content, and ensure that their time on the platform is monitored.
Meta’s efforts to limit potential harm include default private accounts for teens, restrictions on messaging from unknown users, and the automatic application of the strictest settings for content filters.
Teens will also receive notifications to limit screen time, while a “sleep mode” will mute notifications overnight.
Parents have the option to control and supervise their child’s experience further, adding another layer of protection.
Yvonne Johnson, president of the US National PTA, praised the initiative, stating, “These protections show that Meta is taking steps to empower parents and provide safer, more age-appropriate experiences on the platform.”
As these protections roll out in the coming months across the US, UK, Canada, and Australia, Meta aims to reassure parents and policymakers that it is taking proactive steps to address the challenges social media poses to young users.
“These updates offer a balanced approach, empowering parents with essential oversight while respecting teens’ right to participate and explore,” added Lucy Thomas, CEO and co-founder of anti-bullying initiative Project Rockit.
“In an evolving online landscape, it’s vital that young people engage meaningfully and safely.”