Meta Knew How Its Platforms Hurt Kids, but Chose Not to Disclose the Information

In a significant legal development, Tennessee Attorney General Jonathan Skrmetti has filed an unredacted copy of an existing lawsuit against Meta Platforms Inc., formerly Facebook.
The lawsuit, initially filed in October 2023, alleges that Meta’s Instagram platform poses serious mental health risks to its young users.
According to Attorney General Skrmetti, the unredacted complaint reveals that Meta was fully aware of the harm its platforms were causing to children and adolescents. Despite this knowledge, the company allegedly prioritized financial gains over the well-being of its young user base.
Skrmetti affirmed that Tennessee law protects children from companies, big or small, that mislead and harm them, and said, “We will continue to aggressively enforce that law.”
Strong Allegations
The complaint outlines several serious allegations against Meta. One centers on brain development research: Meta reportedly studied teen brain development to drive product changes aimed at making the platform more addictive.
The findings indicated that teenage brains are highly sensitive to dopamine, contributing to addictive behaviors and prolonged scrolling. Despite this knowledge, Meta allegedly failed to address these risks adequately.
Harmful filters are also covered in the filing. Meta founder Mark Zuckerberg allegedly intervened personally to introduce “selfie” filters that mimicked the effects of plastic surgery. Despite warnings from experts within the company, these filters allegedly had devastating effects on young women, contributing to body dysmorphia.
The complaint further alleges that Meta deliberately ignored addictive behavior. Teens often describe their Instagram use through an “addict’s narrative,” acknowledging negative behaviors but feeling powerless to resist them, and Meta allegedly downplayed these concerns.
Another striking accusation concerns the misleading “Time Spent” tools, which Meta promoted as a way for users and parents to manage engagement on Instagram. When inaccuracies in the underlying data surfaced, however, Meta allegedly chose to mislead users rather than address the issue transparently.
Regarding harm to young women, Meta’s internal research revealed that nearly half of teen girls frequently compare their appearance to others’ on Instagram, with one-third feeling intense pressure to look perfect.
The platform’s impact on body image and self-esteem remains a significant concern.
The complaint also alleges that young users regularly encounter harmful content on Instagram. Among 13- to 15-year-olds, 13% reported receiving unwanted advances, 27.2% witnessed bullying, and 8.4% encountered self-harm-related content within the previous seven days. Meta allegedly failed to disclose these findings.
Former Meta employees have claimed that the company intentionally designed its platforms to be addictive, capturing more of users’ time and data.
This lawsuit sheds light on Meta’s practices and raises critical questions about the responsibility of tech companies toward their young users’ mental health.
Whistleblower Testifies
In a Senate panel hearing, a former Meta employee turned whistleblower revealed that top executives at the social media giant were aware of the detrimental impact the Facebook platform had on young people.
Despite their knowledge, they allegedly concealed the information to safeguard their profits.
Arturo Bejar, the former director of engineering for Facebook’s Protect and Care program, presented compelling statistics to corporate executives.
These data demonstrated that teenagers and children frequently suffer from depression due to online bullying, sexual solicitation, and body shaming facilitated by the platform. Tragically, some individuals were driven to suicide.
Bejar emphasized that Meta executives, including Mark Zuckerberg and Sheryl Sandberg, were fully aware of the harm inflicted on children.
He asserted, “Their awareness meant they knew about the issue but failed to take action. It is time for Congress to act.”
The Senate Judiciary Subcommittee on Privacy, Technology, and the Law cited Bejar’s testimony as a compelling reason to advance the proposed Kids Online Safety Act.
This legislation aims to protect minors from abuses on social media platforms. If passed, it would empower state attorneys general to enforce regulations through legal action.
Key provisions of the bill would require Facebook, Instagram, and other platforms to implement protective settings and algorithmic safeguards. These measures would restrict access to minors’ personal data and give parents tools to manage privacy settings on their children’s accounts.
Additionally, social media platforms would be prohibited from advertising age-restricted products or services to children.
They would also be required to report annually to the federal government on foreseeable risks to minors stemming from platform use.

Anti-Censorship
However, social media platforms and anti-censorship groups oppose these restrictions.
They argue that such measures could lead to increased government surveillance of online free speech and hinder artists’ work.
Critics of the proposed legislation, some of whom organize under the “Don’t Delete Art” banner, fear that it could cut users off from social interaction and easy access to information.
As of the third quarter of this year, Facebook counts approximately 3.05 billion monthly active users worldwide. Platform usage increased by 0.63% from the previous quarter and 3.08% year over year.
Meta reported a staggering $116 billion in revenue last year, with the majority—$113.6 billion—generated through advertising. Advertising rates correlate directly with the number of Facebook users.
Senator Marsha Blackburn (R-Tenn.) expressed skepticism regarding Facebook’s claims of sincere efforts to protect children from potentially harmful online content. She suggested that profit motives might be the driving force behind their actions.
The renewed Senate effort to address alleged social media abuses stems, in part, from a recent lawsuit filed by state attorneys general against Meta.
The lawsuit accuses the company of endangering children through addictive features on its Facebook and Instagram platforms.
Meta’s algorithms, alerts, notifications, and infinite scroll features allegedly encourage prolonged usage among kids, exacerbating mental health issues.
Photo filters, in particular, are said to contribute to body dysmorphia-related depression.
The lawsuit seeks an injunction to halt these harmful features, along with restitution and penalties.
During Tuesday’s Senate hearing, Senator Dick Durbin (D-Ill.) highlighted the challenge of reining in Facebook and Instagram’s dominance. He noted that Big Tech wields significant influence in this arena, often thwarting legislative reforms through extensive lobbying efforts.
Meta responded to the lawsuit by affirming its commitment to providing safe and positive online experiences for teens. The company claims to have already introduced over 30 tools to support teens and their families.