Meta held accountable for teen harm allegations — what comes next?
Meta faces accountability over alleged harm to teens, raising questions about platform safety, regulation, and future social media protections.
Tech giant Meta suffered a major legal setback last week after losing a lawsuit brought by the state of New Mexico, marking the first instance in which a court has found the company liable for endangering the safety of minors. The significance of this ruling was immediately compounded the following day, when a jury in Los Angeles delivered another blow, determining that Meta knowingly designed its platforms in ways that encouraged addictive behaviour among children and teenagers. The jury concluded that this contributed to harm suffered by the plaintiff, a 20-year-old identified as K.G.M.
These back-to-back rulings could pave the way for a surge of legal actions focused on Meta’s strategies to attract and retain younger users, despite internal awareness of potential negative effects on mental health. Thousands of similar cases, including ones modelled after K.G.M.’s lawsuit, are already pending. In addition, attorneys general from 40 U.S. states have filed cases resembling the one brought by New Mexico.
Unlike previous legal challenges involving social media platforms, which often centred on user-generated content, these cases focused instead on the structural design of the platforms themselves. Features such as infinite scrolling and constant notifications were examined as potential contributors to addictive usage patterns.
According to digital media attorney Allison Fitzpatrick of Davis+Gilbert, this legal approach echoes tactics previously used in litigation against the tobacco industry. By shifting attention away from content and toward product design, plaintiffs were able to sidestep traditional First Amendment defences. Fitzpatrick noted that this strategy proved effective in both cases.
Following a six-week trial, the New Mexico jury determined that Meta had violated the state’s Unfair Practices Act and imposed the maximum penalty of $5,000 per violation, resulting in a total fine of $375 million. In the Los Angeles case, the jury apportioned 70% of the liability to Meta and 30% to YouTube, awarding a combined $6 million in damages. Other companies, including Snap and TikTok, had settled their portions of the case before trial.
While the financial penalties themselves may appear modest relative to Meta’s scale, legal experts warn that the cumulative impact of multiple similar rulings could become substantial.
Meta has indicated that it plans to challenge the outcomes. A company spokesperson stated that it disagrees with the verdicts and intends to appeal, arguing that attributing teen mental health issues to a single factor oversimplifies a complex issue and overlooks the positive role that online communities can play in providing connection and support.
During the litigation, previously undisclosed internal Meta documents were made public. These materials suggested a pattern in which the company was aware of potential harms associated with its platforms but did not take sufficient action. Some documents also pointed to deliberate efforts to increase engagement among teenage users, including encouraging usage during school hours and via secondary "finsta" accounts — informal profiles that teens use to avoid parental or institutional oversight.
One internal report from 2019 summarised findings from 24 in-depth interviews with users whose behaviour had been flagged as problematic, a category estimated to account for about 12.5% of users. The report concluded that external research indicated a negative impact of Facebook on users’ well-being.
Other internal communications cited statements by Mark Zuckerberg and Adam Mosseri emphasising the importance of increasing engagement among younger audiences. In one instance, Zuckerberg suggested that for Facebook Live to succeed with teens, the platform would need to avoid alerting parents or teachers. Internal emails also revealed candid remarks from employees about optimising product features to encourage frequent app usage, even during school activities.
Meta has responded by noting that many of the documents date back nearly a decade and emphasising that it has since taken steps to improve safety. The company pointed to features such as Instagram Teen Accounts, introduced in 2024, which include default privacy settings, restrictions on tagging and mentions, and time-limit reminders that notify users after 60 minutes of use. For users under 16, changes to these settings require parental approval.
Former Meta employee Kelly Stonelake, who worked at the company from 2009 to 2024 and is currently involved in a separate lawsuit alleging workplace discrimination, said the revelations align with her own experiences inside the organisation. Stonelake previously led go-to-market efforts for the VR platform Horizon Worlds and claimed that concerns she raised about inadequate moderation tools for younger users were not sufficiently addressed.
The issue of online safety for minors has also attracted attention at the federal level, particularly following disclosures by whistleblower Frances Haugen in 2021, which revealed internal research suggesting that Instagram negatively affected teenage girls’ mental health.
Lawmakers have introduced several proposals to improve online safety for children. However, some privacy advocates argue that certain measures — such as age-verification requirements — could raise broader concerns about censorship and surveillance.
Stonelake, who previously supported the Kids Online Safety Act, has since expressed reservations about its current version. She pointed to provisions that could override state-level regulations and potentially limit the ability of families, schools, and states to pursue legal action against technology companies.
The broader debate highlights the complexity of regulating digital platforms, as policymakers attempt to balance user safety, privacy rights, and freedom of expression. As legal pressure mounts and additional cases move forward, the outcomes of these recent rulings could play a critical role in shaping how social media companies design and operate their platforms in the future.