Instagram chief questioned over delayed rollout of teen safety tools, including nudity filter, court filing shows
A court filing reveals Instagram’s head faced questions over delays in launching teen safety features, including a nudity filter designed to protect young users on the platform.
Plaintiffs' lawyers in a lawsuit examining whether social media apps like Instagram are addictive and harmful pressed Meta on why it took the company years to introduce certain basic teen safety features, including a nudity filter for private messages. In April 2024, Meta rolled out a tool that automatically blurs explicit images in Instagram Direct Messages (DMs), even though the company had reportedly recognised the issue nearly six years earlier.
A newly unsealed deposition in a federal case shows Instagram chief Adam Mosseri being questioned about an August 2018 email thread that included Meta's VP and Chief Information Security Officer, Guy Rosen. In that exchange, Mosseri wrote that "horrible" things could happen through Instagram's private messaging system. The plaintiffs' lawyer suggested that those "horrible" things could include unsolicited sexual images, including explicit photos sent to minors, and Mosseri agreed.
At the same time, Mosseri pushed back on questioning that implied Meta should have more directly warned parents that Instagram's messaging system was not actively monitored for the most harmful content, beyond the removal of child sexual abuse material (CSAM).
“I think that it’s pretty clear that you can message problematic content in any messaging app, whether it’s Instagram or otherwise,” Mosseri said. He also said the company was trying to balance users’ privacy expectations with the need to improve safety.
The deposition also surfaced additional statistics related to harmful experiences on Instagram. According to the testimony, 19.2% of surveyed users ages 13 to 15 said they had encountered nudity or sexual images on Instagram that they did not want to see. Separately, 8.4% of respondents in the same age range said that, within the previous seven days of using the app, they had seen someone on Instagram harm themselves or threaten to do so.
While Instagram has introduced multiple teen-focused safety updates in recent years, the plaintiffs' lawyers focused less on whether the platform is safer today and more on why it took so long to address known risks, particularly around sexual content being sent directly to minors.
Mosseri was also questioned on other related issues, including a 2017 email from a Facebook intern who reportedly wrote that he wanted to identify "addicted" Facebook users and explore whether there were ways to help them.
The August 2018 email chain was presented as one example meant to show that Meta understood the risks facing minors. Yet it was not until 2024 that Meta introduced a product feature aimed at reducing teens' exposure to sexual imagery in DMs, including explicit images sent by adults who may be engaged in grooming, a process in which an adult builds trust with a minor over time with the intent to manipulate, exploit, or sexually abuse them.
Asked for comment, Meta spokesperson Liza Crenshaw pointed to other efforts Meta says it has taken over the years to improve teen safety. “For over a decade, we’ve listened to parents, worked with experts and law enforcement, and conducted in-depth research to understand the issues that matter most. We use these insights to make meaningful changes—like introducing Teen Accounts with built-in protections and providing parents with tools to manage their teens’ experiences. We’re proud of the progress we’ve made, and we’re always working to do better,” she said.
Mosseri’s deposition is part of a broader wave of litigation seeking to hold major tech platforms responsible for harms to teens. This specific case is being heard in the U.S. District Court for the Northern District of California. Plaintiffs argue that social media platforms are defective because they are designed to maximise time spent on the apps, encouraging addictive behaviour among teens. Defendants in the case include Meta, Snap, TikTok, and YouTube (Google).
Related lawsuits are unfolding in the Los Angeles County Superior Court and in New Mexico.
Across these cases, attorneys are seeking to demonstrate that large technology companies prioritised user growth and engagement metrics over potential harms to their youngest users.
The legal push is underway as governments move toward stricter rules on teen social media use, with a growing number of laws introduced or passed in multiple U.S. states and other countries.