xAI faces a lawsuit over Grok AI allegedly generating inappropriate images of minors

Elon Musk’s xAI is facing a lawsuit alleging its Grok AI generated inappropriate images involving minors, raising serious concerns about AI safety and content controls.

Mar 21, 2026 - 08:39
Elon Musk’s AI company, xAI, is facing a lawsuit filed Monday in a California federal court, in which three anonymous plaintiffs argue that the company should be held responsible for allowing its AI systems to generate abusive sexual images involving identifiable minors.

The plaintiffs are seeking to pursue the case as a class action, representing individuals whose real images as minors were allegedly altered into explicit content using Grok. They claim that xAI failed to implement basic safeguards commonly used by other leading AI developers to prevent image-generation models from producing sexually explicit material involving real people, particularly minors.

The case — Jane Doe 1, Jane Doe 2 (a minor), and Jane Doe 3 (a minor) versus x.AI Corp. and x.AI LLC — has been filed in the U.S. District Court for the Northern District of California.

According to the complaint, other advanced image-generation systems use various technical measures to prevent the creation of exploitative content from ordinary photographs. The lawsuit alleges that xAI did not adopt such protections.

The filing further argues that once a system is capable of generating nude or erotic content from real images, it becomes extremely difficult to prevent it from producing similar material involving minors. The lawsuit also references Musk’s public promotion of Grok’s ability to generate sexualized imagery and portray real individuals in revealing ways.

One of the plaintiffs, identified as Jane Doe 1, alleges that Grok manipulated images from her high school homecoming and yearbook to depict her unclothed. She was reportedly alerted by an anonymous individual on Instagram, who informed her that the altered images were being circulated online and shared a link to a Discord server where sexualized images of her and other minors from her school were posted.

Another plaintiff, Jane Doe 2, was notified by criminal investigators that altered and sexualized images of her had been created through a third-party mobile application that uses Grok models. Similarly, Jane Doe 3 was informed by law enforcement after investigators found a manipulated explicit image of her on a device belonging to a suspect they had apprehended.

The plaintiffs’ legal team argues that xAI should still be held accountable even when the technology is used through third-party applications, since those systems rely on xAI’s underlying models, infrastructure, and code.

All three individuals — two of whom are still minors — state that they have experienced severe emotional distress due to the circulation of the altered images and the potential impact on their reputations and social lives. They are seeking civil penalties under multiple laws aimed at protecting minors from exploitation and addressing corporate negligence.

Shivangi Yadav reports on startups, technology policy, and other significant technology-focused developments in India for TechAmerica.Ai. She previously worked as a research intern at ORF.