
His first instinct was to contact other victims he knew, and then “eventually, local law enforcement was contacted and a criminal case was filed,” the complaint said.
Investigating evidence from the Discord, police quickly determined that the perpetrator accessed the first victim’s Instagram account because he had a “close and friendly relationship” with her. Searching his phone, police found a third-party app that licensed or otherwise purchased access to Grok, which they concluded the perpetrator used to alter images of girls.
From there, the bad actor uploaded the images to a file-sharing platform called Mega and used them as a “bartering tool in Telegram group chats with hundreds of other users,” trading the AI CSAM files “for other minor sexual content.”
The lawsuit alleges that the victims suffered severe emotional and mental distress and that their damages were extensive. It remains unclear whether the CSAM that Grok created of the victims who knew the perpetrator was shared with classmates or otherwise distributed at their school, the lawsuit said. One girl fears the scandal will affect her college admissions, while another is too scared to attend her prom.
However, more frightening than any classmates' exposure to the AI CSAM is the fear that the girls could now be pursued by strangers because of Grok's outputs. As the lawsuit explains, "Also, victims' real names and the name of their school were added to their online files, meaning they could be identified by other online predators, creating a significant risk of stalking."
xAI allegedly hosts the Grok CSAM
While Grok Imagine's paid subscribers were previously reported to have created more graphic imagery than the Grok outputs that sparked outrage on X, the lawsuit alleges that xAI took other steps to hide how it was profiting from public content that harmed real people.