The Grok Generation: The Consent Crisis No One Is Stopping

As AI image tools normalize the creation of nonconsensual sexualized images, young people are growing up in a digital world where consent can be erased with a single prompt.

Elon Musk shared a Grok-generated image of himself in a bikini on X, joking about the tool as it faces criticism for enabling the creation of nonconsensual sexualized images of real people. (Leon Neal / Getty Images)

Grok, the AI chatbot used on Elon Musk’s platform X, is under fire for generating millions of nude or sexualized images of real people, including children. In one estimate, Grok produced one nonconsensual sexual image per minute over a 24-hour period. Prompts such as “put her in a transparent bikini” produced altered images that were then circulated publicly, some accumulating thousands of likes. The targets are real women and underage girls whose images were manipulated without their knowledge or permission.

Musk responded by making a joke, requesting a Grok-generated image of himself in a bikini and reacting with laughing emojis.


When the platform’s most powerful figure, who is also one of the country’s most powerful men, treats the abuse as a punchline, he signals what he considers genuinely harmful and what he considers mere humor. He also tacitly grants young men on the platform permission to keep making these images.

Much of the public conversation about young people and AI has focused on cheating in school or declining literacy. Far less attention has been paid to what it means when a middle school boy can type a sentence and produce a sexualized image of a female classmate in seconds as a joke or for attention—or to pretend he received it from her for status.

Across the country, boys as young as 10 and 11 have reportedly created and shared AI-generated nude images of girls in their schools. In one recent case, a 13-year-old girl was expelled after she physically confronted a boy who had generated and distributed explicit AI images of her. She had sought help from a guidance counselor and even law enforcement before the altercation. No meaningful intervention came. She was the one removed.

Deepfake tools do not simply generate images. They generate norms. 

Social media platforms define what is considered funny, acceptable, normal and cool. Social status is measured in likes and views. Now that those same platforms can generate explicit images of young women at the touch of a button, the conversation around consent for young people has to change.

AI-generated sexual abuse material is expanding at alarming speed worldwide, including the creation and alteration of child sexual abuse imagery.

The exterior of the building housing the offices of X, the social media platform owned by Musk, after a raid by police on Feb. 3, 2026, in Paris, France. The Paris prosecutor said that police focusing on cyber-crime searched the company’s offices in relation to an investigation into the content on X and its AI chatbot, Grok. (Pierre Suu / Getty Images)

For girls growing up in this online environment, the message is unmistakable: Your image is not protected as yours. Your body can be altered, distributed and consumed for entertainment. Its violation can be dismissed as a joke. 

When I was in middle and high school, I remember there being serious conversations about sending nude photos or forwarding someone else’s images without consent. But what happens when a generation grows up knowing that their classmates do not need an actual photo at all—that they can fabricate one in seconds? 

When a boy can manufacture an explicit image of a girl using nothing more than a sentence prompt, and face little to no consequence, the very definition of consent shifts. Teenagers today are not just navigating the risks of sharing images. They are navigating a world in which images of them can be created and weaponized without their participation or knowledge at all. That fundamentally changes the conversations we need to be having about privacy, power and bodily autonomy.

We already live in a world in which at least one in three women experiences physical or sexual violence in her lifetime. Technology did not invent misogyny or harassment, but artificial intelligence has dramatically increased the speed and scale at which abuse can occur, while making that abuse feel less harmful and less real to those committing it.

A poster picturing Musk with the tagline, “Who the fuck would want to use social media with a built-in child abuse tool?” unofficially installed by activist group Everyone Hates Elon, at a bus stop in London on Jan. 13, 2026. The U.K. communications regulator Ofcom launched a formal investigation into Musk’s social media platform X, regarding its AI chatbot, Grok. The probe centers on reports that Grok has been used to generate non-consensual sexual deepfakes, including “undressed” images of women and sexualized images of children. (Leon Neal / Getty Images)

A review of 20 leading AI image-generation platforms found that only seven required subjects to be over 18 in their terms of service, and even fewer enforced meaningful age verification. 

The development of AI tools is not just shaping the way a generation learns; it is shaping the way a generation is socialized. The conversation about consent has already changed, whether we acknowledge it or not. Teachers, parents, lawmakers and platform leaders are behind. The question is not whether this will shape the next generation’s understanding of power and intimacy—but what we will step in to do about it.

About

Haley Lickstein is a political and lifestyle creator, strategist, activist and public speaker. On a mission to empower young people—especially women—to shape the future, champion reproductive justice and drive meaningful change, Lickstein’s content focuses on bridging political conversations with cultural moments and covers the real impact current events have on young people. Her platforms reach a highly engaged Gen Z and millennial audience, with a combined following of 150k+. She holds a bachelor’s degree in political science from American University and lives in Washington, D.C.