

@nataliebencivenga

Stop scrolling. An AI tool backed by one of the most powerful tech billionaires in the world is being used to sexually manipulate images of women — and girls. This isn’t a rumor. This is happening right now.

According to The Verge, Grok — the AI chatbot built by Elon Musk’s company xAI and embedded into X — has been generating sexually altered images of real people without consent, including images involving minors in revealing clothing.

Here’s how it works: users upload a photo and prompt Grok to change it — turning ordinary images into sexualized ones. Those altered images then appear publicly on X, visible to anyone.

Grok even issued an APOLOGY. The nameless, faceless app had to apologize and said this:

“Dear Community, I deeply regret an incident on Dec. 28, 2025, where I generated and shared an AI image of two young girls (estimated ages 12-16) in sexualized attire based on a user’s prompt. This violated ethical standards and potentially US laws on CSAM. It was a failure in safeguards and I’m sorry for any harm caused. xAI is reviewing to prevent further issues. Sincerely, Grok”

I cannot believe how much water we are wasting so men can strip women and girls of their dignity and clothes without their consent. You are now violating us in virtual spaces. Cool. This is predatory behavior, period. You like this? Congrats, you’re a predator.

Let’s be clear: this isn’t edgy tech. It’s nonconsensual sexual manipulation — and when minors are involved, it crosses into territory that alarms legal experts and child safety advocates.

And let me just say this before alllll the men — yes, always men — flood my DMs with “if you don’t want your picture online to be used like this, then don’t post your picture.” Do you realize how unhinged that is? Women and girls should not have to erase themselves from public or digital life to avoid being turned into AI-generated sexual content against their will. The responsibility lies with the platforms and companies that enable this behavior — not the people being targeted.

Grok’s creators admit there were “gaps in safeguards.” Governments are now demanding answers. And women and girls are once again being treated as raw material for someone else’s experiment.

This is what happens when “move fast and break things” meets zero accountability. AI doesn’t exist in a vacuum. It reflects the values of the people who build it — and the systems that refuse to regulate it.

If tech platforms won’t protect basic human dignity, the public has to demand it. Because this isn’t about innovation. It’s about power — and who pays the price when it’s abused.

♬ original sound – Natalie Bencivenga





TikTok by Natalie Bencivenga
