2/17/2026 at 5:27:35 PM
I won't say whether Grok has a real problem or not, but the CCDH, which did the study, looks like a "scam". I don't know who funds them, but they clearly have an agenda and will "manufacture" data however they can to support it.

The title of the study and the article say that Grok "generated" these images, but in fact:
> The CCDH then extrapolated
Basically they invent numbers.
They took a sample of 20k generated images, and it is assumed (though I don't know if the source is reliable) that Grok generated 4.6 million images over the same period. So the sample is about 0.4%.
If you look at the CCDH's webpage, the study is a joke. First:
- Images were defined as sexualized if they contain [...] a person in underwear, swimwear or similarly revealing clothing.
- Sexualized Images (Adults & Children): 12,995 found
- Sexualized Images (Likely Children): 101 found
First they invent their own definition, then conveniently mix in possible "adult" pictures to produce scary numbers.
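To put their extrapolation in perspective, here is the arithmetic as a quick Python sketch (it uses only the numbers quoted above, which I can't vouch for):

```python
# Back-of-the-envelope check of the CCDH figures as quoted in this thread.
sample_size = 20_000          # images the CCDH says it reviewed
estimated_total = 4_600_000   # images Grok reportedly generated in the same window

sexualized_any = 12_995       # "Sexualized Images (Adults & Children)"
sexualized_children = 101     # "Sexualized Images (Likely Children)"

sample_fraction = sample_size / estimated_total
print(f"Sample fraction: {sample_fraction:.2%}")                  # ~0.43%

rate_any = sexualized_any / sample_size
rate_children = sexualized_children / sample_size
print(f"'Sexualized' rate in sample: {rate_any:.2%}")             # ~65%
print(f"'Likely children' rate in sample: {rate_children:.3%}")   # ~0.5%

# The headline number is this sample rate scaled up to the estimated total:
print(f"Extrapolated count: {rate_children * estimated_total:,.0f}")  # ~23,230
```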
by greatgib
2/17/2026 at 5:52:11 PM
What do you propose they do? Manually review every single image generated?

Even if it's "only" 1 million, that would be a massive task. Random sampling is the best we can do.
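And a 20k random sample is statistically solid for a rate this size. A minimal sketch of the uncertainty it carries, using a Wilson score interval on the 101-in-20,000 count quoted upthread (illustrative only, not the CCDH's actual method):

```python
# 95% Wilson score interval for a proportion estimated from a random sample.
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Confidence interval for a binomial proportion (z=1.96 -> 95%)."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - margin, centre + margin

low, high = wilson_interval(101, 20_000)
print(f"95% CI for the underlying rate: {low:.3%} to {high:.3%}")   # ~0.42% to ~0.61%
print(f"Scaled to 4.6M images: {low * 4_600_000:,.0f} to {high * 4_600_000:,.0f}")  # ~19k to ~28k
```

Even a 0.4% sample pins the rate down fairly tightly; the sampling itself isn't the weak point.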
by MBCook
2/17/2026 at 10:00:03 PM
Not the person you're asking, but I would require a better analyzer. It must be able to recognize children in sexual poses, children with exposed genitalia, children performing oral copulation, or children being penetrated. If an AI can be told to create a thing, it should be able to identify that same thing. If Grok cannot identify what it was told to create, that is potentially a bigger issue, as someone may have nerfed that ability on purpose.

There are psychological reference books on identifying signs of prepubescence from facial and genital features that one can consult if one works in that field. Some of the former Facebook mods with PTSD know what I am referring to.
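As a rough sketch of what an automated pre-screen in front of the manual queue might look like (the model name and labels here are assumptions for illustration; off-the-shelf NSFW classifiers only catch generic sexual content, not the specific indicators above):

```python
# Illustrative pre-screen: route anything an NSFW classifier flags into a
# manual-review queue. The model and its "nsfw"/"normal" labels are
# assumptions; swap in whatever classifier is actually trained for the task.
from transformers import pipeline

classifier = pipeline("image-classification",
                      model="Falconsai/nsfw_image_detection")

def needs_manual_review(image_path: str, threshold: float = 0.5) -> bool:
    """True if the classifier scores the image as NSFW above `threshold`."""
    return any(
        r["label"].lower() == "nsfw" and r["score"] >= threshold
        for r in classifier(image_path)
    )
```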
Leave everything else to manual flagging, assuming Grok has a flag or report button that is easy to find. If not, send links to these people [1] if in the USA.
[1] - https://www.ic3.gov/
by Bender
2/17/2026 at 5:34:36 PM
Right... So how much CSAM is an acceptable amount of CSAM in your opinion then?
by dtj1123
2/17/2026 at 8:40:49 PM
A couple of things come to mind:
1) Zero is basically never the best error rate; effort isn't infinite, and spending too much of it on one defect means spending less on other issues.
2) Look at what he's saying. This is a classic pattern for constructing a fake proof of evil:
a) Point to an evil. For example, CSAM.
b) Expand the definition of that evil in ways that are often not even evil. Here, "scantily clad" is included in the definition of "sexual". Note that swimsuits qualify.
c) Point to examples of the evil in your expanded pool.
d) Claim this is evidence of evil under the original definition. Note that nothing about their claims precludes their "CSAM" being nothing more than ordinary beach or pool scenes. Their claim is consistent with the null hypothesis, and when the null is a possible answer, it should be assumed.
by LorenPechtel
2/17/2026 at 11:14:35 PM
To your point 2b, I would posit that it is also evil to sexualize adults without their consent.
by LocalH
2/18/2026 at 6:26:21 AM
I asked how much lower the error rate would need to be in order to be acceptable, and you replied with a rebuttal to the message of the posted article.

I agree that a zero error rate is generally not possible, although I think a company like Xitter can manage better than 101 in 20k.
by dtj1123
2/17/2026 at 5:59:51 PM
Who was abused here?
by zb3
2/17/2026 at 9:00:41 PM
When you post on a public forum defending child pornography, it's maybe a good time to take a step back and evaluate your life.
by riotnrrd
2/19/2026 at 11:05:31 AM
What a vile accusation for you to make based on zero substance. Perhaps it is you who needs to take a step back and evaluate your life.
by JasonADrury
2/17/2026 at 6:44:04 PM
The environment.
by autoexec
2/17/2026 at 6:38:30 PM
The people used as template faces and bodies.
by chrystalkey
2/17/2026 at 9:01:47 PM
The future victims, when the imagery stops being enough.
by kakali
2/17/2026 at 10:29:04 PM
Has this been studied? I'm not following the topic, but without any evidence one could also say that availability of fake imagery might decrease demand for real imagery and therefore decrease the amount of abuse. But I'm not implying anything, just asking.
by zb3