Child Pornography Found in AI Training Material: Stanford Report
Stanford researchers have found child pornography images in material used to train popular AI image generators, with the illegal images being identified since April. The Stanford Internet Observatory discovered over 1,000 fake child sexual abuse images in a...
This article is saying the CP is in the original training material, which then allowed the model to produce many more fake images of CP. And they're talking about creating filters to prevent CP from ever making it into the training material in the future.
At least, that's what the article is claiming.
Until people start paying a super-painful price for their paedo depravity, and as long as their pleasure center supersedes their fear center (due to lack of consequences), it's never going to end.
The punitive measures need to be more severe and fearsome than those for any other kind of offense on the planet, and being caught and brought to justice for paedo-crimes needs to be the most paramount fear man can possibly comprehend.
They MUST be boxed in and forced into check.......and they don't generally fear the courts.
Credible fear of swift and violent retribution / retaliation / vigilantism by the populace is the only way.
Hang'em, burn'em.......shoot their ashes off-planet, toward the sun. GTFOH.
In this vein, I think another issue is that they seem to just stop at the end-user level without tracing the CP up the chain to find the original molester / photographer and those involved with distributing it onto the internet (if it came from a foreign country, they could still trace it and involve the local authorities).
If they don't trace it back, they're basically pissing into the wind.