A new report warns that the proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos.
The issue here is that it enables those who produce actual CP to hide their work more easily in the flood of generated content.
Animesque art is one thing; photorealistic is another. Neither harms an underage person simply by existing, but photorealistic imagery lets actual abusers hide themselves easily. So IMO, photorealistic 'art' of this sort needs to be criminalized so that it cannot be used as a mask for actual CP.
If their work gets flooded out, then wouldn't that decrease the incentive to produce it?