A new report warns that the proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos.
Points about real material hiding in a sea of fake material aside, because these AIs would likely have been trained on images of real children, and potentially on real abuse material, each new generated image could be considered a re-exploitation of that child.
Of course, I don't think that's true in a legal sense, but it definitely is in an emotional and moral sense. Look at the damage deepfakes have done to the mental health of so many celebrities and other victims, then imagine a minor trying to move past one of the most traumatic things that could have happened to them.
I really don't think it would need to be trained on that specific data to be able to create it. If it can figure out a blueberry dog, "child naked" seems pretty trivial by comparison.