A new report warns that the proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos.
Oh, so you mean the photos in the training dataset that violate medical privacy?
If they're published in a textbook then they're not private.
will not have given informed consent for those photos to be inserted into an AI model for child porn
Again, an AI model doesn't have to be created specifically for the purpose of child porn in order for it to be able to generate child porn. Most of these AI image models are very general purpose, they can create images of all kinds of things.
We're going in circles here and you're just getting angrier in your responses, I don't think this is headed anywhere useful at this point.
Nope, this is headed nowhere useful, because we have a fundamentally different sense of ethics.