A new report warns that the proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos.
Even where conversion camps are illegal, people still attempt them, even if only as DIY abuse (which is also illegal and still common). Simply not having the tech may be preferable to having the tech/knowledge, making it illegal to use, and then watching people use it illegally anyway. If child abuse in general were something that could simply never have been invented, not inventing it would be far superior to inventing it and then making it illegal.
We have the tech right now to shoot people dead. It's just tech, create laws about what's acceptable to do with it and what's not.
oh yeah cause that's working out oh so perfectly
This thread is about how to regulate AI-generated child porn, that's going to require creating laws too.
What's your proposed alternative to using laws to regulate tech?
I was just saying that this tech that shoots people dead isn't regulated all that well in certain parts of the world.
Well, mainly just one. There are lots of other places in the world that do okay at it.