New Zealand MP Laura McClure bravely exposed the dangers of deepfake technology by holding up a manipulated, naked image of herself in Parliament, highlighting the need for legislative change to protect victims of deepfake abuse.
Definitely fucking sick, but “fucking sick” is no way to run a society. The problem with child porn is that it can only be made by sexually abusing children, so without that factor you have to ask: Does AI-generated child porn embolden or mollify pedophiles? Answering that question requires rigorous scientific research, not knee-jerk reactions.
AI generated anything relies on training data based on the real thing, so there’s no way to use a generator to “ethically” produce images of something unethical because it’s based on the unethical imagery. There’s no pathway out of the original abuse.
I would disagree. In the same vein, Nazi and imperial Japanese scientific experiments were pored over and used to further our understanding of human anatomy, the limits of the body, and a host of other things. The original experiments were horrific in the extreme, but they happened, and simply destroying that data would help no one.
Those children have already been abused. That material already exists. Would it not make sense to use it to make a program that fulfills the desires of those who would do that sort of thing so that others do not need to be abused to produce it? It wouldn’t end it outright, but it seems like it would help.
There’s also AI-generated CSAM now, which can be produced via prompt loopholes. Still fucking sick if you ask me…