• massive_bereavement@fedia.io · 23 hours ago

    I am not saying this shouldn’t be illegal, as distributing it hurts the victim, but CSAM is proof that a child has been abused. I think those are two very different things.

      • NoneOfUrBusiness@fedia.io · 20 hours ago

        Definitely fucking sick, but “fucking sick” is no way to run a society. The problem with child porn is that it can only be made by sexually abusing children, so without that factor you have to ask: does AI-generated child porn embolden or mollify pedophiles? Answering that question takes rigorous scientific research, not knee-jerk reactions.

        • jaredwhite@piefed.social · 15 hours ago

          AI-generated anything relies on training data drawn from the real thing, so there’s no way to use a generator to “ethically” produce images of something unethical: the output is built on the unethical imagery. There’s no pathway out of the original abuse.

          • HasturInYellow@lemmy.world · 14 hours ago
            I would disagree, in the same vein as the Nazi and Imperial Japanese scientific experiments that were pored over and used to further our understanding of human anatomy, the limits of the body, and a host of other things. The original experiments were horrific in the extreme, but they happened, and simply destroying that data would help no one.

            Those children have already been abused. That material already exists. Would it not make sense to use it to build a program that satisfies the desires of people who would do that sort of thing, so that no more children need to be abused to produce it? It wouldn’t end the abuse outright, but it seems like it would help.