• Empricorn@feddit.nl · 4 months ago

    This is tough. If it were just a sicko who generated the images for himself locally… that is the definition of a victimless crime, no? And it might actually dissuade him from seeking out real CSAM…

    BUT, IIRC he was actually distributing the material, and even contacted minors, so… yeah, he definitely needed to be arrested.

    But I'm still torn on the first scenario…

      • FaceDeer@fedia.io · 4 months ago
        Image-generating AI is capable of generating images unlike anything that was in its training set.

          • GBU_28@lemm.ee · 4 months ago

            If it has been trained on images of construction equipment and houses, it can generate images of houses that look like construction equipment. Swap out the vocabulary as needed (see the sketch below).
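            For instance, with an off-the-shelf text-to-image pipeline you can ask for exactly that combination even if no such photo exists anywhere. A minimal sketch, assuming the Hugging Face diffusers library and a public Stable Diffusion checkpoint (the model name and prompt are just illustrative):

            ```python
            # Compose two concepts the model learned separately; the combined
            # image need not resemble anything in the training set.
            import torch
            from diffusers import StableDiffusionPipeline

            pipe = StableDiffusionPipeline.from_pretrained(
                "runwayml/stable-diffusion-v1-5",  # assumed checkpoint; swap as needed
                torch_dtype=torch.float16,
            )
            pipe = pipe.to("cuda")

            # "House" and "excavator" each appear in the training data, but the
            # model composes them into a novel image from the prompt alone.
            image = pipe("a house shaped like an excavator, photo").images[0]
            image.save("house_excavator.png")
            ```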

            • xmunk@sh.itjust.works · 4 months ago

              Cool, how would it know what a naked young person looks like? Naked adults look significantly different.

                • xmunk@sh.itjust.works · 4 months ago

                  Is a kid just a 60% reduction by volume of an adult? And these are generative algorithms… nobody really understands how they perceive the world or relate words to concepts.

                  • FaceDeer@fedia.io · 4 months ago

                    It understands "young" and "old". That means it knows a kid is not just a 60% reduction by volume of an adult.

                    We know it understands these concepts because of the very thing this whole kerfuffle is about: it's able to generate images of things that weren't explicitly in its training set.