Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

  • lath@lemmy.world

    Schools generally mean underage individuals are involved, which makes any content using them CSAM. So in effect, the “AI” companies are generating a ton of CSAM and nobody is doing anything about it.

      • atomicorange@lemmy.world

        If someone put a camera in the girls’ locker room and distributed photos from it, would you consider that CSAM? No contact would have taken place, and the kids would be unaware they were being photographed; is it still abuse?

        If so, how is the psychological effect of a convincing deepfake any different?

        • BombOmOm@lemmy.world

          Taking secret nude pictures of someone is quite a bit different than…not taking nude pictures of them.

          It’s not CSAM to put a picture of someone’s face on an adult model and show it to your friend. It’s certainly sexual harassment, but it isn’t CSAM.

          • atomicorange@lemmy.world

            How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?

            • BombOmOm@lemmy.world

              It’s absolutely sexual harassment.

              But, to your question: you can’t just say something depicts underage nudity when the nudity is of an adult model. It’s not CSAM.

      • Lka1988@lemmy.dbzer0.com

        Except, you know, the harassment and abuse of said deepfaked individual. Which is sexual in nature. Sexual harassment and abuse of a child using materials generated based on the child’s identity.

        Maybe we could have a name for it. Something like Child-based sexual harassment and abuse material… CSHAM, or maybe just CSAM, you know, to remember it more easily.

      • lath@lemmy.world

        There’s something that was happening in the past; I’m not sure it still is, given the lack of news about it. It was called “glamour modeling,” I think, or an extension of it.

        Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive positions, and sold them to interested parties.

        Nothing untoward directly happened to the children. They weren’t physically abused. They were treated as regular fashion models. And yet it’s still CSAM. Why? Because of the intention behind making those pictures.

        The intention to exploit.