Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

  • wewbull@feddit.uk · 20 days ago

    Honestly I think we need to understand that this is no different to cutting out a photo of someone’s head and sticking it on a page from a porn magazine. It’s not real. It’s just less janky.

    I would categorise it as sexual harassment, not abuse. Still serious, but a different level.

    • lath@lemmy.world · 20 days ago

      “Schools” generally means underage individuals are involved, which makes any content depicting them CSAM. So in effect, the “AI” companies are generating a ton of CSAM and nobody is doing anything about it.

        • atomicorange@lemmy.world · 20 days ago

          If someone put a camera in the girls’ locker room and distributed photos from that, would you consider it CSAM? No contact would have taken place, and the kids would be unaware they were being photographed, so is it still abuse?

          If so, how is the psychological effect of a convincing deepfake any different?

          • BombOmOm@lemmy.world · 20 days ago

            Taking secret nude pictures of someone is quite a bit different than…not taking nude pictures of them.

            It’s not CSAM to put a picture of someone’s face on an adult model and show it to your friend. It’s certainly sexual harassment, but it isn’t CSAM.

            • atomicorange@lemmy.world · 20 days ago

              How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?

              • BombOmOm@lemmy.world · 20 days ago

                It’s absolutely sexual harassment.

                But, to your question: you can’t just say something has underage nudity when the nudity is of an adult model. It’s not CSAM.

        • Lka1988@lemmy.dbzer0.com · 20 days ago

          Except, you know, the harassment and abuse of said deepfaked individual, which is sexual in nature. Sexual harassment and abuse of a child, using materials generated from the child’s identity.

          Maybe we could have a name for it. Something like Child-based sexual harassment and abuse material… CSHAM, or maybe just CSAM, you know, to remember it more easily.

        • lath@lemmy.world · 20 days ago

          There’s a thing that was happening in the past; I’m not sure it still is, given the lack of news about it. It was something called “glamour modeling,” I think, or an extension of it.

          Basically, official/legal photography studios took pictures of child models in swimsuits and revealing clothing, at times in suggestive positions, and sold them to interested parties.

          Nothing untoward directly happened to the children. They weren’t physically abused. They were treated as regular fashion models. And yet, it’s still CSAM. Why? Because of the intention behind making those pictures.

          The intention to exploit.

    • LadyAutumn@lemmy.blahaj.zone · 20 days ago

      Yes, finding out that your peers have been sharing deepfake pornography of you is absolutely fine and a normal thing for young girls to go through in school. No girls have ever killed themselves because of this exact sort of thing, surely. This definitely will not add in any way to how women and girls are made to feel utterly, disgustingly dehumanized by every man or boy in their lives, with groups of men and boys reducing them and their bodies to vivid sexual fantasies that they can quickly turn into photorealistic images.

      If the person in the image is underage, then it should be classified as child pornography. If the woman whose photo is being used hasn’t consented to this, then it should be classified as sexual exploitation.

      Women and girls have faced degrees of this kind of sexual exploitation by men and boys since the latter half of the 20th century. But this is a severe escalation in that behavior. It should be illegal to do this and it should be prosecuted when and where it is found to occur.