Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

  • atomicorange@lemmy.world · 26 days ago

    How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?

    • BombOmOm@lemmy.world · edited · 26 days ago

      It’s absolutely sexual harassment.

      But to your question: you can't claim an image contains underage nudity when the nude body belongs to an adult model. It's not CSAM.