Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.
How is it different for the victim? What if they can’t tell if it’s a deepfake or a real photo of them?
It’s absolutely sexual harassment.
But, to your question: you can’t say an image contains underage nudity when the nude body belongs to an adult model. It’s not CSAM (child sexual abuse material).