A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students, also teen girls, who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, surpassing the total from every other year combined.

  • Rodeo@lemmy.ca · 1 year ago

    What about making depictions of other crimes? Should depictions of theft be illegal? Depictions of murder?

    Why should depictions of one crime be made illegal, but depictions of other heinous crimes remain legal?

    • Jimmyeatsausage@lemmy.world · 1 year ago

      Because a picture of someone robbing my house doesn’t revictimize me. Even if it’s simulated, the victim is impacted again every time they run into some rando who recognizes them, or every time a potential employer runs a background/social media check.

      • Fal@yiffit.net · 1 year ago

        A picture of a cartoon child having sex doesn’t victimize you either, the same way a drawing of a robbery doesn’t victimize you.