A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students – also teen girls – who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

  • leaky_shower_thought@feddit.nl

    Reading this, I don’t really know what is supposed to be protected here, or what would even qualify for protection in the first place.

    The closest reasonable candidate is the girl’s “identity,” so it could be fraud. But the images aren’t being used to fool anyone; more likely, those receiving the pics already knew they were AI-generated.

    So it might be defamation?

    The image-generation tech is already easily accessible, so the girl’s photo being easily accessible might be the weakest link?

      • DarkGamer@kbin.social

        Thanks for the valuable contribution to this discussion! It does appear this is a question of identity and personality rights, regarding how one wants to be portrayed.

        Reading that article though, it seems like those rights only apply to commercial purposes. If someone is making deepfakes for their own non-commercial private use, it doesn’t appear personality rights apply.

      • Fal@yiffit.net

        > Pretty sure it’s illegal to create sexual images of children, photos or not.

        Maybe in your dystopian countries where drawings are illegal. Absolutely absurd that you’re promoting that as a good thing.

          • Fal@yiffit.net

            Yes, but this thread is about drawings in general. Deepfaking someone into porn and spreading it around should absolutely be illegal. But it’s not “child porn”; it’s some type of harassment or defamation or something.