New Mexico is seeking an injunction to permanently block Snap from practices allegedly harming kids. That includes a halt on advertising Snapchat as “more private” or “less permanent” due to the alleged “core design problem” and “inherent danger” of Snap’s disappearing messages. The state’s complaint noted that the FBI has said that “Snapchat is the preferred app by criminals because its design features provide a false sense of security to the victim that their photos will disappear and not be screenshotted.”

  • some_guy@lemmy.sdf.org (OP) · 2 months ago

    The article didn’t say the cops generated AI CSAM. It said they created a profile pic, which was shown in the article.

    • Grandwolf319@sh.itjust.works · 2 months ago

      So if someone generates a minor’s image and it’s not nude, is that not CSAM?

      I’m genuinely asking; I always thought it was about sexualizing children, not about whether they are nude.

      • some_guy@lemmy.sdf.org (OP) · 2 months ago

        I don’t think so. People keep throwing that acronym around, but I suspect they didn’t read the article and see that it was one normal picture of a high-school-aged girl.

        • Grandwolf319@sh.itjust.works · 2 months ago (edited)

          I actually read it and then made a comment because, even though it’s a profile picture, the intent is for a viewer to sexualize the picture, thereby sexualizing a minor.

          I do get that it’s a normal picture, but it made me think about this slippery slope and where the line is.