• Bobby Turkalino@lemmy.yachts · 10 months ago

    It would be just as practical for good actors to simply state that an image is generated in its caption, citation, or some other preexisting method. Good actors will retransmit this information, while bad actors will omit it, just as they’d remove a watermark. At least this way, no special software is required for the average person to check whether an image is generated.
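
    As a minimal sketch of what that could look like (assuming Pillow, and a made-up “ai-generated” metadata key), the declaration could live in a standard PNG text chunk:

    ```python
    # Sketch only: declare provenance in a standard PNG text chunk.
    # The "ai-generated" key and "ExampleModel" name are made up.
    from PIL import Image, PngImagePlugin

    img = Image.open("generated.png")
    meta = PngImagePlugin.PngInfo()
    meta.add_text("ai-generated", "true")
    meta.add_text("generator", "ExampleModel v1")
    img.save("generated_tagged.png", pnginfo=meta)

    # Reading it back requires nothing special:
    print(Image.open("generated_tagged.png").text)
    ```

    Of course, re-saving the file drops the chunk just as easily as cropping out a watermark, which is the point: it only works as long as good actors retransmit it.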

    Bing Image Creator already implements watermarks, but it is trivially easy for me to download an image I generated, remove the watermark, and proceed with my ruining of democracy /s

    • Ook the Librarian@lemmy.world · 10 months ago (edited)

      I wasn’t thinking of a watermark in the sense of anyone’s visible signature. More of a cryptographic signature that most users couldn’t detect, and that couldn’t be removed with visual edits. Something most people don’t even know is there, like the dots printers hide in every page for anti-counterfeiting.
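
      As a rough sketch of the signing half (assuming the cryptography package; signing the raw file bytes stands in for a real pixel-level embedding, and every name here is illustrative):

      ```python
      # Sketch: the generator signs its output with a private key it keeps
      # secret and publishes the matching public key. This signs the file
      # bytes; a real scheme would embed the signature invisibly in the pixels.
      from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

      generator_key = Ed25519PrivateKey.generate()  # held by the AI service
      public_key = generator_key.public_key()       # published for verifiers

      image_bytes = open("generated.png", "rb").read()
      signature = generator_key.sign(image_bytes)   # 64-byte Ed25519 signature
      # Ship `signature` with the image, the way a printer hides its dots.
      ```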

      I don’t want to use the word blockchain, but some kind of scheme where, if you want to pass off a fake video created by someone else, you have a serious math problem on your hands to strip out the AI fingerprints. That way any viral video of unknown origin could easily be determined to be AI-generated without any “look at the hands” arguments.
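
      Continuing the sketch, verification is the easy side; forging or stripping a valid signature without the private key is the serious math problem:

      ```python
      # Sketch continued: anyone can check a file of unknown origin against
      # the generator's published public key.
      from cryptography.exceptions import InvalidSignature

      def bears_ai_fingerprint(file_bytes: bytes, signature: bytes) -> bool:
          try:
              public_key.verify(signature, file_bytes)
              return True   # valid: this generator produced the file
          except InvalidSignature:
              return False  # no valid signature -- proves nothing either way

      print(bears_ai_fingerprint(image_bytes, signature))  # True
      ```

      Note the asymmetry, though: a missing signature proves nothing, so the check only labels what honest generators signed.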

      I’m just saying, a solution that only binds the good guys isn’t always worthless. I don’t actually think what I’m proposing is very feasible (especially as written). Rules for good guys only aren’t always about taking away freedom; sometimes they’re about normalizing discourse. My argument isn’t particularly strong here, though, since this is a CA law rather than a standard. I’d like the issue at least discussed at a joint AI consortium.