• Flying Squid@lemmy.worldOP
    8 months ago

    Stalin famously ordered people he had killed erased from photos.

    Imagine what current and future autocratic regimes will be able to achieve when they want to rewrite their histories.

      • Carrolade@lemmy.world
        8 months ago

        Probably just because some people really like Stalin, and have become convinced his accounts are the truthful ones and everyone else lies about him.

        • Buffalox@lemmy.world
          8 months ago

          That’s a scary thought! But all kinds of crazy exist, and I mean people would have to be literally crazy to want to live under a regime like the one Stalin made.

      • fuckwit_mcbumcrumble@lemmy.world
        8 months ago

        “Photoshopping” something bad has existed for a long time at this point. AI-generated images don’t really change anything, other than the entire photo being fake instead of just a small section.

        • TheFriar@lemm.ee
          8 months ago

          I’d disagree. It now takes zero know-how to convincingly create a false image, and it takes zero work. So where one photo would once take one person a decent amount of time to pull off convincingly, now one person can create 100 images or more in that time, each one a potential time bomb that will go off when it starts getting passed around as evidence of something. And there are uncountable numbers of bad actors on the internet trying to cause a ruckus. This just increased their chances of succeeding at least 100-fold, and opened access to many, many others who might do it accidentally, as a joke, or who always wanted to make waves but didn’t have the Photoshop skills necessary.

          • Aniki 🌱🌿@lemm.ee
            8 months ago

            Yeah, some of these would be like 100-layer creations if someone were doing them by hand in Photoshop; it would take a professional or near-professional level of skill.

        • uienia@lemmy.world
          8 months ago

          The ease and speed with which AI photos can be created, at a quality most photoshoppers could only dream of, does very much change everything.

      • StarkWolf@kbin.social
        8 months ago

        With AI video also getting increasingly impressive and believable, I worry that we will soon live in a world where you could have actual video evidence of a murder, and have that evidence dismissed or cast into doubt because of how easy, or how supposedly easy, it would be to fake.

        • Buffalox@lemmy.world
          8 months ago

          Absolutely, only video from trusted sources can be used. But isn’t that already the case?

          • StarkWolf@kbin.social
            8 months ago

            I think they are both equally scary. I’m imagining cases where photo and video evidence have played major roles in proving police abuses of power, for example. We will certainly see an onslaught of people faking evidence of all sorts of things to push a political narrative, but equally, in any politicized narrative, any politically inconvenient photos or videos of real things that really happened might be swept under the rug as “someone probably just faked that for political gain.”

            Sure, you could have an investigation to look into the authenticity of the evidence, or look at other forensic evidence, but probably only if you can afford such an investigation, or if enough public attention gets drawn to it. I fear we are reaching a scary time where, in a sense, reality will be whatever people want it to be, and we will increasingly be unable to trust anything we see as real with absolute certainty. We have been headed down this road for a very long time, but this will just make it much worse.

    • magic_lobster_party@kbin.run
      8 months ago

      Digital image editing has been really good for this kind of stuff for quite a while. Now it’s even easier with content-aware fill.

      Unless you’re the PR manager for the British Royal family. Then you somehow lack the basic skills to make convincing edits.

    • Cosmic Cleric@lemmy.world
      8 months ago

      Honestly, it looks like the picture on the left is fake, like the guy was inserted into it. Just look at his outline, compared with the rest of the background.

      (I’m no Stalin fan, just commenting on the picture itself.)

    • smileyhead@discuss.tchncs.de
      8 months ago

      I can imagine such regimes nowadays developing some sort of cryptographic photo attestation, so any photo not signed by them would be shown as untrusted, regardless of whether it’s fake or not. And all the code, from the processor to the camera app, would need to be approved by their servers in order to get a signature.

      Oh wait! Our great friends at Adobe, Intel, Google and Microsoft are already working on just that: https://c2pa.org/
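      The attestation scheme described in this comment (an authority signs each photo, and viewers treat anything unsigned or modified as untrusted) can be sketched in a few lines. This is a toy illustration only, not the actual C2PA design: C2PA embeds a signed manifest in the file and uses X.509 certificate-based signatures, whereas this sketch uses a simple HMAC and an assumed authority key so the example stays self-contained.

```python
# Toy sketch of photo attestation. NOT the real C2PA protocol:
# C2PA uses X.509 certificates and embedded manifests; here an HMAC
# with a shared authority key stands in for the digital signature.
import hashlib
import hmac

AUTHORITY_KEY = b"camera-vendor-secret"  # hypothetical key held by the signing authority

def sign_photo(photo_bytes: bytes, key: bytes = AUTHORITY_KEY) -> str:
    """The authority computes an attestation tag over the photo's raw bytes."""
    return hmac.new(key, photo_bytes, hashlib.sha256).hexdigest()

def verify_photo(photo_bytes: bytes, tag: str, key: bytes = AUTHORITY_KEY) -> bool:
    """A viewer marks the photo trusted only if the tag still matches."""
    expected = hmac.new(key, photo_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

photo = b"\x89PNG...raw image bytes..."
tag = sign_photo(photo)
assert verify_photo(photo, tag)             # untouched photo: shown as trusted
assert not verify_photo(photo + b"!", tag)  # any edit invalidates the attestation
```

The point the comment makes follows directly from the verification step: the check only proves the bytes are unchanged since signing, so a regime controlling the signing key could attest to fakes and refuse to sign real photos, and viewers enforcing "unsigned means untrusted" would go along with it.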