• basmati@lemmus.org · 21 days ago

      It’s really not. It’s a fad, like zip disks or e-waste recycling. Only it’s even more expensive, while reducing productivity and quality of work everywhere it’s implemented, all for the vague hope that it might eventually get better.

      • laranis@lemmy.zip · 21 days ago

        How dare you besmirch the good name of zip disks! There was a good 18-month period in the nineties when they filled a valid use case in the gap between floppy disks and the widespread adoption of WAN solutions for moving and storing data.

      • asmodee59@lemmy.world · 21 days ago

        I would be really annoyed if it were just a fad, seeing as it saves me at least an hour of work a day.

      • ᵀʰᵉʳᵃᵖʸᴳᵃʳʸ@lemmy.blahaj.zone · 21 days ago

        This is such a weak take. It’s constantly getting more efficient, and it’s already extremely helpful: it’s been incorporated into countless applications. OpenAI might go away, but LLMs and GenAI won’t. I run an open-source local LLM to automate most of my documentation workflow, and that’s not going away.
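
        For illustration, here’s a minimal sketch of that kind of workflow. It assumes a local Ollama server with a model already pulled; the model name, prompt, and endpoint are placeholder assumptions, not my exact setup:

        ```python
        # Minimal sketch: ask a locally served model to draft a changelog
        # entry from a git diff. Assumes an Ollama server on localhost and a
        # pulled model named "llama3" (both assumptions).
        import sys

        import requests

        def summarize_diff(diff_text: str) -> str:
            """Ask the local model to draft a changelog entry for a diff."""
            resp = requests.post(
                "http://localhost:11434/api/generate",  # Ollama's default endpoint
                json={
                    "model": "llama3",  # placeholder model name
                    "prompt": "Write a short changelog entry for this diff:\n"
                    + diff_text,
                    "stream": False,  # return one JSON object, not a stream
                },
                timeout=120,
            )
            resp.raise_for_status()
            return resp.json()["response"]

        if __name__ == "__main__":
            # e.g. git diff | python doc_helper.py
            print(summarize_diff(sys.stdin.read()))
        ```

        The output is only a first draft that still gets edited by hand; the time savings come from automating that first pass.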

        • Voroxpete@sh.itjust.works · 20 days ago

          “it’s been incorporated into countless applications”

          I think the phrasing you were looking for there was “hastily bolted onto.” Was the world actually that desperate for tools that make bad summaries of data and sometimes write short-form emails for us? Does that really justify the billions upon billions of dollars being thrown at this technology?

          • Zos_Kia@lemmynsfw.com · 20 days ago

            This comment shows you have no idea what is going on. Have fun in your little bubble, son.

      • dependencyinjection@discuss.tchncs.de · 21 days ago

        Do you think AI and/or AGI is a possibility at all, given enough time?

        Because if the answer is yes, then don’t we need people working on it all the time to keep inching towards it? I’m not saying the current implementations are anywhere close, but they do have their use cases. I’m a software developer, and my boss, the lead engineer (the smartest person I’ve ever met), has built some awesome tools that save our company of 7 people maybe 100 hours of work a month.

        People used to complain about the LHC, and it has since made countless discoveries that help in other fields.

        • basmati@lemmus.org · 21 days ago

          LLMs and GANs in general are to AI and AGI what a hand-pumped well is to the ISS. Sure, they’re both technological marvels of their time, but if you want to float in microgravity, there is no possible adjustment you can make to the former to get it to behave like the latter.

        • Voroxpete@sh.itjust.works · 20 days ago

          Powered flight was an important goal, but that wouldn’t have justified throwing all the world’s resources at making Da Vinci’s flying machine work. Some ideas are just dead ends.

          Transformer-based generative models have no demonstrable path to becoming AGI, and we’re already hitting a hard ceiling of diminishing returns on the very limited set of things they actually can do. Developing better versions of these models requires exponentially more data at exponentially scaling compute costs (yes, exponentially: current estimates suggest there literally isn’t enough training data in the world to get past another generation or two of development).
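
          To make that arithmetic concrete, here’s a toy back-of-envelope sketch; every number in it is an illustrative assumption, not a measurement:

          ```python
          # Toy sketch of the "not enough training data" argument. Assumed,
          # illustrative values: each generation needs ~10x the training
          # tokens of the last, starting from 10 trillion tokens, against
          # ~1 quadrillion tokens of usable public text.
          TOKENS_CURRENT_GEN = 10e12  # assumed tokens for a current frontier model
          GROWTH_PER_GEN = 10         # assumed data multiplier per generation
          PUBLIC_TEXT_TOKENS = 1e15   # assumed total usable public text

          tokens, generations = TOKENS_CURRENT_GEN, 0
          while tokens < PUBLIC_TEXT_TOKENS:
              generations += 1
              tokens *= GROWTH_PER_GEN

          # With these assumptions: 10T -> 100T -> 1,000T, i.e. about two
          # more generations before the assumed supply is exhausted.
          print(f"Public text exhausted after ~{generations} more generation(s).")
          ```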

          Whether or not AGI is possible, it has become extremely apparent that this approach is not going to be the one that gets us there. So what is the benefit of continuing to pile more and more resources into it?

    • omarfw@lemmy.world · 20 days ago

      LLMs are not AI. They’re content-stealing blenders wearing a name tag that says “AI” on it.