2024 might be the breakout year for efficient ARM chips in desktop and laptop PCs.

  • R0cket_M00se@lemmy.world · 13 points · 10 months ago

    “The most exciting tech isn’t the thing that currently exists and is being improved and integrated daily, it’s this other thing we don’t even know for sure will maybe happen.”

    • FaceDeer@kbin.social · 5 points · 10 months ago

      “Forget about the possibility that we may finally have developed machines that think, that comprehend the world in a way similar to how humans do and can communicate with us on our level. This new chip design might end up with comparable capabilities to the existing chip design!”

      Yeah, there was no need to try to hype this up as the biggest thing ever.

      • richieadler@lemmy.myserv.one · 4 points · 10 months ago

        Forget about the possibility that we may finally have developed machines that think, that comprehend the world in a way similar to how humans do and can communicate with us on our level.

        That isn’t what’s happening with “AI” right now.

        • R0cket_M00se@lemmy.world · 4 points · 10 months ago

          You clearly don’t work in a field where it’s cutting swaths through workflows and taking up serious slack.

          You can describe your problem to it in native English, so it does communicate on our level. It comprehends training data in the same way a human comprehends our lived experience and assimilates the data in the same manner. It’s not truly “reasoning”, but it’s leagues ahead of anything we had even four years ago and it’s only going to grow from here.

          Commercial ventures are finding new use cases every day, and to people in IT the skeptics are hilarious in the same way that people who thought the Internet was a fad were hilarious.

            • R0cket_M00se@lemmy.world · 3 points · 10 months ago

              I literally said “it’s not truly reasoning” to clarify that while it’s drawing on its training data in the same way you draw on your experiences when making new decisions, it can’t really create original thought.

              Once again lemmy proves reading comprehension is too damn hard.

              • richieadler@lemmy.myserv.one · 2 points · 10 months ago

                I thought you were the one saying AI thinks, but it was someone else. Apologies for that.

                OTOH you can take your sarcasm and insert it rectally.

    • corbin@infosec.pub (OP) · 3 points · 10 months ago

      Right, it’s less exciting now because it’s already here. I’m not expecting radically improved GPT models or whatever in 2024, probably just more iteration. The most exciting stuff there might be local AI tech becoming more usable, like we’ve seen with Stable Diffusion.

      • sir_reginald@lemmy.world · 1 point · 10 months ago

        I’m just expecting performance optimisations, especially for local LLMs. Right now there are models claimed to be as good as GPT-4 (Goliath 120B), but they require two RTX 4090s to run.

        The models that require less powerful equipment are not as good, of course.

        But hopefully, given enough time, good enough models will be able to run on mid-range hardware.
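The hardware numbers in the comment above can be sanity-checked with back-of-the-envelope arithmetic: a model's weight memory is roughly parameter count times bits per weight. A minimal sketch, where the 1.2× overhead factor (KV cache, activations, runtime buffers) is an assumption for illustration, not a measured value:

```python
def vram_gb(params_billion: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Rough VRAM estimate (GB) to load an LLM's weights plus runtime overhead."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# Goliath 120B at full 16-bit precision: ~288 GB, far beyond any consumer GPU.
full_precision = vram_gb(120, 16)

# Quantized to 4 bits: ~72 GB, which is why even two 24 GB RTX 4090s
# (48 GB total) can't hold the whole model in VRAM.
quantized = vram_gb(120, 4)

# By contrast, a 13B model at 4 bits needs under 8 GB and fits mid-range cards.
small_model = vram_gb(13, 4)
```

This is why quantization and smaller distilled models, rather than raw GPU growth, are the likeliest path to good local LLMs on mid-range hardware.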