Will AI soon surpass the human brain? If you ask employees at OpenAI, Google DeepMind and other large tech companies, it is inevitable. However, researchers at Radboud University and other institutes present new evidence that those claims are overblown and unlikely ever to come to fruition. Their findings are published in Computational Brain & Behavior today.

  • utopiah@lemmy.ml · 56 points · 1 month ago

    It’s a classic BigTech marketing trick. They are the only ones able to build “it”, and it doesn’t matter if we like “it” or not, because “it” is coming.

    I believed in this BS for longer than I care to admit. I thought “Oh yes, that’s progress,” so of course it will come, it must come. It’s also very complex, so nobody but such large entities with so many resources can do it.

    Then… you start to encounter more and more vaporware. Grandiose announcements, and when you try the result you can’t help but be disappointed. You compare what was promised with what you got, think it’s cool, kind of, shrug, and move on with your day. It happens again, and again. Sometimes you see something really impressive, you dig, and realize it’s a partnership with a startup or a university doing the actual research. The more time passes, the more you realize that all the BigTech companies do it, across technologies. You also realize that your artist friend made something just as cool, and open-source too. Their version does not look polished, but it works. You find a Kickstarter for a product that is genuinely novel (say the Oculus DK1) and has no link (initially) with BigTech…

    You finally realize, year after year, you have been brainwashed to believe only BigTech can do it. It’s false. It’s self-serving BS designed both to stop you from building and to make you depend on them.

    You can build, we can build and we can build better.

    Can we build AGI? Maybe. Can they build AGI? They sure want us to believe it but they have lied through their teeth before so until they do deliver, they can NOT.

    TL;DR: BigTech is not as powerful as they claim to be and they benefit from the hype, in this AI hype cycle and otherwise. They can’t be trusted.

    • just another dev@lemmy.my-box.dev · 10 points · 1 month ago

      It’s one thing to claim that the current machine learning approach won’t lead to AGI, which I can get behind. But this article claims AGI is impossible simply because there are not enough physical resources in the world? That’s a stretch.

      • utopiah@lemmy.ml · 6 points · 1 month ago

        I haven’t seriously read the article yet, unfortunately (deadline tomorrow), but if there is one thing I believe is reliable, it’s computational complexity. It’s one thing to be creative and ingenious, to find new algorithms and build very efficient processors and datacenters, letting us compute increasingly complex things. It’s another thing entirely to “break free” of complexity. That, as far as we currently know, is impossible. What is counterintuitive is that seemingly “simple” problems scale terribly: one can compute a few iterations alone, or with a computer, or with a very powerful set of computers… or with every existing computer combined… only to realize that the next iteration of that well-understood problem would still NOT be solvable with every computer (even quantum ones) ever made, or that could be made from the resources available in, say, our solar system.
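        The scaling point above can be made concrete with a back-of-the-envelope sketch (my own illustrative numbers, not from the paper): even granting an optimistic exascale machine, brute-force enumeration of a search space of 2^n states becomes hopeless at quite modest n.

        ```python
        # Illustrative only: how fast exponential search spaces outrun hardware.
        # All constants are rough, assumed figures, not claims from the paper.

        OPS_PER_SECOND = 1e18        # an optimistic exascale machine
        SECONDS_PER_YEAR = 3.15e7
        UNIVERSE_AGE_YEARS = 1.4e10  # ~age of the universe, for scale

        def years_to_enumerate(n_bits: int) -> float:
            """Years needed to enumerate all 2**n_bits states at exascale speed."""
            return (2 ** n_bits) / OPS_PER_SECOND / SECONDS_PER_YEAR

        for n in (60, 100, 200, 300):
            print(f"n={n}: ~{years_to_enumerate(n):.2e} years")
        ```

        At n=60 the search finishes in about a second; by n=300 it would take vastly longer than the age of the universe, no matter how many such machines you build. Adding one more bit to the problem doubles the cost, which is the sense in which no engineering cleverness “breaks free” of the complexity class.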

        So… yes, it is a “stretch”, maybe even counterintuitive, to go as far as saying AGI is not and NEVER will be possible, but that’s what their paper claims. It’s at least interesting precisely because it goes against the trend we hear CONSTANTLY pretty much everywhere else.

      • MindTraveller@lemmy.ca · 1 point · 1 month ago

        Maybe if they keep using digital computers. What they need is an analogue system; it’s much more efficient for this kind of work.

    • yonder@sh.itjust.works · 7 points · 1 month ago

      And the big tech companies also stand to benefit from overhyping their product, to the point of saying it will take over the world. They look better to investors, and they can justify laws making them the sole arbiters of the technology to “keep it out of criminal hands” — while happily serving the criminals for a fee.