Writing a 100-word email using ChatGPT (GPT-4, latest model) consumes 1 x 500ml bottle of water. It uses 140Wh of energy, enough for 7 full charges of an iPhone Pro Max.
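
A quick back-of-envelope check of the headline arithmetic. The battery capacity and charging efficiency below are assumed values (roughly a recent iPhone Pro Max), not figures from the article; the per-email energy number is taken from the headline itself:

    # Sanity check of "140 Wh ≈ 7 full iPhone Pro Max charges".
    # battery_wh and charge_efficiency are assumptions, not sourced figures.
    battery_wh = 17.3         # approx. battery capacity of a recent iPhone Pro Max, in Wh
    charge_efficiency = 0.85  # assumed wall-to-battery charging efficiency
    headline_energy_wh = 140  # per-email energy figure claimed in the headline

    wall_energy_per_charge = battery_wh / charge_efficiency   # ≈ 20 Wh drawn from the wall
    print(headline_energy_wh / wall_energy_per_charge)        # ≈ 6.9 full charges

Under those assumptions the "7 full charges" figure is internally consistent with 140 Wh.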

  • douglasg14b@lemmy.world · 22 points · 2 days ago

    Computational demands scale aggressively with model size.

    And if you want a response back in a reasonable amount of time, you’re burning a ton of power to do so. These models are not fast at all.
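
    A rough sketch of how raw inference cost grows with parameter count, using the common rule of thumb of ~2 × parameters FLOPs per generated token for a dense transformer. The hardware efficiency figure and the model sizes are assumptions, and real serving overheads (batching, idle GPUs, cooling, memory-bandwidth limits) come on top:

        # Forward-pass FLOPs per generated token ≈ 2 × parameter count (dense-transformer rule of thumb).
        # FLOPS_PER_JOULE is an assumed effective rate, not a measured one.
        FLOPS_PER_JOULE = 1e11  # assumed ~100 GFLOPs of useful work per joule at the accelerator

        for name, params in [("7B local model", 7e9), ("70B model", 70e9), ("~1.8T-class model", 1.8e12)]:
            joules_per_token = 2 * params / FLOPS_PER_JOULE
            wh_per_reply = joules_per_token * 500 / 3600  # 500-token reply, joules → Wh
            print(f"{name}: ~{wh_per_reply:.2f} Wh per reply (chip-level arithmetic only)")

    The takeaway is the linear scaling: a model a couple of hundred times larger costs roughly a couple of hundred times more energy per token, before any serving overhead.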

    • teh7077@lemmy.today · 17 points · 2 days ago

      Thanks for confirming my suspicion.

      So, the whole debate about “environmental impact of AI” is not about generative AI as such at all. Really comes down to people using disproportionately large models for simple tasks that could be done just as well by smaller ones, run locally. Or worse yet, asking a behemoth model like GPT-4 about something that could and should have been a simple search engine query, which I (subjectively) feel has become a trend in everyday tech usage…
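
      For what it’s worth, running a small model locally for a task like drafting a short email is already easy. A minimal sketch, assuming llama-cpp-python and some small instruction-tuned GGUF model (the model path is a placeholder):

          # Minimal local-inference sketch (pip install llama-cpp-python).
          # The model path is a placeholder for whatever small GGUF model you have downloaded.
          from llama_cpp import Llama

          llm = Llama(model_path="models/small-instruct.gguf", n_ctx=2048)  # CPU is fine for small models
          out = llm("Write a polite 100-word email declining a meeting invitation.", max_tokens=200)
          print(out["choices"][0]["text"])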

      • OhNoMoreLemmy@lemmy.ml · 1 point · 1 day ago

        It’s about generative AI as it is currently used.

        But yeah, the complaints everyone has about Gen AI are mostly driven by speculative venture capital. The only advantage Google and OpenAI can maintain over open-source models is a willingness to spend more per token than a hobbyist. So they’re pumping cash in to subsidize their LLMs, and that carries a stupidly high environmental cost.

        There’s no possible endgame here. Unlike the normal tech monopolies, you can’t put hobbyist models out of business by subsidizing your own products. But the market is irrational, expects a general AI, and is encouraging this behavior.