Just out of curiosity. I have no moral stance on it, if a tool works for you I’m definitely not judging anyone for using it. Do whatever you can to get your work done!

  • diffuselight@lemmy.world · 1 year ago
    That may have been their plan, but Meta fucked them over by releasing LLaMA, which now runs on local machines at up to 30B parameters and, by the end of the year, will run at better-than-GPT-3.5 quality on an iPhone.
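    A quick back-of-envelope on why a 30B model can run on a local machine (a sketch, assuming weights-only storage with no KV-cache or activation overhead, and 1 GB = 1e9 bytes; the bit widths are typical fp16 vs. 4-bit quantization, not specific to any one model):

    ```python
    # Approximate memory footprint of LLM weights at a given precision.
    # Assumption: weight storage only; real inference needs extra room
    # for the KV cache and activations.

    def weight_gb(params_billion: float, bits_per_weight: int) -> float:
        """Size of model weights in gigabytes (1 GB = 1e9 bytes)."""
        return params_billion * 1e9 * bits_per_weight / 8 / 1e9

    print(weight_gb(30, 16))  # fp16: 60.0 GB -- server-GPU territory
    print(weight_gb(30, 4))   # 4-bit quantized: 15.0 GB -- fits in desktop RAM
    print(weight_gb(7, 4))    # 3.5 GB -- phone-sized
    ```

    Quantizing to 4 bits is what moves a 30B model from datacenter hardware to a gaming PC, and a 7B model down toward phone-scale memory budgets.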

    Local LLMs, like Airoboros, WizardLM, StableVicuna, or StableCode, are real alternatives in many domains.