lidd1ejimmy@lemmy.ml to Memes@lemmy.ml · English · 4 months ago
Offline version of Chat GPT (lemmy.ml, image, 27 comments)
neidu2@feddit.nl · 4 months ago
Technically possible with a small enough model to work from. It's going to be pretty shit, but "working". Now, if we were to go further down in scale, I'm curious how/if a 700 MB CD version would work, or how many 1.44 MB floppies you would need for the actual program and the smallest viable model.
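The floppy question above is just division with rounding up. A minimal sketch, where the model sizes are purely hypothetical examples, not figures for any real release:

```python
# Rough storage math: how many 1.44 MB floppies a small model would need.
# The example model sizes below are assumptions for illustration only.
FLOPPY_BYTES = 1_440_000  # the "1.44 MB" floppy holds 1,440,000 bytes

def floppies_needed(model_bytes: int) -> int:
    # Ceiling division: a partially filled last disk still counts.
    return -(-model_bytes // FLOPPY_BYTES)

tiny_model = 50 * 1024 * 1024   # a hypothetical ~50 MB micro-model
cd_image = 700 * 1024 * 1024    # a full 700 MB CD's worth of model

print(floppies_needed(tiny_model))  # 37
print(floppies_needed(cd_image))    # 510
```

Even a toy-sized model lands in "shoebox of floppies" territory before you account for the program itself.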
Naz@sh.itjust.works · 4 months ago
*squints* That says "PHILLIPS DVD+R". So we're looking at a 4.7 GB model, or just a hair under the tiniest, most incredibly optimized implementation of <INSERT_MODEL_NAME_HERE>
curbstickle@lemmy.dbzer0.com · 4 months ago
Llama 3 8B, Phi-3 Mini, Mistral, Moondream 2, Neural Chat, Starling, Code Llama, Llama 2 Uncensored, and LLaVA would fit.
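Whether a given model fits is a one-line comparison against DVD capacity. A quick sketch, where the quantised download sizes are rough assumptions rather than exact figures for any particular release:

```python
# Which (assumed, 4-bit-quantised) model sizes fit on a single-layer DVD?
# The sizes below are ballpark assumptions, not measured downloads.
DVD_BYTES = 4_700_000_000  # single-layer DVD capacity as marketed

assumed_sizes_gb = {
    "phi-3-mini (4-bit)": 2.3,   # hypothetical size
    "mistral-7b (4-bit)": 4.1,   # hypothetical size
    "llama-3-8b (4-bit)": 4.7,   # hypothetical size, right at the limit
}

for name, gb in assumed_sizes_gb.items():
    verdict = "fits" if gb * 1e9 <= DVD_BYTES else "too big"
    print(f"{name}: {verdict}")
```

The pattern holds generally: 4-bit quantisations of models up to roughly 8B parameters land under the 4.7 GB line.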
BudgetBandit@sh.itjust.works · 4 months ago
Just interested in the topic: did you 🔨 offline privately?
curbstickle@lemmy.dbzer0.com · 4 months ago
I'm not an expert on them or anything, but feel free.
NoiseColor@lemmy.world · 4 months ago
Might be a DVD. A 70B Ollama LLM is like 1.5 GB, so you could save many models on one DVD.
errer@lemmy.world · 4 months ago
It is a DVD; you can faintly see DVD+R on the left side.
Ignotum@lemmy.world · 4 months ago
A 70B model taking 1.5 GB? So 0.02 bytes per parameter? Are you sure you're not thinking of a heavily quantised and compressed 7B model or something? Ollama's llama3 70B is 40 GB from what I can find; that's a lot of DVDs.
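The size-per-parameter arithmetic in the exchange above is easy to check, since GB divided by billions of parameters conveniently equals bytes per parameter:

```python
# Sanity check: disk size per model parameter.
# size_gb / params_billions == bytes per parameter.
def bytes_per_param(size_gb: float, params_b: float) -> float:
    return size_gb / params_b

claimed = bytes_per_param(1.5, 70)  # the claimed 1.5 GB for a 70B model
actual = bytes_per_param(40, 70)    # the ~40 GB Ollama figure for 70B

print(round(claimed, 3), round(claimed * 8, 2))  # 0.021 bytes ≈ 0.17 bits/param
print(round(actual, 3), round(actual * 8, 2))    # 0.571 bytes ≈ 4.57 bits/param
```

Roughly 4.6 bits per parameter is consistent with a standard 4-bit quantisation plus overhead; 0.17 bits per parameter is far below anything a usable quantisation achieves, which is why the 1.5 GB claim couldn't hold for a 70B model.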
NoiseColor@lemmy.world · 4 months ago
Ah yes, probably the smaller version, you're right. Still a very good LLM, better than GPT-3.
9point6@lemmy.world · 4 months ago
Less than half of a BDXL though! The dream still breathes.
Steve@startrek.website · 4 months ago
For some reason, triple-layer writable Blu-ray exists, 100 GB each: https://www.verbatim.com/prod/optical-media/blu-ray/bd-r-xl-tl/bd-r-xl-tl/
DannyBoy@sh.itjust.works · 4 months ago
It does have the label DVD-R.
kindenough@kbin.earth · 4 months ago
Maybe not an LLM at all: https://en.wikipedia.org/wiki/ELIZA
Num10ck@lemmy.world · 4 months ago
ELIZA was pretty impressive for the 1960s, as a chatbot for psychology.
lidd1ejimmy@lemmy.ml (OP) · 4 months ago
Yes, I guess it would be a funny experiment for just a local model.
veroxii@aussie.zone · 4 months ago
`pkzip -& a:\chatgpt.zip c:\chatgpt\*.*`