nucleardog 7 hours ago

Hey now! I've got a half terabyte of RAM at my disposal! I mean, it's DDR4 but... it's RAM!

And it's paired with 48 processor cores! I mean, they don't even support AVX512 but they can do math!

I could totally train an LLM! Or at least my family could... might need my kid to pick up and carry on the project.

But in all seriousness... you either missed the point, are being needlessly pedantic, or are... wrong?

This is about learning concepts, and the rest of this is mostly moot.

On the pedantic-or-wrong note: what is the documented cut-off for a "large" language model? Because GPT-2 was and is described as a "large" language model, and it had 1.5B parameters. These days you can just about get a consumer GPU capable of training that for about $400.
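For a rough sense of why 1.5B parameters sits right at the consumer-GPU limit, here's a back-of-the-envelope sketch (the 16 bytes/param figure is the usual mixed-precision Adam rule of thumb, not a measurement, and it ignores activations and batch size):

```python
# Rough training-memory estimate for mixed-precision Adam:
# fp16 weights (2 B) + fp16 grads (2 B) + fp32 master weights (4 B)
# + fp32 Adam first/second moments (4 B + 4 B) ~= 16 bytes per parameter.
def training_mem_gb(n_params: float, bytes_per_param: float = 16.0) -> float:
    """Approximate GB of VRAM needed for weights + optimizer state."""
    return n_params * bytes_per_param / 1e9

print(training_mem_gb(1.5e9))  # GPT-2 at 1.5B params: ~24 GB
```

By this estimate GPT-2 barely fits on a 24GB consumer card before you've allocated a single activation, which is why training it at home is "just about" possible rather than comfortable.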

baalimago 5 hours ago | parent | next [-]

Yeah, it's just a semantic pet peeve. Let me ask you this: what is a "Language Model" if this is a "Large Language Model"? Inversely, if a 1.5B model is "Large", then what are the recent 1T-param models? "Superlarge"?

In my own very humble opinion, it becomes "Large" when it's out of reach of non-specialized hardware. So currently, a model which requires more than 32GB of VRAM is large (as that's roughly where the high-end gaming GPUs top out).
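To put numbers on that cut-off, a minimal weights-only sketch (this ignores activations and KV cache, so these are lower bounds, and the dtype sizes are the standard ones, not anything model-specific):

```python
# Bytes needed just to hold the weights, by storage precision.
BYTES_PER_PARAM = {"fp32": 4.0, "fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weight_vram_gb(n_params: float, dtype: str = "fp16") -> float:
    """Approximate GB of VRAM to hold the weights alone (no activations)."""
    return n_params * BYTES_PER_PARAM[dtype] / 1e9

print(weight_vram_gb(1.5e9, "fp16"))  # GPT-2: 3 GB -- trivially "small" today
print(weight_vram_gb(1e12, "fp16"))   # a 1T-param model: ~2000 GB of weights
```

By the 32GB yardstick, anything past roughly 16B params at fp16 (or ~64B at int4) stops fitting on a gaming GPU.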

And by the way, there is no way you can train a language model on a CPU, even with DDR5, unless you're willing to wait a whole week for a single training cycle. Give it a go! I know I did; it's an order of magnitude away from being feasible.

joefourier 39 minutes ago | parent [-]

Calling anything "large" in computing is problematic, since hardware keeps improving. GPT-1 was an LLM in 2018 and had 117M parameters; when did it stop being large?

GPT would have been a better term than LLM, but it unfortunately became too associated with OpenAI. And then, what about non-transformer LLMs? And multimodal LLMs?

Maybe we should just give up, shrug and call it "AI".

Malcolmlisk 6 hours ago | parent | prev [-]

Then rewrite the title and call it "learn how to build a non-usable LLM from scratch".

improbableinf 6 hours ago | parent [-]

Opus 4.7 is non-usable for the tasks I have, but it's still considered an LLM.

And no one is stopping anyone from tweaking a few parameters in this repo to go above 10M parameters.

skinfaxi an hour ago | parent [-]

What tasks is it non-usable for?