noodletheworld 7 hours ago

> Can I run a local LLM that allows me to control Home Assistant with natural language? Some basic stuff like timers, to do/shopping lists etc would be nice etc.

No. Get the larger Pi recommended by the article.

Quote from the article:

> So power holds it back, but the 8 gigs of RAM holds back the LLM use case (vs just running on the Pi's CPU) the most. The Pi 5 can be bought in up to a 16 GB configuration. That's as much as you get in decent consumer graphics cards.

> Because of that, many quantized medium-size models target 10-12 GB of RAM usage (leaving space for context, which eats up another 2+ GB of RAM).

> 8 GB of RAM is useful, but it's not quite enough to give this HAT an advantage over just paying for the bigger 16GB Pi with more RAM, which will be more flexible and run models faster.
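
To see why 8 GB is tight, here's a rough back-of-envelope sketch of how that 10-12 GB figure comes about. The model size, quantization, and layer config below are illustrative assumptions on my part, not numbers from the article:

```python
# Rough back-of-envelope: RAM needed to run a quantized LLM locally.
# All concrete numbers below are assumed for illustration.

def model_ram_gb(params_billion: float, bits_per_weight: float, overhead: float = 1.2) -> float:
    """Weights only: params * bytes-per-weight, plus ~20% runtime overhead."""
    return params_billion * (bits_per_weight / 8) * overhead

def kv_cache_gb(layers: int, kv_heads: int, head_dim: int,
                context_tokens: int, bytes_per_elem: int = 2) -> float:
    """KV cache: 2 (K and V) * layers * kv_heads * head_dim * tokens * bytes."""
    return 2 * layers * kv_heads * head_dim * context_tokens * bytes_per_elem / 1e9

# Hypothetical ~13B model at 4-bit quantization, 4k context, fp16 cache:
weights = model_ram_gb(13, 4)                                   # ~7.8 GB
cache = kv_cache_gb(layers=40, kv_heads=40, head_dim=128,
                    context_tokens=4096)                        # ~3.4 GB
print(f"weights ~{weights:.1f} GB, KV cache ~{cache:.1f} GB, total ~{weights + cache:.1f} GB")
```

That lands around 11 GB, which is why a "medium" quantized model plus context doesn't fit comfortably in 8 GB but does in 16 GB.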

The models the article actually runs on this device are small, and not fit for purpose even for the relatively trivial use case you mention.

I mean, lots of people have lots of opinions about this (many of them wrong); it's cheap, so you can buy one and try, but the OP really gave it a shot and the results were kind of shit. The article is pretty clear.

Don’t bother.

You want a device with more memory for what you're trying to do.