dominicrose 6 hours ago:
I understand wanting to run a local LLM for privacy, but on something more powerful. Yes, it's costly, but what use is an LLM on a cheap board?
horsawlarway 2 hours ago:
Depends entirely on the model you want to run. There are absolutely useful things you can do with TTS/STT/diarization/etc. on even really minimal specs, and some of those run fine on RPis even without this new hat. The extra RAM probably opens the door to a large number of vision/image models, which typically want a minimum of 16 GB but do better with 24 or 32. There are a HUGE number of case-specific models that do just fine on hardware at the RPi level, assuming you have the RAM to load them.
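As a rough illustration of why RAM is the gating factor here, a quick sketch of the memory needed just to hold model weights at different quantization levels (the parameter counts are illustrative examples, not tied to any specific model, and this ignores KV cache and activation overhead, which add more on top):

```python
# Back-of-the-envelope RAM estimate for loading model weights.
# Parameter counts are illustrative; real runtime needs extra headroom
# for the KV cache, activations, and the OS itself.

def weight_memory_gb(params_billion: float, bits_per_param: int) -> float:
    """Approximate GB needed just to hold the weights."""
    bytes_total = params_billion * 1e9 * bits_per_param / 8
    return bytes_total / 1e9  # decimal GB for simplicity

for params in (0.1, 1.0, 7.0):   # tiny STT-style model up to a 7B LLM
    for bits in (16, 8, 4):      # fp16, int8, int4 quantization
        print(f"{params:>4}B params @ {bits:>2}-bit: "
              f"~{weight_memory_gb(params, bits):.1f} GB")
```

Even a 7B model quantized to 4-bit fits in ~3.5 GB of weights, which is why small-board RAM capacity, not compute, is usually what decides whether a model loads at all.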
avhception 4 hours ago:
I was mostly responding to:

> [...] if I want some low-power Linux PC replacement with display output, for the price of the latest RPi 5, I can buy a ~2018 laptop on the used market, I guess.

I don't care about the AI hat at all.