cmpxchg8b · 9 hours ago
8GB? What is this, an LLM for ants?
kirurik · 5 hours ago
You can run some models pretty decently with CPU-only inference: models like Gemma 3 that are built for exactly that use case, or some tiny speech-to-text models via llama.cpp that I have tried out (not so good). It's not the best for "heavy" tasks, but if you just need a decent text generator that produces more or less sensible, generic output, you're good to go.
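For reference, CPU-only inference with llama.cpp looks roughly like this. This is a sketch, not from the comment: the model filename and thread count are placeholders, and any small quantized GGUF model will do.

```shell
# Build llama.cpp (the CPU backend is the default, no GPU toolkit needed)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build && cmake --build build --config Release

# Run a small quantized model entirely on the CPU.
# gemma-3-1b-it-Q4_K_M.gguf is a placeholder filename; substitute your own GGUF.
# -n caps generated tokens, -t sets the CPU thread count.
./build/bin/llama-cli -m gemma-3-1b-it-Q4_K_M.gguf \
    -p "Write a one-line greeting." \
    -n 64 -t 4
```

On a small board, the quantization level (e.g. Q4_K_M vs Q8_0) is the main lever for trading quality against memory footprint and speed.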
matja · 7 hours ago
It's more about demonstrating what's possible on a Pi than expecting GPT-4 level performance. It's designed for LLMs that specialize in tiny, incredibly specific tasks. Like, "What's the weather in my ant farm?" ;) The vision processing boost is notable, but not enough to justify the price over existing HATs. The lack of reliable mixed-mode functionality and sparse software support are significant red flags. (This reply generated by an LLM smaller than 8GB, for ants, using the article and comment as context.)
mlvljr · 8 hours ago
[dead]