heavyset_go | 2 days ago
Home calculators are as cheap as they've ever been, but this era of computing is out of reach for the majority of people. The analogous PC for this era requires a large amount of high-speed memory and specialized inference hardware.
dghlsakjg | 2 days ago
What regular home workload are you thinking of that the computer I described is incapable of? You can call a computer a calculator, but that doesn't make it a calculator. Can they run SOTA LLMs? No. Can they run smaller yet still capable LLMs? Yes. However, I don't think the ability to run SOTA LLMs is a reasonable expectation for "a computer in every home" when that software category has only existed for a few years.
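A minimal sketch of what "running a smaller LLM locally" can look like in practice, assuming Ollama is installed and serving on its default port and that a small model such as llama3.2:3b has already been pulled (both assumptions on my part, not details from the thread):

    # Minimal sketch: query a small local model through Ollama's HTTP API.
    # Assumes Ollama is running locally (default port 11434) and a small
    # model like "llama3.2:3b" has been pulled beforehand -- both assumptions.
    import requests

    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3.2:3b",  # ~3B-parameter model; fits in a few GB of RAM
            "prompt": "Summarize why local inference matters in one sentence.",
            "stream": False,         # return a single JSON object instead of a stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["response"])

Models in this size class run acceptably on ordinary consumer hardware without specialized inference accelerators, which is the point being made about "smaller, yet still capable" LLMs.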
atonse | 2 days ago
You can have access to a supercomputer for pennies, internet access for very little money, and even an M4 Mac mini for $500. You can have a Raspberry Pi computer for even less, and buy a monitor for a couple hundred dollars. I feel like you're moving the goalposts to make your point that access to AI has to mean local compute. Why does it need to be local?

Update: I take it back. You can get access to AI for free.
platevoltage | 2 days ago
No, it doesn't. The majority of people aren't trying to run Ollama on their personal computers.