qup 5 hours ago
"maintain a home server" in this case roughly means "park a headless Mac mini (or laptop or RPi) on your desk" And you can use a local LLM if you want to eliminate the cloud dependency. | ||||||||
orsorna 5 hours ago | parent | next
You have to spend tens of thousands of dollars on hardware to approach the reasoning and tool-call levels of SOTA models, so casually saying "just use a local LLM" is out of reach for the common man.
mystifyingpoi 4 hours ago | parent | prev
> And you can use a local LLM

That ship has sailed a long time ago. It's of course possible, if you are willing to invest a few thousand dollars extra for a graphics-card rig and pay for the power.