featherless a day ago
Except most of those services don't have at-home equivalents you can run on your own hardware.
oceanplexian a day ago | parent | next
I run models with Claude Code (using llama.cpp's Anthropic-compatible API) on my own hardware, and it works every bit as well as Claude did literally 12 months ago. If you don't believe me and don't want to mess around with used server hardware, you can walk into an Apple Store today, pick up a Mac Studio, and do it yourself.
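For anyone curious what that setup looks like, a rough sketch (the `ANTHROPIC_BASE_URL` variable is documented by Claude Code; the model path is a placeholder, and the exact llama-server flags and endpoint support depend on your llama.cpp version, so treat this as an assumption to verify against your build's docs):

```shell
# Serve a local GGUF model with llama.cpp's llama-server
# (assumes a build recent enough to expose an Anthropic-compatible
# messages endpoint -- check your version before relying on this).
llama-server -m ./models/your-model.gguf --port 8080 &

# Point Claude Code at the local server instead of Anthropic's API.
export ANTHROPIC_BASE_URL="http://localhost:8080"
export ANTHROPIC_API_KEY="dummy"  # the local server doesn't check the key

claude
```

The Mac Studio angle works because llama.cpp runs GGUF models on Apple Silicon's unified memory via Metal, so large models fit without a discrete GPU.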
| |||||||||||||||||||||||
| ▲ | a day ago | parent | prev [-] | ||||||||||||||||||||||
| [deleted] | |||||||||||||||||||||||