▲ | varispeed 10 months ago
I use ChatGPT and Claude daily, but I can't see a use case for an LLM outside of these services. What do you use Llama.cpp for? I get that you can ask it a question in natural language and it will spit out a sort-of answer, but what would you do with it? What do you ask it?
▲ | anon373839 10 months ago | parent
You can run a model with substantially similar capabilities to Claude or ChatGPT locally, with absolute data privacy guaranteed. Whereas with Claude or ChatGPT, all you can do is trust and hope they won’t use your data against you at some point in the future. If you’re more technically minded, you can hack on the model itself, the sampling method, etc., and have a level of fine-grained control over the technology that isn’t possible with a cloud model.
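As a toy illustration of that "hack on the sampling method" point: the sketch below implements temperature plus top-k sampling over raw logits in plain Python. It is a simplified, hypothetical example, not llama.cpp's actual sampler chain (which also supports top-p, min-p, repetition penalties, and more), but it shows the kind of step you get to control when you run inference locally.

```python
import math
import random

def sample_top_k(logits, k=40, temperature=0.8, rng=None):
    """Temperature + top-k sampling over raw logits.

    Simplified sketch: real sampler chains apply more steps
    (repetition penalties, top-p/min-p filtering, etc.).
    """
    rng = rng or random.Random()
    # Keep only the k candidates with the highest logits.
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    # Temperature-scale the survivors, then softmax them.
    scaled = [logits[i] / temperature for i in top]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one token id from the resulting distribution.
    r = rng.random()
    acc = 0.0
    for tok, p in zip(top, probs):
        acc += p
        if r <= acc:
            return tok
    return top[-1]

# Hypothetical 5-token vocabulary; higher logit = more likely.
logits = [2.0, 0.5, -1.0, 1.5, 0.1]
token = sample_top_k(logits, k=3, temperature=0.7, rng=random.Random(0))
```

Lowering `temperature` sharpens the distribution toward the argmax; lowering `k` hard-limits how many candidates can be picked at all. With a cloud API you can usually only turn the knobs the provider exposes; locally you can rewrite this loop entirely.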
▲ | SteelPh0enix 10 months ago | parent
I use llama.cpp mostly for working with code that I can't share with any online provider. Simple NDA stuff. Some refactors are easier to do via LLM than manually. It's a decent rubber duck for debugging, too.