bee_rider 7 hours ago
LLMs are kind of fun to play with (this is a website for nerds, who among us doesn't find a computer that talks back kind of fun), but I don't really understand why people pay for these hosted versions. While the tech is still nascent, why not do a local install and learn how everything works?
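For anyone curious what a "local install" amounts to in practice, here is a minimal local-inference sketch in Python, assuming the llama-cpp-python bindings and a quantized GGUF model already downloaded (the model filename below is just a placeholder, not a specific recommendation):

    # Minimal local chat completion with llama-cpp-python.
    # Assumes: pip install llama-cpp-python, and a GGUF model
    # file already on disk (the path below is a placeholder).
    from llama_cpp import Llama

    llm = Llama(
        model_path="qwen2.5-coder-7b-instruct-q4_k_m.gguf",  # placeholder filename
        n_ctx=4096,  # context window size
    )

    result = llm.create_chat_completion(
        messages=[{"role": "user",
                   "content": "Write a Python function that reverses a string."}],
        max_tokens=256,
    )
    print(result["choices"][0]["message"]["content"])

Everything runs on your own hardware, so speed and quality depend entirely on the model size and quantization your machine can hold.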
causalmodels 6 hours ago | parent
Because my local machine is a laptop and doesn't have a GPU cluster or TPU pod attached to it.
exe34 5 hours ago | parent
Claude Code with Opus is a completely different creature from aider with Qwen on a 3090. The latter writes code. The former solves problems with code, and keeps growing the codebase with new features (until I lose control of the complexity and each subsequent call uses up more and more tokens).