attogram 2 days ago
"It works. That's annoying." Indeed! It would be cooler if support for local LLMs were added; currently it only supports Anthropic and OpenAI. https://github.com/samrolken/nokode/blob/main/src/config/ind...
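For what it's worth, many local runners (Ollama, llama.cpp's server, LM Studio) expose an OpenAI-compatible HTTP API, so one low-effort path would be to reuse the existing OpenAI client and just point its base URL at localhost. A minimal sketch, assuming a hypothetical provider-config shape (not nokode's actual config, which I haven't checked):

```typescript
// Hypothetical provider config -- nokode's real config shape may differ.
interface ProviderConfig {
  name: string;
  baseURL: string; // where the OpenAI-compatible API is served
  apiKey: string;  // local servers usually ignore this, but clients require a value
  model: string;
}

// Ollama serves an OpenAI-compatible endpoint under /v1 by default.
const localProvider: ProviderConfig = {
  name: "local",
  baseURL: "http://localhost:11434/v1",
  apiKey: "not-needed",
  model: "llama3.1",
};

// The official openai npm client accepts a custom baseURL, so it could talk
// to this endpoint without other changes:
//   const client = new OpenAI({ baseURL: localProvider.baseURL, apiKey: localProvider.apiKey });

// Helper to build the chat-completions URL from a provider config.
function chatEndpoint(cfg: ProviderConfig): string {
  return `${cfg.baseURL}/chat/completions`;
}
```

That would also give you the "pay up front" property the reply below mentions: the only cost is the hardware you already own.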
mikebelanger 2 days ago | parent
Yeah, that'd really be something. If you could just pay the cost up front, rather than worry about how much each new request costs, that really changes the game. There are still many other issues to worry about, like security. But as the author points out, we might be much closer than we think.