stavros 7 hours ago
This is just the conversational interface issue. You need the system to be able to do most of the things you would expect a human to be able to do (e.g. if you're talking to your phone, you'd expect it to be able to do most phone things). If the conversational system can only do a small subset of those, then it just becomes a game of "discover the magical incantation that is in the set of possibilities", which quickly turns into an exercise in frustration. This is why LLMs are the first conversational interface to actually have a chance of working, once you give them enough tools.
s1mplicissimus 7 hours ago
> once you give them enough tools

Are there solutions I'm not aware of to the error rates when picking from dozens or even hundreds of tools?