stingraycharles a day ago
Well yes, but asking the model to ask questions to resolve ambiguities is critical if you want any success in e.g. a coding assistant. There are shitloads of ambiguities, and most of the problems people have with LLMs come from the implicit assumptions being made. Phrased differently, telling the model to ask questions before responding, to resolve ambiguities, is an extremely easy way to get a lot more success.
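A minimal sketch of what this looks like in practice, assuming an OpenAI-style chat API; the prompt wording and helper name are illustrative, not a specific product's instructions:

```python
# Hypothetical system prompt telling a coding assistant to surface
# ambiguities as questions before producing any code.
SYSTEM_PROMPT = (
    "You are a coding assistant. Before writing any code, list the "
    "ambiguities in the request and ask clarifying questions about them. "
    "Only produce code after the user answers, or after explicitly "
    "stating the assumptions you are making."
)

def build_messages(user_request: str) -> list[dict]:
    """Assemble chat messages for an OpenAI-style chat completion call."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_request},
    ]

messages = build_messages("Write a function that parses dates.")
```

The point is that the instruction lives in the system prompt, so every request gets the "ask first" behavior without the user having to repeat it.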