MaxikCZ 3 days ago
I feel like there's a time in the near future where LLMs will be too cautious to answer any question they aren't sure about, and most of the human effort will go into pleading with the LLM to at least try to give an answer, which will almost always be correct anyway.
plufz 3 days ago | parent | next
That would be great if you could have a setting like temperature, 0.0-1.0 (0.0: only answer if you are 100% sure; 1.0: guess as much as you like).
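A minimal sketch of what such a knob might look like, assuming a hypothetical model interface that reports a confidence score alongside each answer (real LLM APIs don't expose this directly; `fake_model` is a stand-in):

```python
# Hypothetical "only answer if confident" knob: a caller-set threshold in
# [0.0, 1.0]; below it the model abstains instead of guessing.

def fake_model(question):
    # Stub standing in for an LLM that returns (answer, confidence).
    known = {"capital of France": ("Paris", 0.98)}
    return known.get(question, ("not sure", 0.30))

def answer_with_threshold(question, min_confidence=0.9):
    """Return the answer only if confidence >= min_confidence, else abstain."""
    answer, confidence = fake_model(question)
    if confidence >= min_confidence:
        return answer
    return None  # abstain rather than guess

print(answer_with_threshold("capital of France"))    # confident: prints Paris
print(answer_with_threshold("capital of Atlantis"))  # low confidence: prints None
```

With `min_confidence=0.0` the model always answers; with `1.0` it only speaks when certain, matching the scale the comment describes.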
littlestymaar 3 days ago | parent | prev
It's not going to happen, as users would just leave the platform. It would be better for most API usage though: for a business, doing just a fraction of the job with 100% accuracy is often much preferable to claiming to do 100% when 20% of it is garbage.