| ▲ | ravenstine 6 hours ago | |
Though I think these companies are wildly overvalued, I don't see LLMs as a service going away. The value in OpenAI is that it provides extra compute, data access, etc. My money is on local AI becoming more of a thing, while services like OpenAI still exist for local AIs to consult. If a local model can somehow know that it's out of its depth on a question/prompt, it can ask an OpenAI model when one is available, but otherwise keep working locally if OpenAI fails to respond or goes out of business. To me that makes a lot more sense than the future being either-or. | ||
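The local-first-with-remote-fallback routing described above could be sketched roughly like this. Everything here is an assumption for illustration: `local_generate` and `remote_generate` are hypothetical stand-ins, not real APIs, and the log-probability confidence heuristic is just one plausible "out of its depth" proxy, not a solved problem.

```python
import math

def local_generate(prompt):
    # Hypothetical stand-in for a local model call.
    # Returns (answer_text, per-token log-probabilities).
    return "local answer", [-0.1, -0.2, -0.3]

def remote_generate(prompt):
    # Hypothetical stand-in for a hosted-API call that may be
    # unreachable or discontinued.
    raise ConnectionError("service unreachable")

def answer(prompt, confidence_floor=0.7):
    text, logprobs = local_generate(prompt)
    # Crude confidence proxy (an assumption): geometric mean of
    # the token probabilities.
    confidence = math.exp(sum(logprobs) / len(logprobs))
    if confidence >= confidence_floor:
        return text                 # confident enough: stay local
    try:
        return remote_generate(prompt)  # consult the hosted model
    except Exception:
        return text                 # degrade gracefully to local
```

Either way the local answer survives: it is returned directly when confidence clears the floor, and it is the fallback when the remote call fails, which is the "not either-or" point.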
| ▲ | clhodapp 6 hours ago | parent [-] | |
Models not being able to reliably know when they are out of their depth is a foundational limitation of the current generation of models, though. The best they can do is react somewhat reliably to objective signals that they've failed at something (like test failures). | ||
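A minimal sketch of what "reacting to an objective signal" means in practice: instead of asking the model to judge itself, run its output and treat the exit code as the pass/fail signal. The helper name and the hard-coded snippets are illustrative assumptions, not anyone's actual harness.

```python
import subprocess
import sys
import tempfile

def passes_check(code: str) -> bool:
    # Write the model-generated snippet to a temp file and run it in a
    # subprocess; a nonzero exit code (e.g. a failed assert) is the
    # objective failure signal that could trigger escalation.
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    result = subprocess.run([sys.executable, path], capture_output=True)
    return result.returncode == 0

good = passes_check("assert 1 + 1 == 2")  # snippet passes its check
bad = passes_check("assert 1 + 1 == 3")   # snippet fails its check
```

The signal comes from execution, not from the model's self-assessment, which is exactly the distinction being drawn.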