bitwize a day ago
At my current job, much of our software stack is based on GOFAI techniques. Except no one calls them AI anymore; they call it a "rules engine". Rules engines, like LLMs, used to be sold standalone and promoted as miracle workers in their own right. We called them "expert systems" then; that term has largely faded from use. This AI summer is really a replay of the last one. In a recent story about expert systems here on Hacker News, there was even a clip of Gary Kildall on The Computer Chronicles expressing skepticism about AI that parallels modern-day AI skepticism. LLMs and CNNs will, as you describe, settle into certain applications where they'll be profoundly useful, become embedded in other software as techniques rather than as products in and of themselves... and then we won't call them AI. Winter is coming.
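For anyone who hasn't run into the term: a forward-chaining rules engine of the kind I'm describing is conceptually tiny. Here's a minimal sketch (names and the toy knowledge base are illustrative, not from any real product; production engines like CLIPS or Drools add Rete-style pattern matching, conflict resolution, retraction, etc.):

```python
# Minimal forward-chaining rules engine: repeatedly fire any rule whose
# condition set is satisfied by the known facts, adding its conclusion,
# until a full pass derives nothing new (a fixed point).

def run_rules(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for condition, conclusion in rules:
            if condition <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Toy "expert system" knowledge base (hypothetical example).
rules = [
    ({"has_fever", "has_cough"}, "suspect_flu"),
    ({"suspect_flu"}, "recommend_rest"),
]

print(run_rules({"has_fever", "has_cough"}, rules))
# Derives both "suspect_flu" and then, via chaining, "recommend_rest".
```

That fixed-point loop is the whole trick; everything else in a commercial rules engine is about making the matching fast and the rule authoring tolerable.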
wtetzner a day ago | parent
Yeah, the problem with the term "AI" is that it's far too general to be useful. I've seen people argue that the goalposts keep moving with respect to whether something is considered AI, but that's because you can argue that a lot of what computers do is artificial intelligence. Once something becomes commonplace and well understood, it's no longer useful to talk about it as AI. I don't think the term will "stick" to a given technology until AGI (or something close to it).