JKCalhoun 2 days ago
I propose that, for some software, the learning curve is becoming harder to surmount. Further, I'm suggesting that "designed by people to be understood and used by people" might be a hurdle for some future software we can envision. (Altman's performance is orthogonal, since I'm suggesting a new level of software that has not yet been written or even conceived.)
mossTechnician 21 hours ago | parent
Regarding whether AI could overcome the hurdle of human understanding: I'm not sure that's really a hurdle. Suppose, in theory, a system were crafted by AI to be interacted with exclusively by AI. Broadly, I assume the output of such a system would still be for people, and that it would have some purpose or value. So my question is: how do we verify that it functions? If it is a black box that nobody understands, then we can't verify it at all, and we can't debug it when something goes wrong. We circle back to the human-understanding issue. (I'm sorry if my tangent about Altman came across as a personal affront; I didn't mean it that way. It just muddied the two interesting topics you brought up.)