afavour 2 hours ago:
I think you're falling victim to survivorship bias there, or something like it. In 1940 I might have said "fusion power is possible" based entirely on what advanced physics knowledge I had. And I would have been correct: according to the laws of physics, it is possible. We still don't have it, though. Watching Neil Armstrong walk on the moon, I might have said "moon colonies are possible", and I'd have been right there too. And yet...
ACCount37 an hour ago | parent:
Those two things are prevented by economics more than physics. For AI in particular, the economics currently favor ongoing capability R&D. And even if they didn't favor AI R&D directly (i.e. if ChatGPT and Stable Diffusion had never happened), they would still favor making the computational inputs of AI R&D cheaper over time.

Building advanced AIs is becoming easier and cheaper. It's just that the bar for "good enough" has gone off to space, and a "good enough" from 2020 is, nowadays, profoundly unimpressive.

I'm not sure how much it takes to reach AGI. No one is sure of it. But the path there is clearly getting shorter over time. And LLMs existing, improving, and doing what they do makes me assume shorter AGI timelines, and call for a vote of no confidence in human exceptionalism.