hi_hi 3 days ago:
The article seems well researched, has some good data, and is generally interesting. But it's irrelevant to the reality of our current situation with LLMs. It falls into the trap of assuming we'll reach the science-fiction abilities of AI with the current software architectures, within a few years, as long as enough money is thrown at the problem.

All I can say for certain is that every previous financial instrument jumped on to drive economic growth has eventually crashed: the dot-com bubble, the credit instruments behind the global financial crisis, the crypto boom, the current housing markets. The investments around AI that we're all agog at are just another large-scale instrument for wealth generation. It's not about the technology, just as VR and biotech weren't about the technology. That isn't to say the technology outcomes aren't useful and amazing; they're just independent of the money.

Yes, there are trillions (a number so large I honestly can't quite comprehend it) being funnelled into AI. No, that doesn't mean we'll get incomprehensible advancements out the other end. AGI isn't happening this round, folks. Can hallucinations even be solved this round? Trillions of dollars to stop computers lying to us, and most people where I work don't even realise hallucinations are a thing.

How about a trillion dollars so Karen or John stop dismissing different viewpoints just because a chatbot says something contradictory, and actually listen? Now that would be worth a trillion dollars. Imagine a world where people could listen to others outside their bubble. Instead they're being given tools that reinforce the bubble.
DanHulton 3 days ago (parent):
Indeed, this could be AI's fusion energy era, or AI's VR era, or even AI's FTL travel era.