andy_ppp 12 hours ago

So is the translation that endless scaling has stopped being as effective?
Animats 11 hours ago

It's stopped being cost-effective. Another order of magnitude of data centers? Not happening.

The business question is: what if AI works about as well as it does now for the next decade or so? No worse, maybe a little better in spots. What does the industry look like? NVidia and TSMC are telling us that price/performance isn't improving through at least 2030. Hardware is not going to save us in the near term. Major improvement has to come from better approaches.

Sutskever: "I think stalling out will look like…it will all look very similar among all the different companies. It could be something like this. I’m not sure because I think even with stalling out, I think these companies could make a stupendous revenue. Maybe not profits because they will need to work hard to differentiate each other from themselves, but revenue definitely."

Somebody didn't get the memo that the age of free money at zero interest rates is over.

The "age of research" thing reminds me too much of mid-1980s AI at Stanford, when everybody was stuck but wasn't willing to admit it. They were hoping against hope that someone would come up with a breakthrough to make it all work before the house of cards fell apart. Except this time everything costs many orders of magnitude more to research.

It's not as if Sutskever is proposing that everybody go back to academia and quietly try to come up with a new idea to get things un-stuck. He wants to spend SSI's market cap of $32 billion on some vague ideas involving "generalization". Timescale? "5 to 20 years". That's a strange way to do corporate R&D when you're stuck. Lots of small and medium-sized projects seem more promising, along the lines of Google X. The discussion here seems to lean in the direction of one big bet.

You have to admire them for thinking big. And even if the whole thing goes bust, they probably get to keep the house and the really nice microphone holder.

jsheard 11 hours ago

The translation is that SSI says SSI's strategy is the way forward, so could investors please stop giving OpenAI money and give it to SSI instead. SSI has not shown anything yet, nor does SSI intend to show anything until they have created an actual Machine God, but SSI says they can pull it off, so it's all good to go ahead and wire the GDP of Norway directly to Ilya.

giardini 2 hours ago

I'll be convinced LLMs are a reasonable approach to AI when an LLM, trained on approximately the same books and classes I had by the time I completed my college education, can give comparably reasonable answers.
shwaj 11 hours ago

Are you asking whether the whole podcast can be boiled down to that translation, or whether you can infer it from the title? If the former, no. If the latter, sure, approximately.
Quothling 11 hours ago

Not really, but there is a finite amount of data to train models on. I found it rather interesting to hear him talk about how Gemini has been better than the competition at getting results out of the same data, and how that is the first insight into a new way of training models on the same data to get different results.

I think the title is interesting, because the scaling isn't about compute. At least as I understand it, what they're running out of is data, and one of the ways they deal with this, or may deal with it, is to have LLMs running concurrently and in competition. So you'd have thousands of models competing against each other to solve challenges through different approaches. Which, to me, suggests that the need for hardware scaling isn't about to stop.
11 hours ago

[deleted]
imiric 11 hours ago

The translation to me is: this cow has run out of milk. Now we actually need to deliver value, or the party stops.