syntaxing 6 hours ago

Tim Dettmers had an interesting take on this [1]. Fundamentally, the philosophy is different.

>China’s philosophy is different. They believe model capabilities do not matter as much as application. What matters is how you use AI.

https://timdettmers.com/2025/12/10/why-agi-will-not-happen/

woeirua 5 hours ago | parent | next [-]

Sorry, but that's an exceptionally unimpressive article. The crux of his thesis is:

>The main flaw is that this idea treats intelligence as purely abstract and not grounded in physical reality. To improve any system, you need resources. And even if a superintelligence uses these resources more effectively than humans to improve itself, it is still bound by the scaling of improvements I mentioned before — linear improvements need exponential resources. Diminishing returns can be avoided by switching to more independent problems – like adding one-off features to GPUs – but these quickly hit their own diminishing returns.

Literally everyone already knows the problems with scaling compute and data. This is not a deep insight. His assertion that we can't keep scaling GPUs is apparently not being taken seriously by _anyone_ else.
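
To spell out the scaling framing both sides here take for granted, here's a toy sketch assuming a simple log-linear scaling law (my simplification for illustration, not the article's exact model), where each extra unit of capability costs roughly 10x more compute:

    import math

    # Toy assumption: capability = a * log10(compute) + b.
    # Under this law, each +1 of capability needs ~10x more compute,
    # i.e. "linear improvements need exponential resources".

    def capability(compute_flops, a=1.0, b=0.0):
        return a * math.log10(compute_flops) + b

    def compute_needed(target, a=1.0, b=0.0):
        # Invert the toy law: required compute grows exponentially with the target.
        return 10 ** ((target - b) / a)

    for cap in range(20, 26):
        print(f"capability {cap} -> ~{compute_needed(cap):.1e} FLOPs")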

syntaxing 5 hours ago | parent | next [-]

I was mentioning the article more for its take on the economics of AI in China vs the US.

While I do understand your sentiment, it might be worth noting that the author wrote bitsandbytes, which was one of the first libraries with quantization methods built in and was(?) one of the most used inference backends. I'm pretty sure transformers from HF still uses it as the Python-to-CUDA framework.
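
For context, a minimal sketch of how bitsandbytes quantization is typically invoked through HF transformers (the model name and config values are illustrative, not from the thread):

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

    # bitsandbytes does the 4-bit weight quantization under the hood;
    # transformers just passes this config through to it.
    bnb_config = BitsAndBytesConfig(
        load_in_4bit=True,                      # quantize weights to 4-bit on load
        bnb_4bit_quant_type="nf4",              # NF4 quantization type
        bnb_4bit_compute_dtype=torch.bfloat16,  # matmuls run in bf16
    )

    model_id = "meta-llama/Llama-2-7b-hf"  # example model; any causal LM works
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        quantization_config=bnb_config,
        device_map="auto",
    )
    tokenizer = AutoTokenizer.from_pretrained(model_id)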

qprofyeh 5 hours ago | parent | prev [-]

There are startups in this space getting funded as we speak: https://olix.com/blog/compute-manifesto

re-thc 5 hours ago | parent | prev [-]

When you have export restrictions, what do you expect them to say?

> They believe model capabilities do not matter as much as application.

Tell me what their tone is once their hardware can match up.

It doesn't matter because they can't make it matter (yet).