czl 2 days ago

I get the worry. AFAIK most of the current capex is going into scalable parallel compute, memory, and networking. That stack is pretty model-agnostic, similar to how all that dot-com fiber was not tied to one protocol. If transformers stall, the hardware is still useful for whatever comes next.

On reasoning, I see LLMs and classic algorithms as complements. LLMs do robust manifold following and associative inference. Traditional programs do brittle rule following with guarantees. The promising path looks like a synthesis where models use tools, call code, and drive search and planning methods such as MCTS, the way AlphaGo did. Think agentic systems that can read, write, execute, and verify.
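To make that concrete, here is a rough sketch of that read, write, execute, verify loop in Python. Every name in it (propose_step, execute, solve) is made up for illustration, and the model call is stubbed out because the backend is whatever you plug in.

    # Toy read/write/execute/verify loop; all names are illustrative.
    import subprocess, tempfile

    def propose_step(task, history):
        # Placeholder for an LLM call that returns a Python snippet for the task.
        raise NotImplementedError("plug in a model backend here")

    def execute(code):
        # Run the proposed code in a real interpreter and capture the result.
        with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
            f.write(code)
            path = f.name
        result = subprocess.run(["python", path], capture_output=True, text=True, timeout=30)
        return result.returncode == 0, result.stdout + result.stderr

    def solve(task, max_steps=5):
        history = []
        for _ in range(max_steps):
            code = propose_step(task, history)   # model writes
            ok, output = execute(code)           # real interpreter executes
            history.append(output)               # feedback for the next attempt
            if ok:                               # crude verify step: exit status only;
                return output                    # a real agent would check the answer too
        return None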

LLMs are strongest where the problem is language. Language co-evolved with cognition as a way to model the world, not just to chat. We already use languages to describe circuits, specify algorithms, and even generate other languages. That makes LLMs very handy for specification, coordination, and explanation.

LLMs can also statistically simulate algorithms, which is useful when you want them to reason about those algorithms. But when you actually need the algorithm, it is far more efficient to run the real thing in software or on purpose-built hardware. Let the model write the code, compose the tools, and verify the output, rather than have it pretend to be a CPU.
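As a trivial illustration of that routing: the model emits a structured tool call and the host runs the real function, instead of the model approximating the arithmetic token by token. The tool registry and the JSON shape are just assumptions for the sketch, not any particular API.

    import json, math

    # Hypothetical registry of real implementations the model can call.
    TOOLS = {"sqrt": math.sqrt, "sort": sorted}

    def dispatch(tool_call):
        # Execute a model-emitted call like {"tool": "sqrt", "args": [2]}.
        call = json.loads(tool_call)
        return TOOLS[call["tool"]](*call["args"])

    print(dispatch('{"tool": "sqrt", "args": [2]}'))           # 1.4142135623730951
    print(dispatch('{"tool": "sort", "args": [[3, 1, 2]]}'))   # [1, 2, 3]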

To me the risk is not that LLMs are a dead end, but that people who do not understand them have unreasonable expectations. Real progress looks like building systems that use language to invent and implement better tools and route work to the right place. If a paper lands tomorrow showing that pure next-token prediction is not enough for formal reasoning, that would expose a misunderstanding of what LLMs are for, not mark a stop sign. We already saw something similar when Minsky and Papert showed that single-layer perceptrons cannot represent XOR, and the field later moved past that limit with multilayer networks. Hopefully we remember that and learn the right lesson this time.
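For anyone who has not seen the XOR point spelled out: no single linear threshold unit separates XOR, but two hand-wired hidden units already do. The weights below are one classic construction, nothing learned.

    def step(x):
        return 1 if x > 0 else 0

    def xor_mlp(a, b):
        h1 = step(a + b - 0.5)      # OR-like hidden unit
        h2 = step(a + b - 1.5)      # AND-like hidden unit
        return step(h1 - h2 - 0.5)  # fires for OR-but-not-AND, i.e. XOR

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", xor_mlp(a, b))   # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0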