threethirtytwo 18 hours ago

[flagged]
undeveloper 18 hours ago | parent | prev | next [-]

come out of the irony layer for a second -- what do you believe about LLMs?

jorvi 16 hours ago | parent | prev | next [-]

I mean.. LLMs hit a pretty hard wall a while ago, and the only way past it has been throwing monstrous compute at eking out the remaining few percent of improvement (real-world, not benchmarks). That's not to mention hallucinations / false paths being a foundational problem.

LLMs will continue to get slightly better over the next few years, but mainly a lot more efficient. Which will also mean better and better local models. And grounding might improve, but that just means fewer wrong answers, not better right answers.

So there's no need for doomerism. The people saying LLMs are a few years away from eating the world are either in on the con or unaware.

7777332215 18 hours ago | parent | prev | next [-]

If all of it is going away and you should deny reality, what does everything else you wrote even mean?

habinero 18 hours ago | parent | prev [-]

Yes, it is simply impossible that anyone could look at things, do their own evaluations, and come to a different, much more skeptical conclusion.

The only possible explanation is that people say things they don't believe out of FUD. Literally the only one.