▲ pllbnk 2 hours ago
I think it’s a mistake to assume we will blindly keep going in this direction for years and then suddenly, collectively wake up and realize what we have done. It’s a great filter and a great opportunity. If LLMs stop improving at the pace of the last few years (I believe they are already slowing down), they will still manage to crank out billions of lines of code which they themselves won’t be able to grep and reason through, leading to a drop in quality and lost revenue for the companies that choose to go all-in on LLMs.

But let’s be realistic: modern LLMs are a great and useful tool when used properly, so they are here to stay. Our job will be to keep them on track and to limit the damage from hallucinations. As a result, the software industry will move away from large, complex, interconnected systems that have millions of features but only a few in active use, toward small, high-quality, targeted tools, because their work will be easier to verify and their side effects easier to control.
▲ lelanthran 2 hours ago | parent | next [-]
> If LLMs stop improving at the pace of the last few years (I believe they already are slowing down)

Depending on how you measure "improvement", they already have, or they never will :-/

Measuring capability of the model as a ratio of context length, you hit the limits at around 300k-400k tokens of context; after that you get diminishing returns. We have already passed this point. Measuring capability purely by output, smarter harnesses may unlock further improvements in the future; basically a twist on the "Sufficiently Smart Compiler" (https://wiki.c2.com/?SufficientlySmartCompiler).

Those are the two extremes, but there's more on the spectrum in between.
▲ leptons 2 hours ago | parent | prev [-]
I wish I got to hallucinate at work, and just get a pat on the head for constantly doing the wrong thing.