PaulHoule 7 days ago:
My guess is there's a cost-capability tradeoff such that the O(N^2) really is buying you something you couldn't get for O(N). Behind that, there really are intelligent-systems problems that boil down to solving SAT and should be NP-complete... LLMs may be able to short-circuit those problems and get lucky guesses quite frequently, so maybe the 'hallucinations' won't go away for anything that runs in O(N^2).
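A toy illustration of the point about lucky guesses (this is my own hypothetical sketch, not anything from the comment; the instance, the `solve_exact` and `guess_once` names, and the clause encoding are all made up): exhaustively solving a SAT instance is exponential in the number of variables, while a single random guess is linear in the formula size and often happens to satisfy easy instances, but it can silently fail.

```python
import itertools
import random

# Toy 3-SAT instance (hypothetical). A clause is a list of literals;
# the literal (i, True) means variable i, (i, False) means its negation.
clauses = [
    [(0, True), (1, False), (2, True)],
    [(0, False), (1, True), (3, True)],
    [(2, False), (3, False), (1, True)],
]
num_vars = 4

def satisfies(assignment, clauses):
    """True if every clause has at least one literal matching the assignment."""
    return all(
        any(assignment[var] == polarity for var, polarity in clause)
        for clause in clauses
    )

def solve_exact(clauses, num_vars):
    """Exhaustive search: guaranteed correct, but 2^N assignments in the worst case."""
    for bits in itertools.product([False, True], repeat=num_vars):
        if satisfies(bits, clauses):
            return bits
    return None

def guess_once(clauses, num_vars):
    """One random assignment, linear in the formula size.

    Often works on easy instances but can silently fail -- the analogue of a
    model producing a plausible answer without actually doing the search.
    """
    bits = tuple(random.choice([False, True]) for _ in range(num_vars))
    return bits if satisfies(bits, clauses) else None

print("exact:", solve_exact(clauses, num_vars))
print("guess:", guess_once(clauses, num_vars))
```

The gap between those two functions is the comment's argument in miniature: a cheap pass can frequently land on a satisfying answer, but no polynomial-time procedure is known that gets it right every time on NP-complete inputs.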