naasking 5 days ago
> Powerful pattern matching is still just pattern matching.

Higher-order pattern matching is Turing complete. Transformers are Turing complete. Memory-augmented LLMs are Turing complete. Neural networks can learn to reproduce any function. These have all been proven. So if computers can be intelligent and can solve novel problems in principle, then LLMs can too, given the right training. If you don't think computers can be intelligent, you have a much higher burden to meet.

> Human level reasoning includes ability to learn, so that people can solve novel problems, overcome failures by trial and error, exploration, etc.

You keep bringing this up as if it's lacking, but basically all existing LLM interfaces provide facilities for memory to store state. Storing progress just isn't an issue if the LLM has the right training. HN has had some recent articles about Claude Code being given the task of porting GitHub repos to other programming languages; people woke up the next morning and it had done it autonomously, using issue tracking, progress reports, PRs, the whole nine yards. This is frankly not the hard part IMO.
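As a rough illustration of the "memory to store state" point, here is a minimal agent-loop sketch. The `ask_llm` helper and the `progress.json` state file are hypothetical stand-ins, not any particular vendor's API; the only point is that progress persists outside the model between calls:

```python
# Sketch of an agent loop that stores its progress to disk between LLM calls,
# so work can resume after a crash or restart.
import json
from pathlib import Path

STATE_FILE = Path("progress.json")  # hypothetical state file

def ask_llm(prompt: str) -> str:
    """Placeholder: swap in a real completion API; here it just echoes."""
    return f"(model reply to: {prompt[:40]}...)"

def load_state() -> dict:
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())
    # Hypothetical starting task list for illustration.
    return {"done": [], "todo": ["port module A", "port module B"]}

def save_state(state: dict) -> None:
    STATE_FILE.write_text(json.dumps(state, indent=2))

def run() -> None:
    state = load_state()
    while state["todo"]:
        task = state["todo"][0]
        reply = ask_llm(
            f"Completed so far: {state['done']}\n"
            f"Next task: {task}\n"
            "Do the task and summarize what you changed."
        )
        # Record progress so the next run picks up where this one left off.
        state["done"].append({"task": task, "summary": reply})
        state["todo"].pop(0)
        save_state(state)

if __name__ == "__main__":
    run()
```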
HarHarVeryFunny 5 days ago | parent
Being Turing complete means that the system in question can emulate a Turing machine, which you could then program to do anything, since it's a universal computer. So sure, if you knew how to code up an AGI to run on a Turing machine, you'd be good to go on any Turing machine!

I'm not sure why you'd want to run a Turing machine emulator on an LLM when you could just write a massively faster one to run on the computer your LLM is running on, cutting out the middleman, but whatever floats your boat I suppose. Heck, if you really like emulation and super slow speeds, how about implementing Conway's Game of Life on your LLM Turing machine emulator, and since Life is also Turing complete, you could run another Turing machine emulator on that (it's been done), and finally run your AGI on top of that! Woo hoo!

I do think you'll have a challenge prompting your LLM to emulate a Turing machine (they are really not very good at that sort of thing), especially since the prompt/context will also have to do double duty as the Turing machine's (infinite length) tape, but no doubt you'll figure it out. Keep us posted. I'll be excited to see your AGI program when you write that bit.
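For concreteness, a minimal sketch of what "emulating a Turing machine on an LLM" would even look like, with a hypothetical `ask_llm` helper standing in for the model acting as the transition function. The whole tape gets re-serialized into every prompt, which is exactly where a finite context window collides with the unbounded-tape requirement:

```python
# Sketch: a Turing machine loop where the LLM plays the transition function.
# `ask_llm` is a hypothetical stand-in; a real setup would prompt the model to
# answer strictly in "<state> <symbol> <L|R>" form.
def ask_llm(prompt: str) -> str:
    """Placeholder: this canned reply halts immediately for demo purposes."""
    return "halt _ R"

def step(state: str, tape: list[str], head: int) -> tuple[str, list[str], int]:
    prompt = (
        "You are emulating a Turing machine.\n"
        f"State: {state}\nHead position: {head}\nTape: {''.join(tape)}\n"
        "Reply with the next state, the symbol to write, and L or R."
    )
    reply = ask_llm(prompt)            # entire tape re-sent every single step
    new_state, symbol, move = reply.split()
    tape[head] = symbol
    head += 1 if move == "R" else -1
    if head == len(tape):              # grow the "infinite" tape on demand
        tape.append("_")
    return new_state, tape, max(head, 0)  # simplification: clamp at left edge

def run(max_steps: int = 1000) -> str:
    state, tape, head = "q0", list("_1011_"), 1
    for _ in range(max_steps):
        if state == "halt":
            break
        state, tape, head = step(state, tape, head)
    return "".join(tape)

if __name__ == "__main__":
    print(run())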