vidarh 7 hours ago
Context effectively provides an IO port, so all the loop needs to do is simulate the tape head and provide a single token of state. You can refuse to be convinced that Turing completeness is relevant all you want, but we don't know of any more expansive category of computable functions, and given that an LLM in the setup described is Turing complete, the fact that they aren't typically deployed that way is irrelevant. They trivially can be, and that is enough to make the shallow dismissal that they're "just" predicting the next token meaningless.
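To make the setup concrete, here is a minimal sketch of that driver loop. A plain lookup table stands in for the model's next-token prediction (an assumption for illustration; the real setup would call an LLM at that step), but the loop itself is the same either way: it only tracks a head position and one token of state, exactly as described above.

```python
# Outer loop simulating the tape head. The "model" is reduced to a
# transition function mapping (state, symbol) -> (state, symbol, move);
# here a dict plays that role in place of a next-token predictor.

def run(transitions, tape, state="start", head=0, max_steps=1000):
    tape = dict(enumerate(tape))          # sparse tape; blanks implicit
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")      # "_" is the blank symbol
        state, written, move = transitions[(state, symbol)]
        tape[head] = written              # write back to the tape
        head += 1 if move == "R" else -1  # move the head one cell
    return "".join(tape[i] for i in sorted(tape))

# Example machine: unary increment (append a "1" to a run of 1s).
inc = {
    ("start", "1"): ("start", "1", "R"),  # scan right over the 1s
    ("start", "_"): ("halt",  "1", "R"),  # write one more 1 and halt
}
print(run(inc, "111"))  # -> 1111
```

The point is that the loop contributes no computation of its own; everything interesting happens in the transition function, which is why swapping the dict for a model that predicts the next (state, symbol, move) token is enough for the combined system to be Turing complete.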