ekidd 10 hours ago
> I trust none of us would presume that the decentralized labor of pen & paper calculations somehow instantiated a "psychology" in the sense of a mind experiencing various levels of despair

Your argument is based on an appeal to intuition, but the scenario you ask people to imagine is profoundly misleading in scale. Assume a modern frontier model of around 1 trillion parameters, and assume the math is being done by an immortal monk who can perform one weight's calculation per second. The monk will generate the first "token", about 4 characters, in 31,688 years. In a bit over 900,000 years, the immortal monk will have generated a single Tweet.

At that point, I no longer have any intuition. The sort of math I could do by hand in a human lifetime could never "experience" anything. But I can't rule out the possibility that 900,000 years of math might become a glacial mind, expressing a brief thought across a span far greater than the time the human species has existed. As the saying goes, sometimes quantity has a quality all its own.

(This is essentially the "systems reply" to Searle's "Chinese room" argument. It's an old discussion.)
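The monk arithmetic above is easy to check. A minimal sketch, using the comment's own assumptions (1 trillion weights, one weight's calculation per second, one full pass over all weights per token, and ~4 characters per token):

```python
# Back-of-envelope check of the immortal-monk numbers.
# Assumptions come from the comment above; none of this is a real
# measurement of any particular model.

PARAMS = 1_000_000_000_000            # 1 trillion weights
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~31.6 million seconds

# One token = one pass over all weights at one calculation per second.
years_per_token = PARAMS / SECONDS_PER_YEAR
print(f"Years per token: {years_per_token:,.0f}")  # ~31,688

# How many tokens does "a bit over 900,000 years" buy?
tokens_in_900k_years = 900_000 / years_per_token
print(f"Tokens in 900,000 years: {tokens_in_900k_years:.1f}")  # ~28
```

At roughly 4 characters per token, ~28 tokens is on the order of a short Tweet, which is consistent with the comment's ballpark figure.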
buu700 9 hours ago
I don't personally believe LLMs are sentient, but I've always enjoyed this thought experiment: https://xkcd.com/505. I have a signed copy framed on my wall.