dgfitz | 4 days ago
> Consider this thought exercise: if we were to ever do an upload of a human mind, and it was executing on silicon, would they not be experiencing feelings because their thoughts are provably a deterministic calculation?

You just said "consider this impossibility" as if there were any possibility of it happening. You might as well have said "consider traveling faster than the speed of light," which, sure, is fun to think about.

We don't even know how most of the human brain works. We throw pills at people to change their mental state in hopes that they become "less X" or "more Y," with a whole list of caveats like "if taking the pill to reduce X makes you _more_ X, stop taking it," because we have no idea what we're doing. Pretending we can use statistical models to build something capable of truly unique thought… stop drinking the kool-aid.

Stop making LLMs something they're not. Appreciate them for what they are: a neat tool. A really neat tool, even.

This is not a valid thought experiment. Your entire point hinges on "I don't believe in souls," which is fine, no problem there, but it does not a valid point make.