estearum 2 hours ago

> If something claims to be conscious and we can't prove it's not, we have no choice but to believe it.

This is not a good test. A dog won't claim to be conscious but clearly is, despite you not being able to prove it one way or the other. GPT-3 will claim to be conscious and (probably) isn't, despite you not being able to prove it one way or the other.
dullcrisp an hour ago | parent

An LLM will claim whatever you tell it to claim. (In fact, this Hacker News comment is also conscious.) A dog won't even claim to be a good boy.