Chabsff 3 days ago

Ok, but how do you go about measuring whether a black-box is doing that or not?

We don't apply that criterion when evaluating animal intelligence. We sort of take it for granted that humans at large do that, but not via any test that would satisfy an alien.

Why should we be imposing white-box constraints to machine intelligence when we can't do so for any other?

deadbabe 3 days ago

There is truly no such thing as a “black box” when it comes to software, there is only a limit to how much patience a human will have in understanding the entire system in all its massive complexity. It’s not like an organic brain.

Chabsff 3 days ago

The black box I'm referring to is us.

You can't have it both ways. If your test for whether something is intelligent/thinking or not isn't applicable to any known form of intelligence, then what you are testing for is not intelligence/thinking.

holmesworcester 3 days ago

You wouldn't say this about a message encrypted with AES, though, since there's not just a "human patience" limit but also, we're pretty sure, an unbearable computational cost.
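A back-of-the-envelope sketch of that computational cost (the attacker speed is an assumption picked for illustration, roughly an exascale machine testing one key per operation):

```python
# Rough expected time to brute-force a 256-bit AES key.
key_space = 2 ** 256                  # number of possible 256-bit keys
guesses_per_second = 10 ** 18         # assumed exascale attacker (illustrative)
seconds_per_year = 60 * 60 * 24 * 365

# On average you find the key after searching half the space.
years = key_space / (2 * guesses_per_second * seconds_per_year)
print(f"~{years:.1e} years")          # on the order of 10^51 years
```

Even with wildly optimistic hardware assumptions, the answer stays astronomically far beyond any "patience" limit, which is the sense in which the box is computationally, not just practically, opaque.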

We don't know yet, but it's entirely plausible that the cost of analyzing LLMs in their current form, to the point of removing all doubt about how and what they are "thinking," is also unbearably high.

We also might find that it's possible for us (or for an LLM training process itself) to encrypt LLM weights in such a way that the only way to know anything about what it knows is to ask it.

mstipetic 3 days ago

Just because it runs on a computer doesn't mean it's "software" in the common meaning of the word.