ivan_gammel 7 days ago

Depends on the definition of “cross-domain”. Can any of the current models be plugged into an emulator of a human with equivalent senses (vision, hearing, etc.) and process those inputs in real time with the same speed of reaction, i.e. emulate human or animal intelligence in interactions with the environment? That would be truly cross-domain.

brookst 6 days ago | parent | next [-]

Sensor and motor handling are important but not necessarily equal to general intelligence. A person suffering from locked-in syndrome is still intelligent; an anemone responding to stimulus with movement is not.

ivan_gammel 6 days ago | parent [-]

I'm not saying it's equal, just pointing out that it's a good indicator of the processing capacity necessary to match human intelligence. We are aware of the environment and learn continuously from interactions with it. It should be expected that AGI can do the same.

Your case of locked-in syndrome is interesting. Can a human brain develop intelligence in the complete absence of any senses? I doubt it.

lostmsu 7 days ago | parent | prev [-]

I defined cross-domain with an example. ChatGPT is not trained to practice chemistry and law, yet it can do both. It is cross-domain.

You can make it stronger at being cross-domain, but it satisfies the minimum requirement.

ivan_gammel 7 days ago | parent | next [-]

It cannot. It doesn't reason. Gambling and winning (1-10^N)*100% of the time is not the same as reasoning and providing accurate answers the same fraction of the time. If you reason about something, your errors fall into certain categories of fallacies, often related to incomplete information. LLM hallucinations are easy to spot with reasoning; they are statistical in nature.

lostmsu 7 days ago | parent [-]

> (1-10^N)*100% times

RugnirViking 7 days ago | parent | prev [-]

It is trained to practice chemistry and law. The reason it can do those things is that it's trained on an appreciable portion of all human output in both of those fields. If that's not training on them, I don't know what is.

lostmsu 7 days ago | parent [-]

> It is trained to practice chemistry and law

No, it is not. It is trained to predict the next token, and it is trained to follow user instructions.
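For readers unfamiliar with what "trained to predict the next token" means mechanically, here is a toy sketch: a bigram model fit by counting, which illustrates the objective (predict the most likely continuation given what came before). This is only an illustrative stand-in; actual LLMs learn this objective with gradient descent on a neural network over huge corpora, not by counting.

```python
from collections import Counter, defaultdict

# Toy "next-token predictor": estimate P(next | current) by counting bigrams.
corpus = "the cat sat on the mat the cat ran".split()

counts = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    counts[cur][nxt] += 1

def predict_next(token):
    """Return the most frequently observed token following `token`."""
    if token not in counts:
        return None
    return counts[token].most_common(1)[0][0]

print(predict_next("the"))  # "cat" ("cat" follows "the" twice, "mat" once)
```

The point of contention in the thread maps onto this directly: nothing in the objective mentions chemistry or law; any such competence emerges from what the corpus happens to contain.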

RugnirViking 6 days ago | parent [-]

I mean, in a very literal sense, it is. Chemistry and law textbooks are in the training corpus. It is trained on them.

lostmsu 5 days ago | parent [-]

This sounds backwards to me. The fact that I can show you a law book and you will be able to pass a law exam is a consequence of you being "generally intelligent". If I show these books to something less intelligent like a cow, or less general, like a CNN, they won't be able to do that.

RugnirViking 5 days ago | parent [-]

No, you're not understanding me. I'm not making any claim about whether they're any good at those subjects, merely that they are, objectively, trained on them.

lostmsu 5 days ago | parent [-]

What is the relevance of this claim on its own, if training on something alone is not sufficient for them to practice law, and you actually have to bring cross-domain ability into the picture for it to work, making it "general"?

I still disagree, though: they are not trained to practice law. They are trained to remember law, but the practice comes from general training to follow instructions.