sublinear 3 days ago

Way out of touch.

AGI is poorly defined and thus is a science "problem", and a very low priority one at that.

No amount of engineering or model training is going to get us AGI until someone defines what properties are required and then researches how to achieve them within our existing theories of computation, which every computer being manufactured today is built upon.

karmakaze 3 days ago | parent | next

This is the funniest "I'm a hammer thus AGI is a nail" post I've ever read.

sublinear 3 days ago | parent

Maybe I'm misunderstanding what you mean by that, but do you have any examples of software engineering that weren't already thoroughly explained by computer science long before?

karmakaze 3 days ago | parent

By that I meant the original post, in agreement with you.

Loughla 3 days ago | parent | prev

It strikes me that until we fully understand human consciousness, we don't stand a chance of reaching AGI.

Am I incorrect?

sublinear 3 days ago | parent | next

I think we can relax that a bit. We "just" need to understand some definition of cognition that satisfies our computational needs.

Natural language processing is definitely a huge step in that direction, but that's kinda all we've got for now with LLMs and they're still not that great.

Is there some lower level idea beneath linguistics from which natural language processing could emerge? Maybe. Would that lower level idea also produce some or all of the missing components that we need for "cognition"? Also a maybe.

What I can say for sure, though, is that all our hardware operates on this linguistic understanding of what computation is: machine code is strings of symbols. Is that good enough? We don't know. That's where we're at today.
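To make that concrete, here's a toy illustration (Python bytecode rather than real machine code, but the point is the same): the executable form of a program is literally a string of symbols you can print and inspect.

    import dis

    def add(a, b):
        return a + b

    # The compiled body of the function is a plain byte string...
    print(add.__code__.co_code)

    # ...and the disassembler just renders those same symbols mnemonically.
    dis.dis(add)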

kelnos 3 days ago | parent | prev | next

Unclear. You might be right, but I think it's also possible that you're wrong.

It's possible to stumble upon a solution to something without fully understanding the problem. I think this happens fairly often, really, in a lot of different problem domains.

I'm not sure we need to fully understand human consciousness in order to build an AGI, assuming it's possible to do so. But I do think we need to define what "general intelligence" is, and having a better understanding of what in our brains makes us generally intelligent will certainly help us move forward.

root_axis 3 days ago | parent | prev | next

That doesn't seem like a useful assumption, since consciousness doesn't have a functional definition (even though it might have a functional purpose in humans).

mdp2021 3 days ago | parent | prev

Intelligence (solving problems) does not require consciousness.

Loughla 3 days ago | parent

Please elaborate.

kelnos 3 days ago | parent | next

I'm not sure I'd state it as absolutely as the GP did, but there is little reason to believe that consciousness and intelligence need to go hand in hand.

On top of that, we don't really have good, strong definitions of "consciousness" or "general intelligence". We don't know what causes either to emerge from a complex system. We don't know if one is required to have the other (and in which direction), or if you can have an unintelligent consciousness or an unconscious intelligence.

sowbug 3 days ago | parent | prev | next

https://en.wikipedia.org/wiki/Chinese_room

mdp2021 2 days ago | parent | prev

You do not need to implement consciousness in a calculator. There exist forms of intelligence that are just sophisticated calculation, with no need for consciousness.
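As a minimal sketch of what I mean (a toy of my own, not a claim about minds): a blind breadth-first search solves a problem, finding a shortest path through a grid, with nothing anyone would call consciousness. It's just bookkeeping.

    from collections import deque

    def solve(grid, start, goal):
        """Shortest path of (row, col) cells through open (0) cells, or None."""
        queue, seen = deque([[start]]), {start}
        while queue:
            path = queue.popleft()
            r, c = path[-1]
            if (r, c) == goal:
                return path
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                        and grid[nr][nc] == 0 and (nr, nc) not in seen):
                    seen.add((nr, nc))
                    queue.append(path + [(nr, nc)])
        return None

    maze = [[0, 0, 1],
            [1, 0, 1],
            [1, 0, 0]]
    print(solve(maze, (0, 0), (2, 2)))  # [(0, 0), (0, 1), (1, 1), (2, 1), (2, 2)]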