afavour 4 hours ago

Sure, when you get rid of the timelines and the methods we'll use to get there, everyone agrees on everything. But at that point it means nothing. Yeah, AGI is possible (say the people who earn a salary based on that being true). Curing all known diseases is possible too. How will we do that? Oh, I don't know. But it's a thing that could possibly happen at some point. Give me some investment cash to do it.

If you claim "AGI is possible" without knowing how we'll actually get there you're just writing science fiction. Which is fine, but I'd really rather we don't bet the economy on it.

ACCount37 3 hours ago | parent | next [-]

I could claim "nuclear weapons are possible" in 1940 without having a concrete plan for how to get there. Just "we'd need a lot of U235 and we need to set it off", with no roadmap: no "how much uranium to get", "how to actually get it", or "how to get the reaction going". Based entirely on what advanced physics knowledge I could have had back then, without future knowledge or access to cutting edge classified research.

Would not having a complete, foolproof, step-by-step plan for obtaining a nuclear bomb somehow have made me wrong then?

The so-called "plan" is simply "fund the R&D, and one of the R&D teams will eventually figure it out, and if not, then, at least some of the resources we poured into it would be reusable elsewhere". Because LLMs are already quite useful - and there's no pathway to getting or utilizing AGI that doesn't involve a lot of compute to throw at the problem.

afavour an hour ago | parent | next [-]

I think you're falling victim to survivorship bias there, or something like it.

In 1940 I might have said "fusion power is possible" based entirely on what advanced physics knowledge I had. And I would have been correct: according to the laws of physics it is possible. We still don't have it, though. Watching Neil Armstrong walk on the moon, I might have said "moon colonies are possible", and I'd have been right there too. And yet...

AntiDyatlov 3 hours ago | parent | prev [-]

In the case of nuclear weapons, we had a theory that said they were possible. We don't have a theory that says AGI or ASI is possible. It's a big difference.

adrianN 4 hours ago | parent | prev [-]

There are plenty of people who argue that you need nontechnological pixie dust for intelligence.

ACCount37 4 hours ago | parent | next [-]

Yes, quite unfortunately. That reeks to me of wishful thinking.

Maybe that was a sensible thing to think in 1926, when the closest thing we had to "an artificial replica of human intelligence" was the automatic telephone exchange or the mechanical adding machine. But knowledge and technology have both advanced since.

Now, in 2026, the list of "things that humans can do but machines can't" has grown quite thin. "The human brain is doing something truly magical" is quite hard to justify on technical merits; it's the emotional appeal that makes the idea linger.

dirkc 2 hours ago | parent | prev [-]

There are also people who think there might be emergent behavior at play that would require extremely high fidelity simulation to achieve.

Also, the real thing (intelligence) as it currently operates isn't that well understood.