pixl97 13 hours ago

Skeptics always like to toss in 'if ever' as some form of enlightenment, as if they are aware of some fundamental limitation of the universe only they are privy to.

falseprofit 12 hours ago | parent | next [-]

Let’s say there are three options: {soon, later, not at all}. Ruling out only one to arrive at {later, not at all} implies less knowledge than ruling out two and asserting {later}.

Awareness of a fundamental limitation would eliminate possibilities to just {not at all}, and the phrasing would be “never”, rather than “not soon, if ever”.

pixl97 9 hours ago | parent [-]

But we know that no fundamental limitation on intelligence exists: nature has already produced it, in animals and eventually humans, via a random walk. So 'AI will never exist' is lazy magical thinking. That intelligence can be self-reinforcing is a good reason why AI will exist much sooner rather than later.

falseprofit 6 hours ago | parent [-]

I actually agree with your premise of ruling out “not at all”. I was just responding to your characterization of “if ever”.

I’m not quite as certain as you are, though. Just because a technology is possible does not mean it is inevitable.

mzajc 13 hours ago | parent | prev | next [-]

Of the universe, perhaps, but humans certainly are a limiting factor here. Assuming we get this technology someday, why would one buy your software when the mere description of its functionality allows one to recreate it effortlessly?

pixl97 9 hours ago | parent [-]

>humans certainly are a limiting factor here.

Completely disagree. Intelligence is self-reinforcing: the smarter we get as humans, the more likely we are to create new sources of intelligence.

madeofpalk 9 hours ago | parent | prev [-]

Theorising something will exist before the heat death of the universe isn’t really interesting.