tmn 2 days ago

There’s a significant difference between predicting what it will specifically look like, and predicting sets of possibilities it won’t look like

kragen 2 days ago

No, there isn't. When speaking of logically consistent possibilities, the two problems are precisely isomorphic under Boolean negation.
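A minimal sketch of that isomorphism, assuming a "prediction" amounts to asserting that the future outcome lands in some set S of possibilities (the set S and outcome ω are illustrative, not from the thread):

    % Let \Omega be the space of possible futures, S \subseteq \Omega.
    % "It will look like S":      \omega \in S
    % "It won't look like S^c":   \omega \notin \Omega \setminus S
    \omega \in S \iff \neg\,(\omega \in \Omega \setminus S)
    % Complementation/negation is an involution, hence a bijection
    % between the two kinds of prediction.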

bryanrasmussen 2 days ago

Good point. Someone recently said:

> Five years from now AI might still break down at even a small bit of complexity, or it might be installing air conditioners, or it might be colonizing Mercury and putting humans in zoos.

Do all of these seem like logically consistent possibilities to you?

kragen 2 days ago

Yes, obviously. You presumably don't know what "consistent" means in logic, and your untutored intuition is misleading you into guessing that possibilities like those could conceivably be inconsistent.

https://en.m.wikipedia.org/wiki/Consistency
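For reference, the standard definition, which says nothing about probability: a set of statements Γ is consistent exactly when no contradiction is derivable from it,

    \Gamma \text{ is consistent} \iff \Gamma \nvdash \bot

so "AI is colonizing Mercury five years from now" is consistent simply because it entails no contradiction; how likely it is is a separate question.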

bryanrasmussen 2 days ago

Or maybe I just wanted to make sure that you were adamant that those three possibilities were equally probable. To reiterate:

> AI might still break down at even a small bit of complexity, or it might be installing air conditioners, or it might be colonizing Mercury and putting humans in zoos.

Does each of these things, being logically consistent, have an equal chance of being the case five years from now?

kragen 2 days ago

No. Fuck off. There's no uniform probability distribution over the reals, so stop trying to put bullshit in my mouth.
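That non-existence claim is standard measure theory: a uniform density on the whole real line would have to be some constant c, and no constant integrates to 1,

    \int_{-\infty}^{\infty} c \, dx =
      \begin{cases}
        0      & \text{if } c = 0 \\
        \infty & \text{if } c > 0
      \end{cases}

(Over a finite set of outcomes a uniform distribution does exist; the point is that nothing about mere consistency forces you to use it.)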

bryanrasmussen 2 days ago

OK, well, you obviously seem to be having a bad time about something in your life right now, so I won't continue, other than to note that the comment that started this said:

>There’s a significant difference between predicting what it will specifically look like, and predicting sets of possibilities it won’t look like

which I took to mean that there are probability distributions over what things will happen. It seemed to be your assertion that there weren't: that a number of things, only one of which seemed especially probable, were all equally probable. I'm glad to learn you don't think this, as it seems totally crazy, especially for someone praising LLMs, which after all spend their time making millions of little choices based on probability.
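A minimal sketch of that last point, assuming the usual softmax-and-sample loop; the numbers and names here are illustrative, not any particular model's implementation:

    import math
    import random

    def softmax(logits, temperature=1.0):
        # Turn raw scores into a probability distribution.
        # Lower temperature sharpens it, higher flattens it.
        scaled = [l / temperature for l in logits]
        m = max(scaled)  # subtract the max for numerical stability
        exps = [math.exp(s - m) for s in scaled]
        total = sum(exps)
        return [e / total for e in exps]

    def sample_next_token(logits, temperature=1.0):
        # One "little choice": pick a token index in proportion to its
        # probability, not uniformly over everything that is merely possible.
        probs = softmax(logits, temperature)
        return random.choices(range(len(probs)), weights=probs, k=1)[0]

    # Hypothetical logits for three candidate continuations: all three are
    # possible, but they are nowhere near equally probable.
    logits = [4.0, 1.0, 0.5]
    print(softmax(logits))  # ~[0.93, 0.05, 0.03]: consistent != equiprobable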