calf a day ago

Well, previously you said that it (presumably "it" broadly refers to spatial reasoning AI) is a "high dimensional complex type cutting problem".

You said this is obvious once explained. I don't see it as obvious; rather, I see it as begging the question: the research program you were secretly involved in wanted to parallelize the engineering of it, so of course it needed some fancy "cutting algorithm" to make that possible.

The problem is that this conflates an engineering convenience with a scientific statement of what "spatial reasoning" is. There's no obvious explanation of why spatial reasoning should intuitively be some kind of cutting problem, however you wish to define or generalize "cutting problem." That's not how good CS research is done or explained.

In fact I could (mimicking your broad assertions) go so far as to claim that the project was doomed to fail because they weren't really trying to understand something; they wanted to build something without making understanding it the priority. So they were constrained by the parallel technology they had at the time, and when the available computational power didn't pan out, they reached a natural dead end.