parliament32 21 hours ago

Why would a text generator ever be conscious? Was this really worth writing a paper about?

ikekkdcjkfke 19 hours ago | parent | next [-]

Animals are also next-token/action generators, and we also think by simulating a string of events. Maybe humans are just better at grouping these events into more powerful network activations to retrieve better results.

RaftPeople 19 hours ago | parent | prev [-]

> Animals are also next token/action generators

But for humans, the concept/thought/idea/action is formed first, and then a sequence of tokens is generated to communicate it.

blueplanet200 18 hours ago | parent [-]

And a lot of GPU cycles happen before next-token prediction; what's your point?
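To make the point concrete: in an autoregressive decoder, the model builds an internal representation of the whole context before any token is chosen, so "next-token prediction" is only the last step of the computation. A minimal toy sketch (hypothetical weights and vocabulary, not any real model's architecture):

```python
import math
import random

random.seed(0)
VOCAB = ["the", "cat", "sat", "on", "mat"]
HIDDEN = 8

# Random toy weights standing in for a trained network.
W_in = [[random.uniform(-1, 1) for _ in range(HIDDEN)] for _ in VOCAB]
W_out = [[random.uniform(-1, 1) for _ in VOCAB] for _ in range(HIDDEN)]

def next_token(context):
    # 1. First, fold the entire context into an internal hidden state --
    #    this is where the bulk of the compute ("GPU cycles") goes.
    h = [0.0] * HIDDEN
    for tok in context:
        row = W_in[VOCAB.index(tok)]
        h = [math.tanh(a + b) for a, b in zip(h, row)]
    # 2. Only then project that representation to token logits and
    #    pick the next token.
    logits = [sum(h[i] * W_out[i][j] for i in range(HIDDEN))
              for j in range(len(VOCAB))]
    return VOCAB[max(range(len(VOCAB)), key=logits.__getitem__)]

seq = ["the"]
for _ in range(4):
    seq.append(next_token(seq))
print(seq)
```

The prediction at each step is a function of a representation computed over everything seen so far, not a lookup from the last token alone.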

cma 20 hours ago | parent | prev [-]

I think gpt-image-2 at least incorporates representations from the base model, even if the base model doesn't itself have the output capability. And it has image input fused directly into it, which helps make those representations more usable for image generation, so it's not just generating text.