staticassertion 5 hours ago

What do you mean by "exist outside of time"? They definitely don't exist outside of a causal chain: tokens follow other tokens in order.

Gaps in which no processing occurs seem sort of irrelevant to me.

The main limitation I'd point to, if I wanted to reject LLMs being conscious, is that they're minimally recurrent, if at all.

mrob 3 hours ago | parent | next [-]

Pseudocode for LLM inference:

    while (sampled_token != END_OF_TEXT) {
        probability_set = LLM(context_list)
        sampled_token = sampler(probability_set)
        context_list.append(sampled_token)
    }
LLM() is a pure function. The only "memory" is context_list. You can change it any way you like and LLM() will never know. It doesn't have time as an input.
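To make that concrete, here's a runnable toy version of the loop. Everything here (the llm function, the greedy sampler, the fixed phrase it "predicts") is made up for illustration; no real model is involved. The point is that llm() is a pure function of the context it's handed: same context in, same distribution out, with no clock and no hidden state.

    import typing
    
    END_OF_TEXT = "<eot>"
    
    def llm(context: list) -> dict:
        # Pure function: output depends only on the context passed in.
        # This toy "model" just predicts the next word of a fixed phrase.
        phrase = ["hello", "world", END_OF_TEXT]
        return {phrase[min(len(context), len(phrase) - 1)]: 1.0}
    
    def sampler(probabilities: dict) -> str:
        # Greedy sampling: pick the highest-probability token.
        return max(probabilities, key=probabilities.get)
    
    context_list = []
    sampled_token = None
    while sampled_token != END_OF_TEXT:
        probability_set = llm(context_list)
        sampled_token = sampler(probability_set)
        context_list.append(sampled_token)
    
    print(context_list)  # ['hello', 'world', '<eot>']
    
    # Purity check: calling llm() twice on the same context gives the
    # same result, and it can't tell whether a gap of a second or a
    # year passed between the two calls.
    assert llm(["hello"]) == llm(["hello"])

The only thing carried forward between iterations is context_list itself; edit it between calls and llm() has no way to notice.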
staticassertion 35 minutes ago | parent [-]

As opposed to what? There are still causal connections, which feel sufficient. A presentist would reject the concept of multiple "times" to begin with.

felipeerias 4 hours ago | parent | prev [-]

An LLM is not intrinsically affected by time. The model rests completely inert until a query comes in, regardless of whether that happens once per second, per minute, or per day. The model is not even aware of these gaps unless that information is provided externally.

It is like a crystal that shows beautiful colours when you shine a light through it. You can play with different kinds of lights and patterns, or you can put it in a drawer and forget about it: the crystal doesn’t care anyway.

staticassertion 34 minutes ago | parent [-]

So what? If a human were unconscious for 100ms every 5 seconds, would you say they were "less conscious"? Tokens are still causally connected, which feels sufficient.