deadbabe 2 days ago

The human believes it was right.

The LLM doesn’t believe it was right or wrong. It doesn’t believe anything, any more than a mathematical function believes 2+2=4.

tracerbulletx 2 days ago | parent | next [-]

Obviously LLMs are missing many important properties of the brain, like spatial, temporal, and chemical factors, as well as many interconnected feedback networks between different types of neural networks that go well beyond what LLMs do.

Beyond that, they are the same thing. Signal Input -> Signal Output

I do not know what consciousness actually is so I will not speak to what it will take for a simulated intelligence to have one.

Also, I never used the word "believes"; I said "convinced." If it helps, I can say it "acted in a way as if it had high confidence in its output."

cratermoon a day ago | parent [-]

Obviously sand is missing many important properties of integrated circuits, like semiconductivity, electric interconnectivity, transistors, and p-n junctions.

Beyond that, they are the same thing.

istjohn 2 days ago | parent | prev [-]

Can you support that assertion? What's your evidence?

cratermoon a day ago | parent [-]

not the OP but https://www.tandfonline.com/doi/abs/10.1080/0951508070123951...