deadbabe 2 days ago
The human believes it was right. The LLM doesn't believe it was right or wrong; it doesn't believe anything, any more than a mathematical function believes 2+2=4.
tracerbulletx 2 days ago | parent
Obviously LLMs are missing many important properties of the brain, like spatial, temporal, and chemical factors, as well as the many interconnected feedback networks between different types of neural networks that go well beyond what LLMs do. Beyond that, they are the same thing: signal input -> signal output.

I don't know what consciousness actually is, so I won't speak to what it would take for a simulated intelligence to have one. Also, I never used the word "believes"; I said "convinced". If it helps, I can say it "acted in a way as if it had high confidence in its output".
| ||||||||
istjohn 2 days ago | parent
Can you support that assertion? What's your evidence?