tracerbulletx 2 days ago
Obviously LLMs are missing many important properties of the brain, like spatial, temporal, and chemical factors, as well as the many interconnected feedback loops between different types of neural networks that go well beyond what LLMs do. Beyond that, they are the same thing: signal input -> signal output. I do not know what consciousness actually is, so I will not speak to what it would take for a simulated intelligence to have one. Also, I never used the word "believes"; I said "convinced". If it helps, I can say "acted in a way as if it had high confidence in its output".
cratermoon a day ago
Obviously sand is missing many important properties of integrated circuits, like semiconductivity, electrical interconnectivity, transistors, and p-n junctions. Beyond that, they are the same thing.