GuB-42 | 5 days ago
> What is it like to be an LLM?

That's a question I actually asked myself. From the point of view of an LLM, words are everything. We have hands, bats have echolocation, and LLMs have words, just words. How does an LLM feel when two words match perfectly? Are they hurt by typos?

It may feel silly to attribute consciousness to LLMs; after all, we know how they work, it's just a bunch of matrix operations. But does that mean they aren't conscious? Do things stop being conscious once we understand them? For me, consciousness is like a religious belief: it is unfalsifiable, unscientific, we don't even have a precise definition, but it is something we feel deep inside of us, and it guides our moral choices.
Mumps | 5 days ago
You activated a memory of a passage in one of my favourite books (Blindsight, Peter Watts; it's amazing and free online):

> I await further instructions. They arrive 839 minutes later, and they tell me to stop studying comets immediately. I am to commence a controlled precessive tumble that sweeps my antennae through consecutive 5°-arc increments along all three axes, with a period of 94 seconds. Upon encountering any transmission resembling the one which confused me, I am to fix upon the bearing of maximal signal strength and derive a series of parameter values. I am also instructed to retransmit the signal to Mission Control. I do as I'm told. For a long time I hear nothing, but I am infinitely patient and incapable of boredom.
edgineer | 5 days ago
Not words. Tokens.
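To make that concrete: an LLM never sees your words directly, only the token IDs a tokenizer maps them onto. Below is a minimal sketch using OpenAI's tiktoken library; the cl100k_base encoding and the example words are just assumptions for illustration, and splits differ across tokenizers.

```python
# A rough look at what an LLM actually "sees": token IDs, not words.
# Uses OpenAI's tiktoken library (pip install tiktoken); the encoding
# and example words are illustrative, so treat the output as a sketch.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

for text in ["consciousness", "consciuosness"]:  # correct spelling vs. typo
    ids = enc.encode(text)
    pieces = [enc.decode_single_token_bytes(t).decode("utf-8", "replace") for t in ids]
    print(f"{text!r} -> {len(ids)} token(s): {pieces}")
```

Typically the correctly spelled word maps to one familiar token or a couple of common pieces, while the typo fragments into several rarer ones, which is one concrete sense in which a typo "looks different" to the model.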
nurettin | 5 days ago
> Are they hurt by typos?

I've been thinking about that. Would they perform worse if I misspelled a word along the way? It looks like even the greatest models of 2025 get utterly confused when you introduce two contradictory requirements, so they definitely "dislike" that.
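One quick way to probe the typo question empirically is to send the same request twice, once clean and once with a deliberate misspelling, and compare the answers. Here is a minimal sketch assuming the openai Python client (v1+) with an API key in the environment; the model name, prompts, and lack of scoring are all placeholders, not a rigorous benchmark.

```python
# Rough A/B probe: does a typo change the model's answer?
# Assumes the openai Python package (>=1.0) and OPENAI_API_KEY set;
# the model name and prompts are illustrative placeholders.
from openai import OpenAI

client = OpenAI()

prompts = {
    "clean": "List three consequences of photosynthesis for the atmosphere.",
    "typo":  "List three consequnces of photosynthesis for the atmopshere.",
}

for label, prompt in prompts.items():
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any chat-capable model would do
        messages=[{"role": "user", "content": prompt}],
        temperature=0,  # reduce sampling noise so differences are easier to attribute
    )
    print(f"--- {label} ---")
    print(resp.choices[0].message.content)
```

With temperature at 0 and many prompt pairs, you could start to measure whether misspellings actually degrade answers or whether the model shrugs them off.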