ausbah a day ago

aren’t outputs literally conditioned on prior textual context? how is that lacking interdependence?

isn’t learning the probabilistic relationships between tokens an attempt to approximate those exact semantic relationships between words?
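For concreteness, here is a minimal toy sketch (an illustration, not anything from either commenter) of what "conditioned on prior textual context" means mechanically: a bigram count model where each next-token distribution is literally a function of the preceding token. The corpus and names are hypothetical.

```python
# Toy sketch: next-token probabilities conditioned on prior context.
# This is a bigram count model, the simplest case of the autoregressive
# conditioning being discussed; real LLMs condition on far longer contexts.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count bigram transitions: how often each token follows a given context token.
transitions = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev][nxt] += 1

def next_token_distribution(context_token):
    """P(next | context): normalized counts of what followed this token."""
    counts = transitions[context_token]
    total = sum(counts.values())
    return {tok: c / total for tok, c in counts.items()}

# Each output distribution depends on the prior context token:
print(next_token_distribution("the"))  # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
print(next_token_distribution("cat"))  # {'sat': 0.5, 'ate': 0.5}
```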

mallowdram a day ago | parent

Interdependence takes the whole universe into account for each thing or idea. There is no such thing as probabilistic processing in a healthy mind; a probabilistic approach is unhealthy.

https://pubmed.ncbi.nlm.nih.gov/38579270/

edit: looking into this, the claim is, in terms of the brain and arbitrariness, highly paradoxical, even oxymoronic

>>isn’t learning the probabilistic relationships between tokens an attempt to approximate those exact semantic relationships between words?

This is really a poor way of resolving the conduit-metaphor condition of arbitrary signals: trying to falsify them as specific, which is always impossible. This is simple linguistics via animal signal science. If you can't duplicate any response from the output with a high degree of certainty, then the signal is valid only in the most limited time-space condition, and even then it remains arbitrary. CS has no understanding of this.