mjhay 5 hours ago

Very much so. It was a little less untrue with older word embedding models,* but that kind of semantic linearity was never really a thing in practice. Word embedding models try to place semantically similar words close to each other, but proximity does not imply linearity at all.

*With transformer models, it is pretty much not even wrong.
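
As a concrete illustration, here's a minimal sketch using gensim and a pretrained GloVe model (the specific model name and word choices are my own, not from the original comment). Nearest-neighbor similarity works well, while the famous king - man + woman ~= queen "linearity" is shakier than it looks: the library quietly excludes the input words, and without that exclusion the nearest vector to the result is often just "king" itself.

    # Sketch: closeness in embedding space vs. linear "analogy" structure.
    # Assumes gensim is installed; the pretrained vectors come from
    # gensim's downloader (model name is an assumption).
    import gensim.downloader as api

    model = api.load("glove-wiki-gigaword-100")  # downloads on first use

    # Semantic similarity: related words get nearby vectors.
    # This is the part word embeddings genuinely do well.
    print(model.most_similar("frog", topn=3))

    # Linear analogy: king - man + woman ~= queen.
    # Note most_similar() excludes the query words ("king", "man",
    # "woman") from the results; much of the apparent linearity
    # depends on that exclusion.
    print(model.most_similar(positive=["king", "woman"],
                             negative=["man"], topn=3))

Running the analogy arithmetic on less cherry-picked word triples fails far more often, which is the point: closeness is what these models optimize for, and linear structure only falls out of it occasionally.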