_fizz_buzz_ a day ago

> Basically an LLM translating across languages will "light up" (to use a rough fMRI equivalent) for the same concepts (e.g. bigness) across languages.

That doesn't seem surprising at all. My understanding is that transformers were invented precisely for machine translation, so concepts from different languages must end up grouped together in the model's representation space. That was originally the whole point, and it then turned out to be very useful for broader AI applications.
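The "grouped together" idea can be sketched with toy numbers: in a shared embedding space, translation-equivalent words should have high cosine similarity. The vectors below are made up for illustration, not real model weights.

```python
# Toy illustration (invented 3-d "embeddings", not real model weights):
# if a model learns one shared concept space across languages, then
# translation-equivalent words should point in nearly the same direction.
from math import sqrt

def cosine(a, b):
    """Cosine similarity: 1.0 = same direction, 0 = unrelated, <0 = opposed."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

emb = {
    "big":    [0.90, 0.10, 0.20],  # English
    "grande": [0.85, 0.15, 0.25],  # Spanish translation of "big"
    "small":  [-0.80, 0.10, 0.30], # a different concept
}

print(cosine(emb["big"], emb["grande"]))  # close to 1.0: same concept
print(cosine(emb["big"], emb["small"]))   # negative: opposing concept
```

Roughly this is what "lights up" means in the quote: the same region of representation space activates for "bigness" regardless of which language the token came from.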