gabriel666smith | 3 days ago
This is a great example, and not odd as an analogy at all. It surfaces something subtle. Language architecture is really interesting, I think, for programmers who have bought into the LLM hype in any meaningful way; it's an important field to have a sense of. Tokenizers, for example, generally use multi-character subword tokens as their base-level, indivisible unit. You rarely see this mentioned when LLM capability on non-coding tasks is discussed, despite it being deeply important for prose construction. Not to mention that, language models aside, the vast majority of code is written in languages with a logical grammar. The disciplines are highly linked.
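A minimal sketch of that point, assuming OpenAI's tiktoken library and its cl100k_base encoding (the word choices are arbitrary, and the exact splits shown in the comments are illustrative, since they depend on the encoding):

    import tiktoken  # BPE tokenizer library used by OpenAI models

    # cl100k_base is the encoding used by GPT-4-era models.
    enc = tiktoken.get_encoding("cl100k_base")

    for word in ["cat", "unbelievable", "rhyming"]:
        ids = enc.encode(word)
        # Recover the text each token id covers.
        pieces = [enc.decode_single_token_bytes(t).decode("utf-8") for t in ids]
        print(word, "->", pieces)

    # Illustrative output (actual boundaries vary by encoding):
    #   cat -> ['cat']
    #   unbelievable -> ['un', 'believ', 'able']
    #   rhyming -> ['rh', 'ym', 'ing']

The upshot: the model's atomic unit is the subword token, not the letter or the syllable, so syllable counts and rhyme endings are only indirectly visible to it, which is what makes its handling of meter interesting.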
regularfry | 2 days ago
The AI-generated front page of HN posted yesterday had, in at least one thread, generated comments that scanned and rhymed. The capability is clearly there in whatever model that was, and while the rhyming might just be a confluence of having seen specific word pairs a certain distance apart in the training data, I'm having a hard time explaining away the construction of a coherent meter.
altairprime | 2 days ago
The Judoon have such a lovely language, though! | ||