
I totally agree, and I was fully aware of how commonly people create languages for fun when I replied.

But I feel the rationale still stands: given the nature of LLMs, common boilerplate tasks are easy because the model can essentially "decompress" them from its training data. But for a novel language design, unless the language is nearly identical to one already captured by the model, that "decompression" simply fails.