crazygringo 3 days ago

Sounds like it's just a question of insufficient training material or training material that is insufficiently annotated.

There's no reason an LLM shouldn't be able to produce such poetry. Remember that extensive "thinking" occurs before the first output token is produced -- LLMs aren't blindly emitting tokens without first knowing where they're going. But it would make sense that this is an area current companies haven't prioritized in training. Not that many people need new poetry in a dead language...