It would be incredibly interesting to see how LLM code generation would hook into this.
This is a paper about Chat LLMs in Hazel: https://arxiv.org/abs/2409.00921
LLM hole filling à la the paper is actually live in the dev version right now (if you enter an OpenRouter API key in the second sidebar tab). It's slow and buggy at the moment, though; it's only been running at all for the last few days.
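For the curious, here's a minimal sketch of the general idea (not Hazel's actual implementation): hole filling by sending the program text, with the hole marked, to a model through OpenRouter's OpenAI-compatible chat completions endpoint. The model name and prompt wording here are illustrative assumptions.

```python
# Hedged sketch of LLM hole filling via OpenRouter; NOT the actual
# Hazel implementation. Model name and prompts are illustrative.
import json
import os
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(program_with_hole: str,
                  model: str = "openai/gpt-4o-mini") -> dict:
    """Build the JSON payload asking the model to fill the marked hole."""
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Fill the hole marked `?` in the given program. "
                        "Reply with only the expression that replaces it."},
            {"role": "user", "content": program_with_hole},
        ],
    }

def fill_hole(program_with_hole: str) -> str:
    """Send the request; needs OPENROUTER_API_KEY set in the environment."""
    payload = json.dumps(build_request(program_with_hole)).encode()
    req = urllib.request.Request(
        OPENROUTER_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible response shape: first choice's message content.
    return body["choices"][0]["message"]["content"]
```

A real editor integration would additionally pass typing context (the expected type at the hole), which is a big part of what the paper explores.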