This is really cool. Any chance you could make the backend self-hostable, with Ollama as an option for the LLM?
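
For context on what I'm imagining: Ollama exposes an OpenAI-compatible endpoint on localhost, so if the backend already talks to an OpenAI-style chat API, it might only need a configurable base URL and model name. Something roughly like this sketch (the `OLLAMA_BASE_URL` / `OLLAMA_MODEL` env var names are just placeholders I made up, not anything from your code):

```python
# Rough sketch only: point an OpenAI-style client at a local Ollama server.
# Ollama serves an OpenAI-compatible API at http://localhost:11434/v1.
import os
from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url=os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434/v1"),
    api_key="ollama",  # Ollama ignores the key, but the client requires one
)

response = client.chat.completions.create(
    model=os.environ.get("OLLAMA_MODEL", "llama3"),  # any locally pulled model
    messages=[{"role": "user", "content": "Hello from a self-hosted backend!"}],
)
print(response.choices[0].message.content)
```

If the LLM calls were behind an env-configurable provider like that, self-hosting would mostly come down to shipping a docker-compose with the backend plus an Ollama container.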