cstaszak 4 hours ago
I'm a fan of "civilization in a box" projects. However, the ZIM file format leaves a lot to be desired in 2026. I've been exploring a refreshed, alternative approach: https://github.com/stazelabs/oza I do think having an LLM as an optional "sidecar" is a useful approach. If you can run a meaningful Ollama instance alongside your content, great!
codeveil 2 hours ago | parent
ZIM or not, I think the "LLM as optional sidecar" part is the right idea. The durable asset is the knowledge base itself. A local model can be useful on top, but it should stay a layer, not become the dependency.