replete 4 hours ago

Run server with ollama, use Continue extension configured for ollama
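A minimal sketch of what that setup might look like, assuming Continue's JSON config format with an `ollama` provider (model name is an example, not prescribed by the comment):

```json
{
  "models": [
    {
      "title": "Local Llama",
      "provider": "ollama",
      "model": "llama3"
    }
  ]
}
```

With the ollama server running locally, Continue would talk to it on its default port (11434) without further endpoint configuration.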

BoredomIsFun 3 hours ago | parent [-]

I'd stay away from ollama, just use llama.cpp; it is more up to date, better performing, and more flexible.
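For reference, a rough sketch of running llama.cpp directly instead; the model path is a placeholder, and this assumes a recent llama.cpp build that ships the `llama-server` binary with its OpenAI-compatible HTTP API:

```shell
# Build llama.cpp and start its server on a local GGUF model
# (model file path is hypothetical)
llama-server -m ./models/my-model.gguf --port 8080
```

Continue (or any OpenAI-compatible client) can then be pointed at the local endpoint instead of ollama.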