jedisct1 6 hours ago

Really cool.

But how do you use it instead of Copilot in VSCode?

replete 3 hours ago | parent | next [-]

Run a local server with Ollama, then use the Continue extension configured to point at Ollama.
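
A quick way to confirm the Ollama side is working before wiring up Continue is to hit its local API directly. This is just a minimal sketch, assuming Ollama is running on its default port 11434 and that a model has already been pulled; the model name "qwen2.5-coder" here is only an example:

    # Minimal sketch: verify a local Ollama server responds before
    # pointing the Continue extension at it. Assumes Ollama's default
    # port (11434) and that a model (here "qwen2.5-coder", just an
    # example) was pulled with `ollama pull`.
    import json
    import urllib.request

    payload = {
        "model": "qwen2.5-coder",  # example name; use whatever you pulled
        "prompt": "Write a hello-world in Python.",
        "stream": False,           # return one JSON object instead of a stream
    }

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])

If that prints a completion, Continue can be configured to use the same local Ollama instance.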

BoredomIsFun 3 hours ago | parent [-]

I'd stay away from Ollama; just use llama.cpp. It's more up to date, better performing, and more flexible.
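
For reference, llama.cpp's llama-server speaks an OpenAI-compatible chat API, so editor extensions that accept a custom OpenAI-style endpoint can point at it. A minimal sketch of querying it from Python, assuming the server was started locally with something like `llama-server -m model.gguf --port 8080` (the model path is a placeholder):

    # Minimal sketch: query llama.cpp's llama-server, which exposes an
    # OpenAI-compatible chat endpoint. Assumes the server is already
    # running on localhost:8080 with a model loaded.
    import json
    import urllib.request

    payload = {
        "model": "local",  # llama-server serves one model; the name is largely ignored
        "messages": [{"role": "user", "content": "Explain tail recursion briefly."}],
    }

    req = urllib.request.Request(
        "http://localhost:8080/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["choices"][0]["message"]["content"])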

flanked-evergl 4 hours ago | parent | prev [-]

Would love to know myself. I recall there was a VSCode plugin that did next-edit suggestions and accepted a custom model, but I can't remember what it was called now.