robotswantdata 3 hours ago

Why are you using Ollama? Just use llama.cpp

brew install llama.cpp

Use the built-in CLI, server, or chat interface, and hook it up to any other app.
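A minimal sketch of that workflow, assuming a Homebrew install and using llama.cpp's `-hf` flag to pull a GGUF model from Hugging Face (the model repo named here is just an illustrative example):

```shell
# Install llama.cpp via Homebrew (macOS/Linux)
brew install llama.cpp

# Interactive chat in the terminal; -hf downloads a GGUF model
# from Hugging Face on first run (example model, swap in your own)
llama-cli -hf ggml-org/gemma-3-1b-it-GGUF

# Or run the bundled HTTP server, which exposes an
# OpenAI-compatible API plus a built-in web chat UI
llama-server -hf ggml-org/gemma-3-1b-it-GGUF --port 8080
```

With the server running, any app that speaks the OpenAI chat-completions API can point at `http://localhost:8080` instead of a hosted endpoint.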

Bigsy 2 hours ago | parent

For MLX I'd guess.

wronglebowski an hour ago | parent | next

That also comes upstream from llama.cpp https://github.com/ggml-org/llama.cpp/discussions/4345

redrove an hour ago | parent | prev

https://omlx.ai/