adam_patarino 5 hours ago

Or you could use a local model where you’re not constrained by tokens. Like rig.ai

dostick 4 hours ago

How is your offering different from local ollama?

adam_patarino 2 hours ago

It's batteries-included. No config.

We also fine-tuned our model and did RL on it, developed a custom context engine, trained an embedding model, and modified MLX to improve inference.

Everything is built to work together, so it's more like an Apple product than Linux: less config, but better optimized for the task.