▲ Adityav369 | 3 days ago
You can run this fully locally using Ollama for inference, although you'll need larger models and a beefy machine for great results. On my end Llama 3.1 8B does a good job on technical docs, but the bigger the better lol.
▲ thot_experiment | 3 days ago
Ahh, I didn't see that; I just saw them talking about a free tier or whatever and my eyes glazed over. I'll try it out with Mistral Small 3.1 at some point tonight. I've been having really great results with its multimodal understanding.
▲ mrtimo | 3 days ago
How would you use this within Open WebUI locally?