3abiton 10 days ago
Depends on your setup. You can always find more support on r/Localllama.
WillAdams 9 days ago | parent
Using Copilot, and currently running jan.ai. /r/Localllama seems to tend toward the typical Reddit cesspool. Let me rephrase: which locally-hosted LLM would be suited to batch processing image files?
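For what it's worth, one common approach is to run a local vision model (e.g. LLaVA) behind Ollama and loop over the files from a script. The sketch below is an assumption about the setup, not a recommendation of a specific model: it posts each image to Ollama's `/api/generate` endpoint with a base64-encoded `images` field, which is how that API accepts image input. The endpoint URL, model name, and prompt are placeholders you'd adjust.

```python
import base64
import json
import pathlib
import urllib.request

# Assumed local Ollama endpoint (default port); adjust if your setup differs.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(image_path: pathlib.Path,
                  model: str = "llava",
                  prompt: str = "Describe this image.") -> dict:
    """Build the JSON payload Ollama's /api/generate expects for vision models:
    the image goes in as a base64 string in the `images` list."""
    b64 = base64.b64encode(image_path.read_bytes()).decode("ascii")
    return {"model": model, "prompt": prompt, "images": [b64], "stream": False}

def describe_folder(folder: str) -> None:
    """Batch-process every PNG in `folder`, printing one description per file."""
    for path in sorted(pathlib.Path(folder).glob("*.png")):
        req = urllib.request.Request(
            OLLAMA_URL,
            data=json.dumps(build_request(path)).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            result = json.load(resp)
        print(f"{path.name}: {result['response']}")
```

Any local server with an HTTP API (jan.ai included, if it exposes one) could slot into the same loop; only the payload format would change.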